Robert O'Callahan, former CMUer and current Mozilla hacker extraordinaire, has a couple of thoughts on research directions for PL:
I also suggested that the worst code is not necessarily buggy code, but code that is unnecessarily complex. Detecting that would be an interesting new direction for program analysis.
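That's a fuzzier target than bug-finding, but the tooling entry point is concrete enough. Here's a minimal sketch of one plausible starting point, using cyclomatic complexity as a crude proxy for unnecessary complexity; the metric choice, the AST node set, and the threshold are my illustrative assumptions, not anything O'Callahan proposed:

```python
import ast

class ComplexityVisitor(ast.NodeVisitor):
    """Scores each function by counting its branch points."""

    def __init__(self):
        self.scores = {}

    def visit_FunctionDef(self, node):
        # Start at 1 for the straight-line path, add 1 per branch point.
        branches = sum(
            isinstance(n, (ast.If, ast.For, ast.While, ast.BoolOp))
            for n in ast.walk(node)
        )
        self.scores[node.name] = 1 + branches
        self.generic_visit(node)  # recurse into nested functions

source = """
def tangled(x):
    if x > 0:
        for i in range(x):
            if i % 2 == 0 and i > 2:
                x += i
    return x
"""

visitor = ComplexityVisitor()
visitor.visit(ast.parse(source))
for name, score in visitor.scores.items():
    if score > 3:  # arbitrary threshold for this sketch
        print(f"{name}: complexity {score} -- candidate for simplification")
```

The hard research problem, of course, is what this proxy misses: a high score flags code that is complex, not code that is more complex than it needs to be, and the latter requires reasoning about what a simpler equivalent program would look like.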
And also:
...[T]he state of parallel programming models, languages and tools remains pathetic for general-purpose single-user programs and no breakthrough should be expected. My position is that for regular desktop software to scale to 32 cores by 2011 (as roadmaps predict) we'd have to rewrite everything above the kernel, starting today, using some parallel programming model that doesn't suck. Since that model doesn't exist, it's already too late. Probably we will scale out to a handful of cores, with some opportunistic task or data parallelism, and then hit Amdahl's law, hard. It is probably therefore more fruitful to focus on new kinds of applications which we think we have reasonable ideas for parallelizing. I think virtual worlds (which are not just "games", people!) are a prime candidate. That's a direction I think the PL/software engineering research community should be pushing in.
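To put numbers on the Amdahl's law point: the best possible speedup on N cores is 1 / ((1 - p) + p/N), where p is the fraction of the program that parallelizes. A quick back-of-the-envelope sketch (the parallel fractions below are illustrative guesses, not measurements of any real desktop application):

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Maximum speedup on n cores when fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.50, 0.75, 0.90, 0.95):
    print(f"p = {p:.2f}: 32 cores -> {amdahl_speedup(p, 32):.1f}x")
```

Even with 90% of a program parallelized, 32 cores yield less than an 8x speedup, which is roughly what "hit Amdahl's law, hard" means in practice.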