Andrew Chien, Dave Patterson and I have each written articles on the challenges and opportunities inherent in multicore hardware and software for the Computing Community Consortium (CCC) blog. My recent article, on the challenge of software, is now posted. In the article, I argued that we must re-envision parallel computing and build a new generation of applications that explicitly exploit the scale and heterogeneity of multicore. You can read the article on the CCC blog or below.
Multicore: It's The Software
For over thirty years, we have watched the great cycle of innovation defined by the commodity hardware/software ecosystem – faster processors enable software with new features and capabilities that in turn require faster processors, which beget new software. The great wheel has turned, but it turns no more, as power constraints and device physics now limit the performance achievable with a single microprocessor.
Multicore chips – those with multiple, lower-power processor cores per chip – are now the norm. Moreover, today's multicore chips (those with 4–8 cores per chip) are but the beginning. We can expect hundreds of cores per chip in the future, with diverse functionality (graphics, packet protocol processing, DSP, cryptography and more).
The software research challenge is clear – developing effective programming abstractions and tools that hide the diversity of multicore chips and features while exploiting their performance for important applications. Hence, we need a vibrant community of researchers exploring diverse approaches to parallel programming – languages, libraries, compilers, tools – and their applicability to multiple application domains.
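To make the kind of abstraction I have in mind concrete, here is a minimal sketch of my own (not code from any particular research project): a small C++ helper, parallel_sum, that splits a reduction across however many hardware threads the chip reports, so the calling application never has to know how many cores it is running on.

// parallel_sum.cpp – a minimal sketch of a library-level abstraction that
// hides the chip's core count behind a simple reduction interface.
// Build with: g++ -std=c++11 -pthread parallel_sum.cpp
#include <algorithm>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

// Sum a vector by dividing it among however many hardware threads exist;
// the caller never mentions cores, threads or chip topology.
long long parallel_sum(const std::vector<int>& data) {
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<long long> partial(workers, 0);
    std::vector<std::thread> pool;
    size_t chunk = (data.size() + workers - 1) / workers;

    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            size_t begin = w * chunk;
            size_t end = std::min(data.size(), begin + chunk);
            if (begin < end)
                partial[w] = std::accumulate(data.begin() + begin,
                                             data.begin() + end, 0LL);
        });
    }
    for (auto& t : pool) t.join();
    return std::accumulate(partial.begin(), partial.end(), 0LL);
}

int main() {
    std::vector<int> numbers(1000000, 1);
    std::cout << "sum = " << parallel_sum(numbers) << "\n";
    return 0;
}

The specific code matters less than the shape of the interface: the application asks for a result, and the library decides how to spread the work across whatever cores – today's or tomorrow's – the chip provides.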
Microsoft researchers are investigating all of these approaches, from coordination languages for robots and distributed systems to programming tools for mobile phones, desktops and data center clouds. To engage the academic community, Microsoft funds multicore research projects at many sites, and we have partnered with Intel to fund the Universal Parallel Computing Research Centers (UPCRCs) at the University of California at Berkeley and the University of Illinois at Urbana-Champaign.
As Richard Hamming famously noted, "The purpose of computing is insight, not numbers." In that spirit, I believe our research challenge is to break free from the limitations of the desktop metaphor and exploit the ever greater performance of multicore chips to create new human-computer interaction metaphors that are more natural and intuitive. This will require new approaches to parallel computing education and increased collaboration with researchers in application domains.
As an example, consider one possible future – "spatial computing" – where real-time vision and speech processing, coupled with knowledge bases, distributed sensors and responsive objects, enhance human activities in contextually relevant ways while remaining otherwise unobtrusive. Such an infosphere would adapt to its user's needs and behavior and move seamlessly across home, work and play.
Multicore brings enormously interesting intellectual challenges and the opportunity to rethink much of how we approach computing. Let's embrace the opportunity!