Computing shares many features with other scientific instruments that allow us to probe further into the unknown. Whether it is deep field images from the Hubble telescope yielding insights into the origins of the universe, the high-energy detectors of Fermilab and CERN driving refinements to the Standard Model of subatomic particles, or large-scale genetic sequencers revealing the limitations of the Central Dogma and deepening our understanding of the biological basis of life and disease, theory and experiment march together, each enabled and supported by the other.
Today, computing is the third pillar of scientific discovery, complementing and mutually enabling theory and experiment. Each advance in computing enables new validation of theoretical predictions, particularly when circumstances prevent experimental testing. It also allows experimentalists to capture and analyze the torrent of data being produced by a new generation of scientific instruments and sensors, themselves made possible by advances in computing and microelectronics.
However, one aspect of computing distinguishes it from other scientific instruments – its universality as an intellectual amplifier. Powerful new telescopes advance astronomy, but not materials science. Powerful new particle accelerators advance high-energy physics, but not genetics. In contrast, computing advances all of science and engineering, because all disciplines benefit from model predictions, theoretical validation and experimental data analysis. It is only slightly hyperbolic to say that all science is now computing-enabled eScience.
In this sense, computing must be defined broadly, not simply as high-performance computing but as sensors/actuators, networks, data management and mining, visualization, computing and collaboration systems – the entire ecosystem of computing capabilities that supports our social, economic and intellectual endeavors. (Though, personally, I will always feel the emotional pull from those long lines of black racks on the raised floors. What can I say? “Big iron” is a lifelong love!)
Although incremental advances in computing continue to bring research advantages, there are transition points where major advances in computing qualitatively change the range of problems that can be solved. We are now at such a fundamental transition point, with the emergence of multicore processors (see Petascale and Multicore Redux) and, consequently, ubiquitous consumer parallelism, truly inexpensive sensors and broadband networks, prodigious data storage capability, and rich, though still incomplete, software tools and systems. This realization has precipitated many reports, including the NSF Atkins report on cyberinfrastructure, and both national and international programs, notably U.S. initiatives via the NSF Office of Cyberinfrastructure, the U.K. eScience program, the EU Seventh Research Framework and others.
On October 21-23, RENCI and Microsoft will host the 2007 Microsoft eScience Workshop in Chapel Hill, which Tony Hey and I will co-chair. The workshop is free, but registration is required. It is explicitly cross-disciplinary, with the goal of bringing together scientists from different areas to share their research and discuss how computing is shaping their work, providing new insights and changing what can be done in science.
Recognizing the breadth and depth of the connections between science and computing, we invite contributions from all areas of eScience, from knowledge discovery and digital publishing through teaching and learning, robotic instrumentation, high-performance computing, and disaster modeling to healthcare. On behalf of my good friend Tony Hey and myself, I hope to see you in Chapel Hill this fall.