On Thursday, May 8, I testified before the U.S. Senate Committee on Commerce, Science, and Transportation. The full committee hearing was on improving the "Capacity of U.S. Climate Modeling for Decision Makers and End-Users." The other members of the hearing panel were:
- Alexander (Sandy) MacDonald, Director, Earth System Research Laboratory, Deputy Assistant Administrator for Laboratories and Cooperative Institutes, Office of Oceanic and Atmospheric Research, National Oceanic and Atmospheric Administration
- James (Jim) Hack, Director, National Center for Computational Sciences, Oak Ridge National Laboratory
- Edward Sarachik, Co-Director, Center for Science in the Earth System, Professor Emeritus of Atmospheric Sciences, Joint Institute for the Study of the Atmosphere and Ocean, University of Washington
- Bruce Carlisle, Assistant Director, Massachusetts Office of Coastal Zone Management
- John Walsh, President's Professor of Climate Change, Chief Scientist, International Arctic Research Center, University of Alaska
Jim Hack and I addressed the computing and computational science issues, while the other four focused on the climate aspects. Within a few weeks, our written testimony will be posted on the Committee's hearing page, and in due time (many months), our oral testimony will appear in the Congressional Record.
If you have never attended or participated in a hearing, know that they follow a standard script, as I have learned from multiple experiences. First, one submits written testimony, typically due 48 hours before the hearing. Committee staffers use this material and their own sources to prepare briefing materials for the Senators (or Representatives, for a House hearing) who will attend. At the hearing itself, each witness typically gets five minutes to deliver an oral statement, usually covering the key points from his or her written testimony.
I've always found it useful to practice delivering my oral statement a few times, as it can be unnerving to watch the timer on the witness table tick down toward the red light. After all of the witnesses have given their oral statements, the questions begin. That is the process; now on to the content.
Regional Fidelity
The thrust of the hearing was our ability (or inability) to predict the regional effects of climate change: changes in weather (e.g., precipitation levels), storm surges and flooding in coastal areas, and hurricane frequency and severity. Senator John Kerry, who chaired the hearing, was particularly interested in the limitations of our current computing infrastructure: did we have enough high-performance computing (HPC) infrastructure, and were our current models adequate to assist decision makers at local and regional levels?
The answer to both questions, rather obviously, is no. I pointed to the Department of Energy's recent exascale report, Modeling and Simulation at the Exascale for Energy and the Environment, to illustrate the need for greater computational fidelity. The climate experts rightly noted the need for additional observational data, particularly at regional scales, and the continued need for scientific and mathematical research to improve the models.
The deeper issue is the need to understand the practical effects of climate change in local areas (in addition to the question of what we do at the global level). Changes in precipitation patterns affect agriculture and commerce, rising sea levels create coastal flooding via storm surge, and long-range trends shift the frequency and severity of local weather. Local and state agencies need this information to formulate public policies, such as flood plain insurance rates and building codes.
Strategic Planning
Interestingly, the climate experts (and I make no claim to be an expert in that domain) observed that one of their primary obstacles was the lack of a coherent, coordinated multiagency process that integrates expertise and resources. This is a perpetual problem, one that reflects both the strengths and weaknesses of our federated research processes and mechanisms.
How many times have all of us in computing made the case for an integrated, coherent research and development program based on a long-term strategic plan? Jim Hack and I argued again that we needed balanced investment in hardware, software, algorithms and science, and I pointed to the 2003 High-End Computing Revitalization Task Force (HECRTF) community report, the 2005 PITAC computational science report and the 2007 PCAST computing research report. I also noted my concern about our lack of investment in architectures better suited to scientific applications and the concomitant programming difficulties. As we all know, multicore processors exacerbate this already difficult problem.
Struggling for an analogy that would be intelligible to non-experts in computing, I opined that our transition from vector computing to commodity clusters was a transition from a single bulldozer to a thousand shovels. I perhaps tortured the analogy too much by observing that multicore processors were a larger number of even smaller shovels. I was tempted to compare large clusters of large-scale multicore processors to millions of teaspoons, but that seemed overly cute.
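To put rough numbers behind the analogy, here is a minimal back-of-envelope sketch. All of the performance figures below are illustrative assumptions chosen to make the comparison concrete, not measurements of any particular machine.

```python
# Back-of-envelope comparison of aggregate peak performance.
# Every figure here is an assumed, illustrative value -- not a
# measurement of any specific system.

vector_machine_gflops = 10.0   # one "bulldozer": a classic vector supercomputer
cluster_nodes = 1000           # a thousand "shovels": a commodity cluster
node_gflops = 5.0              # assumed peak per commodity node
cores_per_node = 8             # the multicore era: smaller "shovels" per chip
per_core_gflops = 1.0          # assumed peak per core

cluster_total = cluster_nodes * node_gflops
multicore_total = cluster_nodes * cores_per_node * per_core_gflops

print(f"Vector machine:    {vector_machine_gflops:10.1f} GFLOPS (1 unit to program)")
print(f"Commodity cluster: {cluster_total:10.1f} GFLOPS ({cluster_nodes} units to program)")
print(f"Multicore cluster: {multicore_total:10.1f} GFLOPS "
      f"({cluster_nodes * cores_per_node} units to program)")
```

The aggregate numbers grow at each step, but so does the number of independent units an application must keep busy, which is exactly the programming burden the shovels (and teaspoons) are meant to evoke.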
My oral testimony follows, with the key points highlighted. The details are all in my written testimony. Because this was a hearing about science policy, I testified in my role as CRA board chair, not on behalf of Microsoft.
Oral Testimony
Good afternoon, Mr. Chairman and Members of the Committee. I am Daniel Reed, Chair of the Board of Directors for the Computing Research Association (CRA) and a high-performance computing researcher.
Today, I would like to make four points regarding the status and future of high-performance computing (HPC) for climate change modeling.
High-end Computational Science: Enabling Climate Studies
We now face true life-and-death questions about the potential effects of human activities and natural processes on our climate. I believe HPC and computational science are among our best options for understanding those effects.
HPC systems now bring detailed computational climate models to life. However, a recent Department of Energy (DOE) study estimated that climate modeling could effectively use an exascale HPC system. That's a computer one thousand times faster than today's most powerful petascale system and nearly one billion times faster than a PC.
Why are climate models so complex? First, one must simulate many years to validate the models against observational data. Second, to understand possible environmental changes, one must model sensitivity to many conditions, including CO2 emissions. Third, to understand the interplay of biogeochemical processes with public policies, one must evaluate model ensembles. Finally, one must study detailed regional effects such as hurricanes and storm surges, not just global ones. (In my previous position in North Carolina, we partnered with domain researchers to generate new flood plain maps for the state, based on high-performance computing.)
This leads to my second point: HPC availability for climate studies.
High-Performance Computing Resource Availability
In the 1980s, the importance of computing to science and the dearth of HPC facilities for research stimulated the creation of the National Science Foundation (NSF) and Department of Energy (DOE) Office of Science supercomputing centers. They now provide much of the HPC capacity available to U.S. science, including climate change research.
Without question, our HPC infrastructure is enormously greater than it was twenty years ago, but so too are our expectations and our needs. Equally telling, most HPC resources are shared across many scientific disciplines; only a portion of them supports climate change studies.
This leads to my third point: HPC technology trends.
Computing Evolution: Lessons and Challenges
Until the mid-1980s, HPC was defined by custom vector computers, such as those designed by Seymour Cray. The ubiquitous PC changed that, creating a new HPC model: large clusters of PCs. By analogy, this was a shift from a single bulldozer to a thousand shovels. However, our twenty-year "free ride" of increasing microprocessor performance (bigger shovels) has ended, and a second transition, to multiple processors per chip (lots of small shovels), is underway. This multicore revolution will be even more disruptive, profoundly affecting both industry and climate researchers. Simply put, we are now suffering the delayed consequences of limited federal research investment in this area.
Moreover, the scientific data deluge from new instruments threatens to overwhelm our research institutions and the ability of climate researchers to integrate data with multidisciplinary models.
This leads to my last point: climate HPC R&D and procurement models.
Actions: A Sustainable, Integrated Approach
We must explore new HPC hardware designs that better support scientific and defense applications, recognizing that the design costs for these systems are rarely repaid by commercial sales. Thus, we must rethink our models for HPC R&D and procurement. (A million rowboats are no substitute for an aircraft carrier.)
We also need new programming models that simplify application development for multicore processors. Today, climate modeling teams spend inordinate amounts of time tailoring software to HPC systems, time better spent on climate research. Climate change analysis requires diverse computing support, and HPC facilities must be balanced with investments in software, storage, algorithms and tools.
In 2005, I chaired the President's Information Technology Advisory Committee (PITAC) report on computational science, and in 2007 I co-chaired the President's Council of Advisors on Science and Technology (PCAST) review of computing research. Both reports recommended an interagency strategic roadmap for research computing and HPC infrastructure.
In summary, our challenges are to sustain both research and the deployment of the HPC systems needed to ensure our planet's health.
Thank you very much for your time and attention. I would be pleased to answer any questions you might have.