This is a personal blog updated regularly by Professor Daniel Reed at the University of Utah.
These musings on the current and future state of technology, scientific research, innovation, and personal history are my own and don't necessarily represent the University of Utah's positions, strategies, or opinions.
The Iowa City Press-Citizen recently published an opinion piece that was a précis of my recent Iowa City Chamber of Commerce after-dinner speech. In both the speech and the opinion piece, I discussed the 50th anniversary of the War on Poverty, the need for lifelong education, and engaged public-private partnerships.
N.B. I also write for the Communications of the ACM (CACM). The following essay recently appeared on the CACM blog.
In straitened financial times, time horizons shrink. This observation is self-similar across scales, applying equally to individuals and families, small businesses and corporations, and countries and economic blocs. If you find yourself struggling to pay bills, even after eliminating luxuries, then you defer some purchases, often painfully. Indeed, if you are homeless, cold and hungry, physical needs shrink time to the here and now – the next meal and a warm place to sleep trump all else. That is something worth remembering about those less fortunate as we face the post-holiday winter and a near-record cold across much of the continental United States.
When times are difficult, as an individual, you keep driving that aging car or truck, even as its reliability declines and the risks of major failure increase. As a small business owner, you defer that infrastructure upgrade, making do with what you have. As a CEO, you avoid risks, focusing on expense reduction and weathering the financial storm. As a country, you collectively focus on the short term, avoiding or mitigating the effects of recession, prioritizing short-term expenditures over long-term investments.
These sacrifices are natural and rational – in the short term. If continued too long, however, they ultimately lead to calamity and loss, as individuals suffer, infrastructure fails and the future becomes shrouded in a miasma of unfulfilled dreams. For all our physical wants and needs, we are creatures of dreams.
Make no mistake; balancing the immediate, pressing and real needs of the here and now against the uncertain and ill-defined future is a difficult task, made no easier by a cacophony of competing petitioners, each with compelling arguments and considerable needs. Yet it is precisely such a time when wisdom and foresight are required; it is the very definition of leadership. Although the needs of the present are real, dreams of the future must not be sacrificed on the altar of exigency.
Telling the Future the Past
Today, in the U.S. we face difficult challenges, with a growing backlog of deferred intellectual maintenance. Over-worked and sleep-deprived drivers are steering many of our vehicles of discovery on balding tires across potholed roads. Stripping away the metaphor, we are struggling to sustain appropriate investments in basic research infrastructure and facilities operations, and our dispirited researchers face ever-diminishing odds of research funding as they work to keep laboratories operational and students and post-doctoral research associates funded.
It is worth remembering that the Computer Science and Telecommunications Board (CSTB) of the U.S. National Academies released and updated the famous "tire tracks" diagram, illustrating the path from basic computing research ideas to major industries. In almost every case, the time from discovery to major societal impact was a decade or more, yet few could have imagined that impact at the time of discovery.
Today's ubiquitous smartphone or tablet has its roots in Engelbart's 1960s "mother of all demos," and a host of other advances in microprocessors, memory and storage systems, web services, and wired and wireless broadband communications. The same is true of cloud computing, advanced robotics, streaming multimedia, global positioning systems, and supercomputing. Each capability is the evolving outgrowth of decades of basic and applied research by tens of thousands of dedicated and passionate researchers. Their dreams of what might be became the computing and communications infrastructure that underpins today's society.
The impact of basic research is no less profound in a host of other domains, and today's choices have long-term implications for national and international competitiveness. As history has repeatedly shown, investment in the future – basic research – is integral to economic recovery and long-term growth. Yet by its very definition, the intellectual and pragmatic outcomes of specific research projects and directions are unpredictable. It is only in retrospect that we see the clear and unmistakable benefits – in medicine and public health, in design and manufacturing, in energy production and efficiency and yes, in computing and communications.
The Road Ahead
The past speaks urgently to the present about the future. It whispers about what could be, about dreams deferred and opportunities lost, about innovation and economic success, and about creativity.
It is time to put some new tires on the vehicle of scientific discovery and head out to the future. As Kerouac noted in On the Road, there's "Nothing behind me, everything ahead of me, as is ever so on the road." We do not know what we will find, but the journey itself is the destination. It leads to the future and a better world.
You can read the hearing charter and my extended, written testimony on the hearing web site and watch an archived video of the hearing. In my written and oral testimony, I made four points, along with a specific set of recommendations. Many of these points and recommendations are echoes of my previous testimony, along with recommendations from many previous high-performance computing studies.
With that backdrop, here is what I said during the hearing.
Oral Testimony
First, high-performance computing (HPC) is unique among scientific instruments, distinguished by its universality as an intellectual amplifier.
New, more powerful supercomputers and computational models yield insights across all scientific and engineering disciplines. Advanced computing is also essential to analyzing the torrent of experimental data produced by scientific instruments and sensors. However, it is about more than just science. With advanced computing, real-time data fusion and powerful numerical models, we have the potential to predict the tracks of devastating tornadoes such as the recent one in Oklahoma, saving lives and ensuring our children's futures.
Second, the future of U.S. computing and HPC leadership is uncertain.
Today, HPC systems from DOE's Oak Ridge, Lawrence Livermore and Argonne National Laboratories occupy the first, second and fourth places on the list of the world's fastest computers. One might surmise that all is well. Yet U.S. leadership in both deployed HPC capability and in the technologies needed to create future HPC systems is under challenge.
Other nations are investing strategically in HPC to advance national priorities. The U.S. research community has repeatedly warned of the eroding U.S. leadership in computing and HPC and the need for sustained, strategic investment. I have chaired many of those studies as a member of PITAC, PCAST, and National Academies boards. Yet these warnings have largely gone unheeded.
Third, there is a deep interdependence among basic research, a vibrant U.S. computing industry and HPC capability.
It has long been axiomatic that the U.S. is the world's leader in computing and HPC. However, global leadership is not a U.S. birthright. As Andrew Grove, the former CEO of Intel, noted in his famous aphorism, "only the paranoid survive." U.S. leadership has been repeatedly earned and hard fought, based on a continued Federal government commitment to basic research, translation of research into technological innovations, and the creation of new products.
Fourth, computing is in deep transition to a new era, with profound implications for the future of U.S. industry and HPC.
U.S. consumers and businesses are an increasingly small minority of the global market for mobile devices and cloud services. We live in a "post-PC" world where U.S. companies compete in a global device ecosystem. Unless we are vigilant, these economic and technical changes could further shift the center of enabling technology R&D away from the U.S.
Recommendations for the Future
First, and most importantly, we must change our model for HPC research and deployment if the U.S. is to sustain its leadership. This must include much deeper and sustained interagency collaborations, defined by a regularly updated strategic R&D plan and associated verifiable metrics, and commensurate budget allocations and accountability to realize the plan's goals. DOE, NSF, DOD, NIST and NIH must be active and engaged partners in complementary roles, along with long-term industry engagement.
Second, advanced HPC system deployments are crucial, but the computing R&D journey is more important than any single system deployment by a pre-determined date. A vibrant U.S. ecosystem of talented and trained people and technical innovation is the true lifeblood of sustainable exascale computing.
Finally, we must embrace balanced, "dual use" technology R&D, supporting HPC while also ensuring the competitiveness of the U.S. computing industry. Neither HPC nor big data R&D can be sacrificed to advance the other, nor can hardware R&D dominate investments in algorithms, software and applications.
Warning: This is a long post, reflecting the complexity and nuance of Internet governance.
Just before the 2012 holidays, the World Conference on International Telecommunications (WCIT), pronounced as “wicket” by the cognoscenti, concluded in Dubai. In the buildup to the WCIT, there was much handwaving and fearmongering, political position jockeying, and backroom negotiations by multinational companies, governments, non-governmental organizations (NGOs), technology policy wonks and political pundits. There were frequent stories in the trade and policy rags and the mainstream press, including the New York Times. How, you might ask, could such a seemingly obscure conference engender such an international frenzy?
In the spirit of full disclosure, I should reveal that I spent a good portion of the last three years working on this issue while heading Microsoft's Technology Policy Group. I traveled the world, meeting telecommunications policy leaders, trade associations, companies and senior representatives of governments around the world, as well as officials from the United States Departments of State and Commerce and the Federal Communications Commission (FCC). With that disclosure, as well as noting that what follows are my personal opinions, a bit of background and history is in order.
Telecommunications History
The International Telecommunication Union (ITU), a United Nations organization, organizes the WCIT. The ITU began as the International Telegraph Union, which suggests some of its origins and history. Originally responsible for coordinating global use of the radio spectrum and, more recently, assisting in the international assignment of satellite orbits, the ITU now operates through three sectors: the ITU-R (radiocommunications), ITU-T (standardization) and ITU-D (development).
As a UN organization, the ITU's members are countries, rather than individuals, companies or NGOs. Despite the one country, one vote governance model, there is multistakeholder participation in some informal aspects of the ITU. I was a frequent visitor to the ITU, and I participated in ITU retreats and CTO roundtables. I know the ITU leadership well.
Because major ITU policies and approaches, including tariffs, are codified as international treaties – the International Telecommunication Regulations (ITRs) – policy changes are fraught with all the complexities that accompany any international treaty. Indeed, before the 2012 WCIT, the ITRs were last updated in 1988. The languid pace of treaty change is but one of many challenges posed by the ITRs. It has also brought into question the relevance of the ITU and its policies and approaches. The country voting model and UN culture do not match the freewheeling Internet world.
Internet Time
Looking back twenty-four years to the previous WCIT takes one almost to prehistory in Internet time. The TCP/IP protocols were in place, but international networking was largely a research or private network curiosity, and dialup modems defined the consumer network experience. The World Wide Web was still a gleam in the eye of Tim Berners-Lee; the Mosaic web browser had not burst on the scene; the dot.com frenzy was yet to come; mobile phones were rare, expensive and bulky; and plummeting international telephony charges due to IP-based audio and video calling were still over the horizon. In short, the telecommunications world of 2012-2013 is radically different from the staid landscape of 1988.
The technological and economic changes wrought by the Internet in those twenty-four years are equaled by the social and cultural changes. Geographically and politically separated regions are now digitally interconnected in ways unthinkable just a few years ago. The global flow of information has challenged governments, companies, NGOs and individuals to adapt legal frameworks, regulations, technical operations, social expectations and national security operations. (See Globally Connected, Globally Worried and Information Privacy: Changing Norms and Expectations for a bit of perspective.)
Multifaceted Internet Governance
One can parse Internet governance challenges in several ways: technical; legal and economic; social and ethical; and national security. Let's begin with the technical issues, which are perhaps the simplest. Operating the global Internet effectively requires de facto adherence to an evolving set of technical standards. The Internet Corporation for Assigned Names and Numbers (ICANN) manages the domain name system (DNS) and IP address blocks on behalf of the international community.
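To make that coordination concrete, here is a minimal sketch of the name-to-address lookup that nearly every Internet application performs; the hostname is just an illustrative assumption. Each such lookup quietly depends on the globally coordinated namespace and address space that ICANN stewards:

```python
# A minimal illustration of what DNS coordination underpins: resolving a
# name to an address via the globally coordinated namespace.
import socket

# Hypothetical hostname, chosen only for illustration.
address = socket.gethostbyname("www.example.org")
print(address)  # an IP address drawn from the globally managed address blocks
```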
Although not without controversy, ICANN and its sister organization, the Internet Society's Internet Engineering Task Force (IETF), have generally been effective in creating a multistakeholder governance model for Internet operations. However, the tizzy over generic top-level domain names (gTLDs) and intellectual property protection has heightened government desires to control ICANN.
Shifting to economics, companies, countries, NGOs and privacy advocates all worry about the transnational flow of data, differing laws and intellectual property protections, jurisdictional constraints and safe harbors. If you are a Kenyan national working for a German company with operations in Brazil, and you travel to India, just whose laws govern you and your company? The answer is murky, but generally all of the above, plus some others.
Nevertheless, Internet service operators must respond to civil and criminal legal requests for data every day. For those subject to multiple legal jurisdictions, it is sometimes impossible to satisfy the laws of all the countries involved. For those interested and suffering from insomnia, I highly recommend reading the Bank of Nova Scotia case (United States v. Bank of Nova Scotia, 740 F.2d 817 (11th Cir. 1984)).
Mixed with all of these legal and economic issues are elements of international trade, protectionism and economic development. The rise of cloud computing, whose economics have driven massive scale and data consolidation, has exacerbated these concerns about extraterritorial jurisdiction and control worldwide, particularly when U.S. companies dominate cloud computing and many fear the U.S. PATRIOT Act.
Then there are the crucial issues of human rights and free speech. How does one manage global communication, which brings differing international norms and expectations for freedom of speech into direct, day-to-day conflict? It is more than an abstract question for millions of Internet users, and one with no simple answers.
One can debate the ethics, human rights records and legitimacy of certain governments, but the sovereignty of nations and their right to establish laws within their territory is an unquestioned aspect of international law. I have seen senior representatives of governments with widely divergent views on freedom of expression all agree that their governments have a critical role to play in Internet governance. Those same representatives differed markedly in their delineation of appropriate governmental roles, their definitions of free speech and its appropriate limitations.
Finally, there are issues of national security, information security and cyberwarfare that are beyond the scope of this essay. In a knowledge economy, information is advantage, and economic or technical disruption via the Internet can itself be a form of low-grade warfare. Likewise, with military capabilities increasingly dependent on smart, network-connected weapons, those capabilities are themselves objects of defense and attack.
Back to the WCIT
This interplay of technology, international law, economics and trade, social norms and human rights, and national security is a witch's brew of complexity, with diverse stakeholders and perspectives. Many of them encamped in Dubai at last December's WCIT.
Thus, it is not surprising there was acrimony and controversy, with claims and counterclaims; one could hardly expect otherwise. Central to these debates were concerns, some say unfounded, though others disagree vehemently, that the U.N., via the ITU, might assume greater control of Internet governance, shifting from the multistakeholder model toward greater centralization and government control. Some of this was also tied up in the ITU's own search for new relevance.
What emerged from two weeks of painful negotiations was what can best be described as an uneasy peace. After much debate, 89 of 152 countries signed an amended version of the ITRs. The U.S., Japan, Canada, Germany, India, and the U.K. were not among the signatories, objecting to attempts to curtail multistakeholder governance. In short, it was an ugly mess.
Futures
What's next? The global community is strongly divided along ideological lines. Thus, we are likely to see even greater dichotomy in government intervention, more uncertainty in international law, and limits on global information flow. Despite this, I believe the Internet will continue to grow and evolve organically. Too many stakeholders want that to happen, and their voices must be heard.
Thus, I am a strong proponent of the multistakeholder model. I do recognize, though, that governments have an important role to play, just as they do in other domains. This is a messy process, and it will undoubtedly continue that way. Such is the nature of debate.
After the United States Congress Joint Select Committee on Deficit Reduction, otherwise known as the Supercommittee, failed to create an acceptable bipartisan proposal for addressing the U.S. Federal budget deficit, both parties decided to defer further discussions until after the November 2012 elections. As the January 2013 deadline for automatic, across-the-board cuts draws ever nearer, the discussions have begun again, albeit with accusations and finger pointing on both sides of the political aisle.
Research Interests
Amidst this backdrop, all of us in the research community have been sounding the alarm regarding the consequences of the research cuts that sequestration would necessitate. There is no doubt that substantial cuts to basic research would adversely affect the long-term future of U.S. innovation and global competitiveness, upset already strained university budgets, damage current research projects in a wide range of disciplines, and disrupt the lives of thousands of faculty, post-doctoral associates and students.
That said, it is important to acknowledge that we in research are a special interest group, though one whose interests are vital to the future. I realize that some would take umbrage at my description of research as a special interest group, but in the political lexicon, we are, just as health care and environmental protection are. Unless one accepts the realpolitik of budget exigencies and the conflicting goals and objectives of large, disparate multiparty negotiations, the research community will be neither effective in making the case nor realistic in managing the process and likely outcomes.
One cannot simply cry, "this is good" or "this is bad"; one must make cogent arguments about why certain choices yield differential benefits for the budget negotiators' positions and policies, and why those choices are better than others. (See Being Bilingual: Speaking Technology and Policy.) Remember, there are far more good ideas than there are resources, and this is equally true in government and science.
Creating Opportunity
Despite the political polarization in Washington, I still believe a budget compromise will emerge. It will not be perfect – such is the very nature of compromise – but I suspect it will include some acceptable combination of revenue increases and budget reductions. Despite its politicization, there is still broad recognition of the importance of basic research; I believe it will fare relatively well when the Sturm und Drang is done.
However, with research proposal success rates plummeting and Hobbesian choices between research infrastructure and investigator support now necessary, we face major challenges. In the apocryphal phrasing of Ernest Rutherford, "We have no money. We must think."
Thinking will undoubtedly mean questioning some perceived verities and deeply held beliefs. NIH R01 awards will no longer be de facto expectations for promotion and tenure. Research infrastructure sharing across institutions may well become the norm, and not just for large-scale instrumentation. Cross-disciplinary tradeoffs about relative investment will become even more pressing. Industry-academic partnerships will rise in importance, as we develop more effective and mutually beneficial industrial collaboration frameworks. However, these industry partnerships will not be a surrogate for federal funding of basic research.
Whatever the outcomes, by revisiting some of our assumptions, we can create more free energy in the research system and dedicate precious resources to new and emerging opportunities. I am confident that many of these new opportunities lie at our disciplinary interstices, and hybridization and cross-fertilization can yield new insights and outcomes. More broadly, the coupling of the arts and humanities with public policy, science and engineering, and biomedicine can be transformative. This is consilience in its highest form. However, we must think.
Take heart and keep the faith. The future can be and will be bright – if we make it so.
I am a member of the U.S. National Academies Board on Global Science and Technology (BGST). As the name suggests, the role of BGST is to examine the shifting nature of the global science and technology enterprise and its implications. These include the global flow of intellectual talent and capital, the interplay of government policies, research and development priorities, innovation and technology transfer, and global competitiveness and security. This is a wide-ranging mandate, which is both exciting and challenging.
Lest this seem mere academic punditry, remember that 30-40 percent of U.S. net economic growth in recent decades has been due to advances in information technology. This is not just a tale of Silicon Valley, but also one for the entire country's economy.
Beyond Dennard Scaling
Moore's Law, the notion that the number of transistors in a given silicon area doubles roughly every two years, is not a law or even a theorem. Rather, it was an empirical observation, originally made by Gordon Moore in 1965. For over forty years, it has continued to hold true by virtue of enormous intellectual effort, ongoing architectural and software innovation, and billions of dollars of investments in process technology and silicon foundries. In turn, consumers, businesses and governments have been the happy beneficiaries of faster microprocessors, more powerful, inexpensive and ubiquitous computing devices and a rich and varied suite of software applications.
However, there is bad news. The continued and seemingly miraculous improvement in general-purpose processor performance is now over, a consequence of the end of Dennard scaling, the happy property that transistors became more energy efficient as they shrank. To be clear, Moore's law continues, with the number of transistors on a chip continuing to double, but those transistors no longer become more energy efficient as they shrink. The result has been bounds on microprocessor clock rates due to energy dissipation constraints and the concomitant rise of multicore chips and function-specific accelerators such as GPUs. (See Battling Evil: Dark Silicon and Nothing Is Forever for a few reflections and details.)
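For readers who want the arithmetic behind that claim, here is a back-of-the-envelope sketch under idealized, textbook Dennard assumptions; the scaling factor κ is illustrative, and real processes deviated from these ideals well before the trend ended:

```python
# A minimal sketch of classic Dennard scaling (idealized assumptions).
kappa = 1.4  # assumed linear shrink factor per process generation

# Under Dennard's rules, voltage and capacitance scale by 1/kappa,
# while switching frequency scales by kappa.
voltage = 1 / kappa
capacitance = 1 / kappa
frequency = kappa

# Dynamic power per transistor: P ~ C * V^2 * f  -> 1/kappa^2
power_per_transistor = capacitance * voltage**2 * frequency

# Transistor density grows by kappa^2, so power density stays constant:
print(power_per_transistor * kappa**2)  # ~1.0: faster chips, no hotter

# Post-Dennard, voltage no longer scales (V ~ 1). Holding the same
# frequency gains would make power density grow by kappa^2 per generation:
print((1 / kappa) * 1.0**2 * kappa * kappa**2)  # ~2x: the "dark silicon" bind
```

The last line is why clock rates stalled: the only way to keep chips coolable was to stop raising frequency and spend the extra transistors on more cores and accelerators instead.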
This radical shift breaks a virtuous cycle of mutually reinforcing benefits, one where software developers created feature-filled applications and stimulated demand for faster general-purpose processors, which then drove the creation of even more advanced applications. Simply put, we are the reluctant, wide-eyed denizens of a brave new world, one where the cherished and popular expectation that applications would run faster, without change, each time a new backward-compatible processor appeared no longer holds.
There is a technical way forward, but it means embracing application parallelism and retargeting software to each new generation of non-compatible, heterogeneous multicore processors. As over fifty years of research in parallel computing has shown, this is a path fraught with pain and difficulty. In turn, this has profound implications for the future of the silicon ecosystem and the nature and locus of continued innovation. It is the trillion-dollar inflection point, pivoting on chip performance, business models and global markets.
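For a feel of what "embracing application parallelism" means in practice, here is a minimal, hypothetical sketch, not drawn from any particular HPC code, of a computation restructured to spread across cores rather than waiting on a faster clock:

```python
# A minimal sketch of data parallelism: partition the work, compute partial
# results on separate cores, then combine them.
from concurrent.futures import ProcessPoolExecutor
import math

def partial_sum(bounds):
    # Each worker computes its share of the total independently.
    lo, hi = bounds
    return sum(math.sqrt(i) for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 4_000_000, 4  # illustrative sizes
    chunks = [(i * n // workers, (i + 1) * n // workers) for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)
```

The pain the paragraph describes lies in what this toy hides: real applications must be decomposed, load-balanced and often rewritten for each new heterogeneous target, not merely recompiled.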
Ecosystem Implications
The end of Dennard scaling and the emergence of heterogeneous multicore processors have coincided with another shift, the transition from an era dominated by PCs to one defined by smartphones and tablets. For much of the world, the smartphone is now the primary computing system, and in developing economies, the aspirational feature phone is the only computing device. More to the point, the majority of PC and smartphone users are not in the U.S., nor will they ever be again. Not surprisingly, these two phenomena, the end of Dennard scaling and the rise of smartphones and tablets, are deeply interrelated.
The PC ecosystem has long been driven by the phenomenal success of the x86 microprocessor family and successive generations of processors from Intel and AMD, both U.S. companies. Conversely, the smartphone and tablet ecosystem is largely based on the ARM microprocessor family and low power systems-on-a-chip (SoCs) developed by ARM licensees around the world. Beyond the ongoing competition between PC and smartphone vendors, this is a battle of business models, pitting a closed x86 ecosystem with captive silicon foundries against fabless semiconductor design firms that mix and match function-specific accelerators with ARM cores and use Taiwan Semiconductor Manufacturing Company (TSMC) as a foundry.
Enormous resources are being invested in both silicon ecosystems, with x86 designers seeking to "grow down" by reducing power and integrating functionality on chip to compete in the smartphone and tablet market. Conversely, ARM designers are seeking to "grow up" by increasing performance and adding features to compete with x86 designs in the server market. This battle royal is not winner take all. Rather, it is a competition to define the global nexus of innovation, with profound implications for global IT dominance in the next decade.
Global Competitiveness and BGST
It is with this backdrop that the BGST committee examined the technical consequences of the end of Dennard scaling, the cultural and economic challenges of parallelism, the possible shifts of capital and talent, and national and regional investments in IT research. The report's broad conclusions include the following: (a) the U.S. still leads in basic IT research, but the global gap continues to shrink, (b) IT investment strategies and challenges differ markedly across countries and regions, (c) single chip performance is unlikely to continue as the predominant focus of innovation, (d) there are serious risks that growing international markets will diminish U.S. influence and (e) U.S. national security and defense readiness depend on continued rapid uptake and deployment of advanced IT.
I encourage you to download and read the complete BGST report for additional details and insights.
In IT and innovation circles, Andrew Grove is also famous for another dictum, "only the paranoid survive." What he really said is more nuanced and relevant to the global innovation competition: "Success breeds complacency. Complacency breeds failure. Only the paranoid survive." It is worth pondering as one considers the interplay of science and technology, economics, government policy, and business models.
Acknowledgments
I would be remiss if I did not express my sincere thanks to the members of the BGST report committee: Cong Gao (Nottingham), Tai Ming Cheung (UCSD), John Crawford (Intel), Dieter Ernst (East-West Center), Mark Hill (Wisconsin), Steve Keckler (NVidia), David Liddle (U.S. Venture Partners), and Kathryn McKinley (Microsoft). They were thoughtful, dedicated and indefatigable. Finally, all of the committee members are deeply indebted to Bill Berry, Ethan Chiang and Lynette Millett from the National Academies.
Over the past year, I have been ruminating on the seismic shifts rocking public higher education in the United States. The compact between our society and its public research universities is being renegotiated in ways that are as deep as anything seen in the past forty years. State support continues to decline, accelerated by the economic downturn. In turn, a public backlash is building against rising tuition. There is an increasing need for lifelong learning and skills refresh, and new technologies are challenging historical modes of content delivery.
There are also new expectations for research discoveries to stimulate innovation, coupled with often-unrealistic hopes for short-term economic payoffs from basic research. Amidst all of this, globalization and rapid technology shifts are forcing us to address complex societal issues in new ways. Finally, the nature of scientific discovery itself is in flux, with high-performance computing and big data reshaping research in the physical, biological and social sciences, and even in the arts and humanities.
Late last year I decided it was time to return to academia, taking what I have learned at Microsoft back to the university and laboratory world to help address these challenges. Since that time, I have been working quietly to ensure a smooth transition within Microsoft and working with the leadership of several universities and institutions to define the roles I would take on this fall.
Reflecting on Change
Our personal and professional lives are defined by a series of inflection points – graduation, marriage, career choices – shaped by shifting technology and societal norms. Each of us also faces the age-old questions: How and where can we most make a difference in addressing the big issues and the complex problems surrounding them? How do we build on our experiences while also challenging ourselves to learn new things?
Before I came to Microsoft in late 2007, I spent nearly twenty-five years in academic roles at the University of Illinois and the University of North Carolina, first as a computer science professor, then as department head, supercomputing center director (NCSA), founder of a multidisciplinary research center (RENCI), and finally as a vice-chancellor. Where and how could I best build on that experience, plus the insights gained at Microsoft? Was the answer technology or policy centric, or some combination of both?
Iowa: Research and Education
After weighing several university offers, spanning big data, HPC and policy, I am delighted to be returning to the Midwest. In October, I will be joining the University of Iowa as Vice President for Research and Economic Development and holder of Iowa's inaugural University Computational Science and Bioinformatics Chair, with joint appointments in Computer Science, Electrical and Computer Engineering and Medicine. For details, see the University of Iowa announcement.
Many things attracted me to Iowa. First, it is one of this country's great public universities, spanning, as all great universities do, the arts and humanities, science and engineering and the associated professional schools. The university is also home to the famed Iowa Writers' Workshop, something the aspiring writer in me prizes greatly. It also has a large and highly ranked health care system and a great medical school. (More on that research opportunity in a bit.)
Finally, the University of Iowa is anchored in the Big Ten, where I spent most of my academic career (Illinois) and time in graduate school (Purdue). Yes, it is football season in the U.S., but the Big Ten is more than an athletic conference. The Committee on Institutional Cooperation (CIC), which consists of the Big Ten schools plus the University of Chicago, is a collaborative vehicle for shaping national higher education policy and helping define the future on research issues ranging from institutional data repositories to intellectual property management.
In addition to my role on the university leadership team, as my new title suggests, I will also be delving into computational science and big data, helping the campus address research opportunities and health care futures. It is no secret that the rising cost of health care in the United States, coupled with an aging population and the not-yet-fully-realized potential of personalized medicine, presents both challenges and major opportunities. In that spirit, I am delighted that the chair of the University of Washington Department of Anesthesiology, Dr. Debra Schwinn, is joining Iowa as the new Dean of the Carver College of Medicine. I am looking forward to working with her and the rest of the campus.
Whether identifying predictive patterns from clinical records (e.g., predictors of hospital readmission), correlating and extracting insights from massive amounts of new bioinformatics data, or leveraging new sensors and devices for disease and lifestyle monitoring, large-scale data analysis and machine learning are crucial. Likewise, multidisciplinary computational models of biological processes with predictive power are now realizable. Simply put, these are big data and technical computing problems par excellence.
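To make that concrete, here is a deliberately toy sketch of the first kind of analysis mentioned above, predicting hospital readmission from clinical features. The features, synthetic data and model choice are illustrative assumptions, not a description of any actual project:

```python
# A minimal, hypothetical sketch of readmission prediction from
# clinical-record features, using synthetic data for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Hypothetical features: age, length of stay (days), prior admissions
X = np.column_stack([
    rng.normal(65, 12, n),   # age in years
    rng.exponential(4, n),   # length of stay in days
    rng.poisson(1.5, n),     # admissions in the prior year
])
# Synthetic label loosely tied to the features, purely for illustration
logit = 0.03 * (X[:, 0] - 65) + 0.2 * X[:, 1] + 0.5 * X[:, 2] - 2.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```

The real problems are of course vastly larger and messier: millions of records, thousands of features, and hard questions of privacy and causality, which is precisely why they are big data and technical computing problems.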
Finally, I will also be keeping my hand in high-performance computing and national policy. I will be spending time in Washington, DC, as a consultant, focusing on issues related to big data and exascale computing.
For me, all of this is very exciting. It is a new adventure and an opportunity to help define higher education in the 21st century. As Theodore Roosevelt said, it is an opportunity to "dare mighty things," working together.
Finally, Thanks to Microsoft
I want to express my deep thanks to Craig Mundie, Rick Rashid and a host of friends at Microsoft for a great five years. When I first joined Microsoft Research, it was to work on new technical approaches to cloud computing hardware, software and applications, drawing on ideas from technical computing. Seeing the scale and scope of truly large-scale infrastructure, far larger than our research high-performance computing systems, was amazing. That eXtreme Computing Group (XCG) activity later morphed into an equally exciting technology policy agenda spanning topics as diverse as the application of clouds to scientific and engineering research, telecommunications, Internet governance and digital privacy.
My time at Microsoft has been a truly wonderful experience, working on important problems with talented and passionate people. I have made new friends, built new relationships and learned an incredible amount.
Last week, I had the privilege to participate in the B20 meetings in Los Cabos, a prelude to the G20 economic summit. For those of you who do not know, the B20 is a meeting of international CEOs who gather to discuss economic issues and provide suggestions to the G20 leaders. I was there representing Microsoft and substituting for Steve Ballmer.
The B20 meeting opened with a plenary presentation by the Mexican President, Felipe Calderón, who was also chair of this year's G20 meeting. The President gave a rousing speech advocating free trade and its economic benefits. This was followed by a rather sobering panel conversation with Christine Lagarde, head of the International Monetary Fund, Bob Zoellick, head of the World Bank, and Angel Gurria, secretary general of the OECD.
Not surprisingly, the Eurozone crisis and the then-ongoing Greek elections dominated conversation. The frustration and near despair were palpable, with Gurria noting that fiscal union without effective governance was the root of the problem. I found myself thinking that this was an old story. After all, the U.S. fought a civil war to resolve, among other issues, the relative power of the states and the Federal government. In any case, there were repeated pleas for strong and consistent action to resolve the financial crisis permanently, but limited faith that it would happen.
The spirit of the conversation was captured succinctly in an exchange with a B20 delegate from Africa, who rose to ask why the conversation was dominated by European concerns. It was a quite reasonable and appropriate question at a global economic panel. Each of the panelists smiled wryly and said, in their own words, "In this context, no mention is good news."
My takeaway is that the global economic situation is not going to get better any time soon. The Chinese economy is clearly slowing, the Eurozone crisis seems endless, and the U.S. situation is tenuous at best.
ICT Recommendations
As I mentioned at the outset, I co-chaired the B20 ICT and Innovation task force. The core recommendations focused on broadband access and digital inclusion, with four main pillars, and they were presented in a panel discussion, with comments from the Chilean President, Sebastián Piñera.
Enabling Broadband for All.
This recommendation is an extension of the work I have been doing on spectrum and telecommunications policy, including white spaces. The economic and political circumstances in each country and region are different, with different regulatory regimes and wired and wireless incumbents. However, the economic data are clear and unmistakable. When broadband – at affordable rates – is available to a substantial fraction of the population, net economic growth is higher. The details of the recommendation are in the report, but the focus is on removing regional and national regulatory restrictions, stimulating competition, supporting new business models, and making devices available to stimulate deployment and access.
Developing Content and Applications for the Public Good.
Those of us in developed countries, and particularly English-speaking ones, tend to forget that access to local content and software, in one's native tongue, is more limited in other parts of the world. Local content and services, including government data and transparency initiatives, can increase citizen participation in government, improve political transparency and stimulate economic growth.
Ensuring Cybersecurity for All.
These recommendations centered on global uptake of best practices and public-private partnerships to ensure secure access to services.
Promoting Innovation in ICT.
These were the standard, oft-repeated pleas for appropriate support for innovation and risk taking.
Personal Reflections
Over the past six months, I have had several opportunities for small group conversations with President Calderón. I have been impressed by his sincerity and his commitment to collaboration and open government. It will be interesting to see what happens to Mexican policies after the Mexican presidential elections in a few months.
As one would expect, there was intense security around the leaders. From this event, my earlier participation in the Asia-Pacific Economic Cooperation (APEC) meetings and visits to the White House, I have come to realize that security details for senior government officials all look very similar. There are dark suits, the subtle but clear bulges of weapons, and the constantly searching gazes.
Finally, the gala events provided an opportunity to mingle and discuss politics, economics and policy with a diverse and interesting group. There are not many places where one can first discuss social inclusion with a Peruvian cabinet minister, then join a dinner conversation with the South African ambassador and two CEOs. And, of course, there are the obligatory photographs of me in a guayabera.
In recent months, I have spent a great deal of my time talking to governments about science and technology policy and innovation. Money, or more accurately, the lack thereof, is a common theme that runs through all of the conversations, whether in the United States, the European Union, Japan or other parts of Asia. The global economy is still struggling to recover from the recession of 2008, the economic malaise in Europe is very real, and political and economic gridlock in the U.S. threatens the country's ability to chart a competitive future. Despite these challenges, indeed, often because of them, most world leaders are seeking innovative ways to stimulate economic growth and address deep societal problems – aging populations in developed countries, regional environmental concerns and global climate change, shifting education and workforce needs, and rising health care costs.
Today's world leaders and government ministers are surrounded by challenges, buffeted by conflicting (and sometimes irreconcilable) demands, and struggling to cope with rapid technological and societal change. The story is not much better if you are a U.S. governor or the mayor of a large city. Almost all who are willing to be candid will say that the job has never been more difficult. In short, being a senior government leader, at whatever level, is not a great gig right now.
How, you might ask, does this relate to science and engineering research in general, and to high-performance computing in particular? It all devolves to questions of size and scale, something those of us in science understand well, and to a few lessons in realpolitik, where we are often laggards in the classroom of political deal making. (See Being Bilingual: Speaking Technology and Policy.)
The Ask and the Close
When most of us are asked to justify government investment in basic research, or to make a case that the government should fund a specific project or research infrastructure, we reach for hoary adages. We claim that we are pushing back the frontiers of human knowledge and seeking answers to questions that have bedeviled or vexed humankind since we first had the intellectual capacity to look around and ask "why?" If truly pressed, we will talk about technology transfer and how research ideas have often birthed multi-billion dollar industries. We may even talk about educating a new generation of scientists and the importance of investment in human capital.
These statements are all true, and like many of you, I have used them many times. They are powerful and effective arguments, given the right initial conditions. However, arguments for science and engineering research funding are not made in vitro; they live or die in vivo. If there were a surfeit of funding, goodness arguments alone might suffice. However, I can think of no place, with the possible exception of China, where today's governments find themselves with the budgetary largess to invest in new things. In most cases, governments are struggling to cut or balance budgets, research included.
This brings us to the issue of scale. When initiative or project budgets are measured in the millions of dollars, euros or the equivalent, funding decisions often can be made without rising to the highest levels of political discussion. In most cases, it is a committee or ministerial decision to "plus up" a budget to support such activities. Supercomputing funding was once decided at this level. After all, the U.S. NSF supercomputing centers program started with an aggregate budget of ~$70M/year in the early 1980s. With single petascale machines costing hundreds of millions of dollars, this is no longer true.
When proposed science and engineering projects or initiatives have estimated costs in the billions of dollars, they cross a critical social and governmental decision threshold. At this level, they compete visibly and directly with other national priorities – national defense, social services, health care, and public infrastructure – and the old arguments are no longer sufficient. A different dialog ensues, one involving participants who are often neither well versed in science and engineering, nor particularly supportive of its interests.
Instead, at the larger scale, one must make arguments for science and engineering investment based on both scientific benefit and other national priorities -- national security (e.g., nuclear weapon stockpile stewardship), national competitiveness (e.g., sustaining a national industrial base) and societal benefit (e.g., creating higher quality, lower cost health care). History has shown that this is difficult, but quite possible. One need only look at the termination of the Superconducting Super Collider (SSC), the fractious debate over completion of the James Webb Space Telescope, and the concerns over continued support for the International Thermonuclear Experimental Reactor (ITER) to see the challenges that accrue to nation-scale and planetary science and engineering projects, particularly when there are cost overruns.
The Exascale Moral
All of our project cost estimates for an exascale initiative place it well in excess of one billion dollars. Thus, the initiative is squarely in the realm of national priority debates, and the science and engineering case, though absolutely necessary, is no longer sufficient. One must also marshal arguments that address national, regional and global needs, and demonstrate differential advantage relative to other competing priorities. This can be done. Remember, the U.S. Accelerated Strategic Computing Initiative (ASCI) was funded using just such a combination of arguments.
Finally, one must create advantage for those government leaders who might be advocates or allies, as well as counter the arguments of opponents. After all, both advocates and opponents face difficult choices and other constituencies who will feel disenfranchised, regardless of the choices. They need arguments that have political heft and popular credibility. This is the essence of realpolitik.
Millions and billions: the difference is not just quantitative; it is politically qualitative. We must couple scientific advances and societal advantage in our arguments to make the case for continued support for scientific computing and exascale infrastructure – hardware, software, and data analysis. We can do this, working together.
In hindsight, it seems obvious that elevating the profile of computing and networking and coordinating activities across the research arms of the Federal government was a great idea. It was far less obvious at the time, as former Vice-President Al Gore reminded us during an anecdote-filled lunchtime reminiscence. There were several hearings over multiple years, with many doubters.
Yet only two years later, the Mosaic web browser was born at NCSA at the University of Illinois, helping birth the Internet revolution and the original dot.com boom. Some of you may not know that Mosaic was intended as a collaboration tool, itself the successor to another NCSA collaboration tool called Collage. Anyone see a naming pattern there? It was all about bringing people together who were otherwise separated by time and space.
At the time of Mosaic's release, NCSA was an anchor site of the NSFNet, the nascent backbone of the Internet we know today. Given the popularity of Mosaic and the fact that NCSA's web site was Mosaic's default home page, the NCSA site, hosted by NCSA HTTPd, was the world's busiest. (Bob McGrath, Thomas Kwan and I wrote an early paper on web traffic analysis, and Will Scullin, Steve Lamm and I developed some real-time traffic visualization tools for the CAVE.)
It was also a time when Illinois undergraduates were asking me if a $250K signing bonus was too small, and the startup mantra was "get big fast," focusing on page clicks and customer counts. Eventually the old economics – the one based on profits – demanded its due, and the crazier startup ideas died. For the record, selling dog food (a low-cost, high-weight item to ship) on the Internet may not have been the best business plan, though a few such ventures succeeded. More rational business models emerged, and today it is hard to imagine the world without e-commerce.
Looking Forward
At the NITRD celebration, I was reflecting on all of this while digesting the technical content of the presentations, which spanned topics as diverse as computational science, the economic impact of computing and the rise of big data. It was in this context that I posed a question to the final panel, a question grounded in the ever-rising importance of information technology and innovation to global economic competitiveness: What national research strategy should we pursue in light of the coordination now present in other parts of the world?
The question was really about whether the HPCC initiative, with all of its economic, scientific and cultural benefits, was a singular event or something replicable in today's political environment. It was a bit of a rhetorical question, but one that seemed appropriate to frame the celebration's context. The unflappable and ever-thoughtful Chuck Vest gamely responded with some sage observations on the importance of educational investment for the future.
Burnham and Sandburg
Former Vice-President Gore didn't invent the Internet, but he fully deserves the praise he has received for raising the issues and helping create the conditions that let it grow and flourish. Musing on this and the NITRD discussion, I found myself thinking about another Illinois story, one that captures the tumultuous change of another century and the singular contributions of another individual.
It's the story of Daniel Burnham and the birth of modern Chicago. Daniel Burnham didn't invent Chicago, but he might as well have, for it bears his indelible stamp. He was one of the driving forces behind the 1893 Columbian Exposition, which celebrated the 400th anniversary of Columbus' arrival in the Americas. The Exposition and its "White City" were Chicago's coming-of-age party, elevating Chicago in stature as one of the world's great cities. Mind you, this was barely two decades after the Great Chicago Fire, which had destroyed much of the city.
One element of the story has Burnham standing on the shore of Lake Michigan on a cold, winter day, pointing into the distance and describing the buildings and the city that would rise from the windy desolation. A visiting architect turns to Burnham and asks, "How can this be?" To which Burnham is reputed to have replied, "It is already decided." And so it was.
The anecdote is probably apocryphal, but the vision and the outcome were decidedly not. As Burnham himself wrote,
Make no little plans. They have no magic to stir men's blood and probably themselves will not be realized. Make big plans; aim high in hope and work, remembering that a noble, logical diagram once recorded will never die, but long after we are gone will be a living thing, asserting itself with ever-growing insistency.
It's the emotion Carl Sandburg also captured in his poem Chicago, which celebrates the city as tool maker, stacker of wheat and player with railroads, the city of big shoulders. It was the spirit of a city and a young nation, confident and excited about the future. It's the same spirit we all felt in 1991 at the beginning of the HPCC program and then in the web revolution.
No Little Plans
As we look to the future, I believe Burnham was right. It's time to make big plans, defining a truly compelling research and education strategy for the 21st century knowledge economy, one that inspires and compels us all to action. We need not be riven by doubts and troubled by today's financial research malaise. There is another way, one that rebuilds our research institutions, empowers our citizens and creates new opportunities for all. Amazing and transformational things are within reach.
How can this be? The lessons of NITRD and Burnham's Chicago point the way, working together.