This is a personal blog updated regularly by Dr. Daniel Reed, Vice President for Research and Economic Development at the University of Iowa.
These musings on the current and future state of technology, scientific research and innovation are my own and don't necessarily represent the University of Iowa's positions, strategies or opinions.
I spent my childhood in a small town, where the human population (825) was substantially outnumbered by the wildlife. It was a place where time seemed to move slowly; events from the outside world percolated gradually into the local environment and rarely affected us immediately. The single individual who filled the roles of reporter, editor, circulation manager, printer and advertising executive of the local newspaper struggled each week to fill its pages with "news." After all, there were only so many deaths, marriages and births to report.
The slow-moving world of my youth has been eclipsed by instant communication, where news flows across the planet in milliseconds, from far-flung sources to a globally distributed audience. It has all been made possible by the explosive growth of inexpensive computing systems, mobile communications, the worldwide fiber network, and the rise of cloud services and social networks. Today, there are few places on the planet where one can be truly incommunicado. (See The World Is Small.)
Although the free flow of information is a wondrous thing, the concomitant decline in social and economic hysteresis now means that single events can have global effects on a frighteningly small timescale. Sometimes this can be humorous (e.g., the latest news on an entertainer's social antics), other times it is newsworthy (e.g., a world sporting event), and still other times, it is profound (e.g., a social or government revolution).
The rapid, worldwide response to news of economic uncertainty has been both newsworthy and worrisome. It has affected European, U.S. and Asian financial markets, with fear and uncertainty in each driving responses in the others. Without doubt, it is an exercise in global crowd psychology and behavioral economics that has implications for individuals, families and their future.
In a world where information traveled by sailing ship or steamship, our economic markets were much less tightly coupled. Consequently, there was time for more reflection and deliberation, even if the global markets may not have been efficient in a market sense. I am not suggesting we return to that past. We cannot, nor would we wish to do so.
Like all complex systems, our global economic and social environment operates on multiple time scales. Some of these are amenable to human reflection, others are increasingly driven by real-time communication. It is important that we remain cognizant of the power and value of timely reflection and discussion and of the personal effects of our market actions. (See Disruptive Innovation: Changing Lives.)
This is an exciting time, when the future becomes the present. Who could have imagined access to the world's knowledge base in the palm of your hand, anywhere, any time? Who could have imagined the ability of entrepreneurs to project global business presence without their own IT infrastructure and technical staff?
Such is the power of the cloud. It creates enormous opportunities for businesses – large and small – to create new products and services, operate more nimbly and efficiently, advancing the digital economy and our global leadership. It also creates opportunities for governments to operate more efficiently and offer services and capabilities that were heretofore largely inaccessible. However, like all new technologies, the cloud brings challenges as well.
Today, the TechAmerica Foundation released its cloud computing report. I was privileged to be the commercial co-chair, working with a host of industry and academic leaders drawn from across the spectrum of cloud providers, service and software developers and researchers.
Recommend actions that would ensure the competitiveness of U.S. companies and cloud providers, domestically and internationally.
The report identifies a set of best practices and actions, in response to both charges. These recommendations focus on a diverse set of topics. These include: (a) the need to update U.S. laws (e.g., the 1986 Electronic Communications Privacy Act) to reflect changing technology and legal needs; (b) clearer rules and processes on data breach disclosures to maximize transparency; (c) realization of an identity management ecosystem as envisioned by the National Strategy for Trusted Identities in Cyberspace (NSTIC); (d) sufficient bandwidth and web addresses to enable reliable connectivity to the cloud; and (e) clarity around the rising issues around transnational data flows. (See Information Privacy: Changing Norms and Expectations.) Finally, the report discusses the need for ongoing industry and academic research to advance cloud computing technologies and the cultural issues that often limit the deployment of new technologies.
As William Gibson once noted, the future is here; it is just not evenly distributed. We have the opportunity to distribute the future more evenly, making client and cloud services universally available, creating jobs, ensuring American competitiveness and empowering innovation. I believe we can and we will, working together, private and public sector, to ensure U.S. competitiveness and cloud uptake. I encourage you to read the report, offer comments and engage in the ongoing dialog on the rich and rapidly evolving world of client and cloud services.
Finally, on a personal note, I am especially indebted to my colleague, Elizabeth Grossman, who worked long hours under stringent time constraints to help complete this report.
As part of this agreement with the NSF, Microsoft is making available access to Windows Azure, Microsoft's cloud computing platform. In addition, a support team of Microsoft researchers and developers is working with grant recipients to equip them with a set of common tools, applications and data collections that can be shared with the broad academic community, and also providing expertise in research, science and cloud computing. Broader details on the program and its research tools and software for Windows Azure can be found here.
Increasingly, the important scientific questions lie at the intersections of traditional disciplines, and insights from multidisciplinary collaborations drive innovation, economic development and our response to complex problems and natural disasters. This is one of many reasons I am so pleased with the technical diversity among the list of NSF-Microsoft award recipients.
All of the award recipients were selected via NSF's rigorous peer review process, which emphasized the scientific merit of the proposed work. Projects range from exploring scientific applications of cloud computing to extending cloud computing with new tools and techniques.
Cornell University (Kenneth Birman) - Building Scalable Trust in Cloud Computing
J. Craig Venter Institute (Audrey Tovchigrechko) - Bettering Interactive Protein-Protein Docking
SUNY at Buffalo (Tevfik Kosar) - Enhancing Stork Data Scheduler for Azure
University of California, San Diego (Kenneth Yocum) - Utilizing Continuous Bulk Processing
University of Colorado, Boulder (Richard Han) - Enabling Mobile Cloud Computing
University of Michigan, Ann Arbor (Qiaozhu Mei) - Refining Language Models using Web-scale Language Networks
University of North Carolina at Charlotte (Zhengchang Su) - Predicting Transcription Factor Binding Sites for Genes
Virginia Tech (Wuchun Feng) - Conducting Intensive Biocomputing
Virginia Tech (Kwa-Sur Tam) - Effectively and Widely Using Renewable Energy Sources
The Microsoft/NSF partnership is but one part of a broader international program that the eXtreme Computing Group (XCG) and Microsoft Research are building with the scientific research agencies worldwide. We believe cloud computing, coupled with powerful client tools, can transform the nature of research. This worldwide program currently targets collaborations with several national and international research partnerships, including
National Institute of Informatics of Japan Info-plosion project
European Commission FP7 funding program, Venus-C project
U.K. Engineering and Physical Sciences Research Council (EPSRC) Horizon project
This Microsoft/government partnership is about so much more than access to cloud computing resources. The deep partnership between academic and Microsoft researchers, creation and release of easy-to-use client tools and an exploration of the new world of massive data are all key elements of our shared journey to a new model of data-rich analysis enabled by powerful client tools and cloud services.
Any successful technology ultimately becomes invisible, enabling and empowering without requiring the user to focus on the idiosyncrasies of the technology itself. Technical computing can and should be an invisible intellectual amplifier, as easy to use as any other successful consumer technology. As I wrote last year when we first launched this program, it is really about transforming how we conduct research by broadening researcher capabilities, fostering collaborative research communities, and accelerating scientific discovery by shifting the focus from infrastructure to empowerment.
N.B. I also write for the Communications of the ACM (CACM). The following essay recently appeared on the CACM blog.
The human suffering caused by the recent earthquake, tsunami and nuclear reactor damage in Japan has saddened us all. As we watch the ongoing struggle to cope with the complexities of a multifactor crisis, we can offer technical and logistics assistance where appropriate, extend emotional and social support to our friends and colleagues, and draw insights, as we always do, for possible responses to future disasters. Each such challenge reminds us that the interplay among complex, dynamical systems is subtle and deep, and that chaos, not just in its popular definition, but in its deep mathematical sense, is ever present, lurking in the shadows, aided by its old friend entropy.
As I have read the international newspapers, watched the television reports, listened to the radio interviews and scanned the web sites, I have been struck by the challenges each news organization has faced in explaining technical concepts in intuitive and readily accessible terms. Whether it be explaining logarithmic earthquake scales (e.g., the Richter scale), where a one-unit difference corresponds to an order-of-magnitude change in measured amplitude; radioactivity and half-lives, where exponential decay steadily reduces the quantity of material; or the subtleties of dose equivalent radiation exposure measurements, Sieverts and food life cycles, successful explanations depend on both the knowledge of the reporters and of the audiences.
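The arithmetic behind these explanations is simple enough to sketch, even if the intuition is not. A minimal illustration in Python, where the earthquake magnitudes are invented for the example and iodine-131's roughly eight-day half-life is the one real number used:

```python
def amplitude_ratio(m1, m2):
    """On a logarithmic scale such as Richter's, each whole unit of
    magnitude corresponds to a tenfold change in measured amplitude."""
    return 10 ** (m2 - m1)

def remaining_fraction(elapsed_days, half_life_days):
    """Exponential decay: after each half-life, half the material remains."""
    return 0.5 ** (elapsed_days / half_life_days)

# A magnitude 9.0 quake registers 1000 times the amplitude of a 6.0.
print(amplitude_ratio(6.0, 9.0))      # 1000.0

# Iodine-131 has a half-life of roughly 8 days; after 24 days,
# only one-eighth of the original quantity remains.
print(remaining_fraction(24.0, 8.0))  # 0.125
```

Three units of magnitude, a thousandfold difference in amplitude; three half-lives, an eightfold reduction in material. Neither relationship is intuitive to a reader accustomed to linear scales, which is precisely the reporter's problem.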
All too often, I have heard inaccurate descriptions of scientific processes or hyperbolic assessments of risks, grounded in neither facts nor statistics. To be sure, explaining complex concepts is not easy. This is, perhaps, a teachable moment, where we can highlight the importance of scientific and engineering literacy across all of society, not just among a cadre of technical experts.
Regardless of what we might hope, no engineering structure, whether a nuclear reactor, tsunami sea wall, or earthquake resilient building can be perfectly safe, fully effective or absolutely resistant. Nor can all possible outcomes be anticipated in a natural disaster. One can only plan and assess, then execute accordingly, drawing on the best practices and knowledge currently available, and then learning valuable lessons from each failure.
Our society depends deeply on scientific and engineering advances – including those in computing – for they are embedded in our communication systems, manufacturing processes and supply chains, food production and processing, logistics and transportation, energy production and environmental interactions, and economic mechanisms. Societal understanding of scientific processes and terminology, as well as a shared appreciation for the engineering design balances among costs, functionality and risks, are essential to informed debate and decision making.
In turn, that understanding rests on our continued investment and support for STEM (science, technology, engineering and mathematics) education, from advanced technical degrees to general awareness of science and engineering principles among all our citizens. We must continue to make STEM and computing exciting and accessible, for we live in a technological society, with all its attendant implications.
Meanwhile, I continue to wish my friends and colleagues in Japan the very best. Their struggle is our struggle.
N.B. I also write for the Communications of the ACM (CACM). The following essay recently appeared on the CACM blog.
Many years ago, Fred Brooks relayed a tale about how he chose the first application target domain for his computer graphics research. It was not long after he had left IBM and completed his work on the IBM System/360. He had just moved to Chapel Hill and taken a faculty position at the University of North Carolina.
As Fred tells the story, with a bit of a twinkle in his eye, he went to see one of the senior university administrators. He told the administrator that as a computer scientist, he was in the intelligence amplification business. Who on the campus, Fred wanted to know, might most benefit from having their intelligence amplified?
I have recalled this story many times, always with a smile, as I have reflected on the nature of computing and its power.
Amplification and Universality
Computing systems share many features with other instruments and machines, in amplifying human abilities. However, one aspect distinguishes them – namely, their general utility as an intellectual amplifier. Like a universal Turing machine, which can simulate any other Turing machine with arbitrary inputs, computing is broadly – dare I say universally – applicable to human intellectual endeavors, much as all the variants of the inclined plane and lever are applicable to human physical endeavors.
The English scientist Sir Humphry Davy could well have been speaking about computing when he said, two centuries ago:
Nothing tends so much to the advancement of knowledge as the application of a new instrument. The native intellectual powers of men in different times are not so much the causes of the different success of their labors, as the peculiar nature of the means and artificial resources in their possession.
In a phrase – success accrues to the talented with access to the most effective and powerful tools.
Supercomputing and its applications to science and engineering have been canonical examples of this universal benefit. Powerful new telescopes advance astronomy, but not materials science. Powerful new particle accelerators advance high-energy physics, but not genetics. In contrast, supercomputing advances all of science and engineering because all disciplines benefit from high-resolution model predictions and theoretical validations.
As exciting as those opportunities remain, new ones are emerging in the world of big data.
Big Data: Structured and Unstructured
The tsunami of structured scientific data, produced by a new generation of sensors, and the growth of semi-structured and unstructured data from business, entertainment, social networks and popular culture have created new needs for creative application of our intellectual amplifier. As the recent performance of IBM's Watson system on the game show Jeopardy! illustrated, the combination of large-scale data, rich algorithm suites and powerful computing is opening new vistas. Vannevar Bush's 1940s vision of a Memex, a device capable of storing, indexing and retrieving data from a broad knowledge base, is now within our reach.
It really is about how we use computing as an intellectual amplifier, allowing humans to be more productive and more creative by doing what we do best – asking interesting questions, ones that span multiple disciplines and that illuminate opportunities at their interstices – aided by powerful analytic and computational engines.
The America COMPETES Act outlines target funding levels and priorities for the research and education programs at several federal agencies, including the National Science Foundation and the Department of Energy. As Elizabeth notes, "Robust federal support for breakthrough research conducted throughout the U.S. is critical to fueling the ecosystems of government, industry and universities, allowing the U.S. to make discoveries and turn them into products that improve our nation's ability to compete globally."
After initial crafting, the bill faced a long and arduous process that involved many negotiations and compromises. Hence, final passage of America COMPETES is a holiday event worthy of celebration.
N.B. I also write for the Communications of the ACM (CACM). The following essay recently appeared on the CACM blog.
This year, I again had the honor and privilege to chair the selection committee for the IEEE Seymour Cray and Sidney Fernbach awards, both of which were presented at SC10 in New Orleans. These eponymously named awards recognize truly outstanding contributions to high-performance computing, in honor of two early leaders of our field. The Seymour Cray award recognizes individuals whose innovative contributions to high-performance computing systems best exemplify the creative spirit demonstrated by Seymour Cray. In turn, the Sidney Fernbach award is given for outstanding contributions in the application of high-performance computers using innovative approaches.
It was my pleasure to present the 2010 Seymour Cray award to Dr. Alan Gara, who is an IBM Fellow at the T. J. Watson Research Center. Long before the popular realization that power consumption and reliability would be dominant design considerations at large scale, Al reconceptualized the HPC cluster as a densely packaged system with novel cooling, low power, custom system on a chip (SoCs) and networks and high reliability. The result was the IBM BlueGene system, where Al served as technical project leader and chief system architect. Al's insights and approach in designing the BlueGene system embody the spirit of Seymour Cray's groundbreaking supercomputer designs, which were marvels of integrated, elegant and systemic design.
I was equally pleased to present the 2010 Sidney Fernbach Award to James Demmel, holder of the Dr. Richard Carl Dehmel Distinguished Professorship at the University of California at Berkeley. Jim has long been one of the thought leaders and innovators in the world of parallel numerical software and linear algebra, where his deep theoretical and practical contributions to LAPACK and ScaLAPACK are legion. It is no exaggeration to say that everyone in technical computing is a beneficiary of Jim's work, for his sparse and dense linear system solvers are the foundation technology for a wide array of science and engineering applications. Jim's insights and contributions embody the interplay among mathematics, algorithms, architecture and applications that embody the ideas of Sid Fernbach.
The two awards and the work of the recipients reflect the evolving interplay of technology, software and algorithms in advancing high-performance computing. It is this interplay that has continued to enable innovative scientific research and engineering practice, the vanguard of computational science and engineering.
I am a logophile (λογόφιλος) – a lover of words. More than that, I am a lover of well-crafted, erudite prose that captures and conveys nuance and subtlety. Such prose brings a smile to the lips, both the writer and the reader's reward for dutiful and diligent background research, thoughtful paragraphs that flow as mellifluously as a mountain stream, and phrases that delight with their craftsmanship.
The words matter, for they are the conduit of our ideas and experiences, of our hopes and fears, and of our passions and dreams. They bind us, yet they separate us, for none of us is truly their master.
Between the idea and the reality – the concept and its expression – falls the shadow.
I have a Sunday ritual, practiced with great reverence and solemnity, that cleanses my soul and prepares me to face the vicissitudes of the coming week with all the equanimity I can summon. This ritual involves both a large latte, made with freshly ground beans and frothed milk, and the New York Times. When traveling or in extremis, the Times of London or the Washington Post can be substituted. Under no circumstances, however, will a graven image such as the Podunk Picayune-Gazette be tolerated, for reading it is tantamount to drinking instant coffee, an abomination to the world and all right-thinking individuals.
Armed with a massive latte, I spread the New York Times on the kitchen table. (As a non-New Yorker, I have never mastered the art of newspaper origami.) For me, it begins with the front page and its extended stories, then the Week in Review, the editorials and the op-eds. I dally over the obituaries and the wedding announcements, a voyeur inspecting the lives of others.
I scan the business section, seeking surcease for our economic sorrows, then linger over the travel section, thinking about the places I have been and those I have yet to visit. Finally, I turn to the Sunday Book Review, looking for new books to read, and the Sunday Magazine, wondering which topics will warrant extended stories. (And yes, I still miss William Safire's wit in the On Language column.)
I feast my soul on language, sated, albeit briefly, for the coming week.
Make no mistake, small words have huge effects. Both war and peace have hung on the interpretation of a single word or phrase. In such circumstances, creative ambiguity can be prized, allowing each party to apply their own, convenient interpretation. Often, though, such translingual ambiguity is simply the subject of embarrassment.
I am reminded of a conversation with a friend from graduate school. It was about 1980, and we were discussing the characteristics of dumb terminals (i.e., terminals with limited intelligence and connected by RS-232 connections to a remote computing system). Though not a native English speaker, he was quite proficient, rarely pausing to search for a word or phrase, even when conveying complex ideas.
Gesticulating enthusiastically, my friend opined on the limitations of such devices, expressed his preferences and, as I recall, offered a few pejorative comments on certain vendors' products. It was a technical discussion filled with facts and analyses, similar to countless others held every day by computer scientists and engineers. He finally paused in mid-sentence, nonplussed by my uproarious laughter.
After gathering my composure, I explained that he had been railing about stupid terminals, rather than dumb terminals. Though superficially synonymous, the first was an inadvertent value judgment; the second was a technical term of art. Like all non-native speakers, he had been ensnared by an American idiom, with inevitably amusing results. Both connotation and denotation matter.
Compare and Contrast
On occasion, I read newspaper stories where I have personal knowledge of the events and participants described in the story. Rarely do those stories capture all the details correctly. Even when the facts are correct, the tone, the implications and the nuance are either lost or subtly incorrect. Years ago, I asked a reporter about this, and he readily acknowledged this was the case, noting that it was extraordinarily difficult to capture complexity.
If you have ever read multiple accounts of an international event, written by reporters from different cultures, countries and perspectives, you know this to be true. You may even have wondered if they were reporting the same events. Read thoughtfully, interpolate and extrapolate, seek complexity, seek understanding.
Odds are that almost every adult American remembers a few lines from Thoreau's Walden, "Our life is frittered away by detail. Simplify, simplify, simplify! I say, let your affairs be as two or three, and not a hundred or a thousand; instead of a million count half a dozen, and keep your accounts on your thumbnail." When mixed with a bit of John Donne's Devotions upon Emergent Occasions ("No man is an island"), some musings from Cervantes' Don Quixote, and a dose of Shakespeare's Hamlet ("To be or not to be"), it was a perfect storm of teenaged angst and ennui. (Truth in advertising: I have always preferred Macbeth for tragedy, but I realize that puts me in the minority.)
How, pray tell, do such musings inform technology and its societal implications? Perhaps it is all sound and fury, signifying nothing, but I think not. There are lessons here, if we but reflect and learn.
Multitasking and Thrashing
In an August 24, 2010 New York Times article, "Digital Devices Deprive Brain of Needed Downtime," Matt Richtel noted that when we multitask nearly continuously, we forfeit the downtime historically used to think and remember. He went on to quote a University of Michigan study, which "… found that people learned significantly better after a walk in nature than after a walk in a dense urban environment, suggesting that processing a barrage of information leaves people fatigued."
Last week, I found myself dealing with an important issue at work, dashing to the airport, and then being grateful that the airplane had WiFi service so I could continue working. In the digital age, most of us have multitasked, whether it be reading email at the grocery store (guilty), working on vacation (guilty) or bouncing among multiple electronic activities (guilty).
In that spirit, when did the operating system term multitasking enter the common vernacular to describe concurrent human activities? Perhaps we should make people more aware of another operating system term, the one that describes what happens when the number of tasks exceeds available processing capacity and the time slice assigned to each approaches zero. We call it thrashing. Unfortunately (or fortunately), one cannot get a brain hardware or operating system upgrade.
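The collapse is easy to model. A toy sketch, assuming a fixed scheduling period divided evenly among tasks and a fixed overhead charged for each context switch; both numbers below are invented purely for illustration:

```python
def useful_fraction(num_tasks, period_ms=100.0, switch_cost_ms=1.0):
    """Toy model of time slicing: a fixed period is split evenly among
    tasks, and every context switch costs fixed overhead. As tasks
    multiply, slices shrink toward the switch cost and useful work
    collapses: the system is thrashing."""
    slice_ms = period_ms / num_tasks
    if slice_ms <= switch_cost_ms:
        return 0.0  # the slice no longer even covers the switch overhead
    return (slice_ms - switch_cost_ms) / slice_ms

for n in (1, 10, 50, 100):
    print(n, useful_fraction(n))  # fraction of time spent on real work
```

With one task, nearly all the time is useful work; at fifty tasks, half of it is switching overhead; at one hundred, the slice no longer covers the switch cost at all. The brain, alas, ships with a fixed period and no upgrade path.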
A Different Drummer
I am no technological Luddite or troglodyte. Far from it, I am a frequent blogger, social network denizen and a lover of electronic gadgets. I have always been – and always will be – a passionate proponent of technology and its power to enrich and enable innovation and human discovery. Yet there are times when I think we need to step back, consider the social implications of these technologies and ask when and where they are best used to our advantage.
Had Thoreau had a smartphone, he would not have been texting his best bud, Ralph (Waldo Emerson), about the joys of solitude, nor would he have been tweeting or posting photos of his house construction. I am rather more confident he would have espoused the healing virtues of periodic digital seclusion and contemplation.
'Tis true that no man is an island. Nearly all of us are connected by an electronic web of shared communications, social networks and intelligent devices. However, sometimes being an island of solitude – even for a few minutes – can be a blessing. Sometimes we need to disconnect, contemplate, remember and imagine – reinventing and discovering. This is no quixotic quest; it is what drives discovery.
Simplify, simplify, simplify, indeed.
I'd write more, but I need to catch up on my email.