Let's start with a bit of history. Former NIH Director Harold Varmus charged a committee, co-chaired by Larry Smarr (then at NCSA, now at CalIT2) and David Botstein (Princeton), to explore the cultural barriers preventing greater collaboration between biomedical and computing researchers. Their 1999 report led to the Biomedical Information Science and Technology Initiative (BISTI). All of this was later the impetus for the NIH Roadmap Initiative and its emphasis on multidisciplinary science.
In the meantime, computing and biomedicine continued to evolve and disciplinary funding pressures increased. NIH pay lines declined, and even senior biomedical investigators now struggle to sustain their disciplinary research programs. This may seem counterintuitive, given the doubling of the NIH budget, but the number of biomedical investigators in the U.S. also more than doubled during that period. Concurrently, success rates for computing research proposals fell to under ten percent for some programs, a consequence of the large faculty growth that lagged the dot-com boom. These budget pressures discouraged both biomedical and computing researchers from interdisciplinary collaboration, as both groups have tended to circle the wagons and defend their own.
With this backdrop, last summer, Chris Johnson (University of Utah) and I chaired a joint CRA-NIH workshop, the Computing Research Challenges in Biomedicine, to discuss the structural, cultural, intellectual and financial issues that have limited cross-disciplinary collaboration and application of novel computing technologies to biomedical research. At the outset, Chris, Jill Mesirov (Broad Institute), Lee Hood (Institute for Systems Biology) and I gave brief plenary presentations to set the stage for extended discussion by the workshop attendees, who were a mix of academic researchers (biomedical and computing) and NIH leaders.
As we noted in the workshop report's introduction:
In this context, we use the term “computing” to include the entire range of contemporary approaches in computer science and engineering and information technology. Similar considerations apply to partnerships between computing and the biological domains funded by other agencies, most notably those of NSF, DOE and USDA, which include basic plant, microbial and environmental biology and applications to bioenergy and agronomy.
The workshop’s focus was on identifying action items; there are already numerous reports on computational biology that describe the challenges and opportunities, but the limitations at this interface remain unaddressed. This document summarizes the outcomes of the workshop.
You can read the details in the report, but the recommendations echo standard themes:
- The need to change legacy government and university structures through significant incentives for change and better coordination.
- The need for more interdisciplinary education, collaboration and communication.
- The need for funding and reward mechanisms that encourage, support and realize significant connections between computing and biomedical research.
Many of the discussions highlighted the criticality of increased interaction and communication. Quite a few NSF-funded computing researchers, for example, have never seen or read an NIH proposal, nor do they understand NIH processes. Much as many of us in computing would like to believe that the language of biology is really just an instance of the Chomsky hierarchy, life (literally) is more complex and subtle.
Some NIH-funded researchers have similar myopia about computing research. It’s not just about web mashups, technology toys or the IT group in the hospital basement. There are deep issues about the mathematical limits of computability, the engineering of complex systems and learning processes.
Today, both biomedical and computing researchers are unnecessarily limited by their respective cultures. Both could benefit from mutual education and greater awareness of how much they need one another to ask and answer the big questions. NIH Director Elias Zerhouni and other NIH leaders are working hard to change this, but it will take time and commitment from all of us, biomedical and computing researchers alike. Lee Hood captured the spirit of many workshop discussions when he said, “New ideas require new organizational structures.”
Lee’s Institute for Systems Biology is doing exactly that, bringing together multidisciplinary groups (biologists, computer scientists, engineers and others) to explore biological complexity and system function. It’s a big shift from the historical reductionist approach to biology, and it builds on Lee’s enormous track record; calling him the father of the DNA sequencer does not do justice to his contributions. In my humble opinion, Lee is on the right track; we need more holistic initiatives like this.
(As an aside, Lee remarked to me once that he had been pleased with the quality of his undergraduate science education and that it had prepared him well to tackle complex problems. Of course, he also noted that he’d had undergraduate physics with Dick Feynman, chemistry with Linus Pauling and biology with Max Delbrück. That's a pretty fair undergraduate education indeed.)
At the Renaissance Computing Institute we are attempting to apply these same notions of flexible structures and multidisciplinary approaches to complex societal problems in disaster response, biomedicine and health care, economic development and the arts. By building a virtual organization that spans North Carolina and that integrates expertise from multiple disciplines and across government, industry and academia, I believe we can make a difference. I’m firmly convinced that 21st century solutions to complex problems require dynamic resource allocation, fluid structures and diverse expertise.
Here’s to bridge building and the reemergence of natural philosophy – a holistic approach to scientific discovery that unites disciplines.