A pioneer of biology's golden age, Craig Venter has always done science on his own terms. In a Seed exclusive, the man who sequenced the human genome explains what's holding science back and how he intends to fix it.
By Craig Venter | Posted November 20, 2008
Craig Venter. Photograph by Mark Mahaney.
Science, particularly my field of biology, has changed dramatically over the past 50 years and continues to evolve. A field once dominated by small research groups working largely in isolation is transforming, in part, into enterprises increasingly reminiscent of the efforts in physics that led to the Large Hadron Collider (LHC) and other expensive, personnel- and data-intensive projects.
The organizations and projects I have led during my 37-year career illustrate some of the changes that biology has undergone. But there are significant differences between biology and physics; no single large government program dominates biological science as the LHC dominates physics. Rather, the techniques responsible for the industrialization and digitization of biology, along with new approaches to funding science, are enabling scientists to achieve unprecedented independence and scale in their work. These changes have moved all of us into an age in which more data can be gathered—and, more importantly, grander questions asked and hypotheses discarded or validated—than ever before.
When I obtained my doctoral degree in 1975, science wasn't much different from the way it had been in the 1950s. There were about 150,000 scientists in the US, and I, like some 70 percent of my fellow PhDs, went into academia. Things have changed. For one, there are more than 2.6 million scientists working in America today. More importantly, the essentially binary decision I had to make when I left graduate school has largely evaporated. Where for me and my peers it was a choice between academia and industry, today only about 20 to 30 percent of the more than 7,000 new PhDs in the life sciences will stay in academia. Furthermore, a significant percentage of "academic" biologists at major institutions have at least one foot in a biotech company. One reason could be that US government funding, in constant dollars, has changed little over the past 40 years, whereas industry funding has increased more than tenfold; as a result, federal money for biological research, once more than twice as great as that coming from industry, is now less than half as much.
It has become de rigueur for traditional scientists to undertake substantial application-oriented research. Many find this situation problematic and fear that the proliferation of industry funding has somehow tainted "pure" academic research. But the change has, to a substantial degree, been quite positive for society: it has spawned competition and collaboration among private, public, and academic research groups on projects that can have a direct impact on people's lives. Nevertheless, my own experiences have demonstrated just how hard it can be to change long-ingrained ideas about how science can best be accomplished, and about what shape the science of tomorrow should take.
My time in academia lasted until 1984, when I had the chance to move my entire team to the intramural program at the National Institutes of Health (NIH). I had been a successful extramural grant-funded researcher, but the possibility of being given a large budget for my research, instead of having to chase grants, greatly appealed to me and my team. Here was a chance to do research that looked far beyond the next grant. To my surprise, however, few at the NIH were thinking as I was. While almost every lab in the intramural NIH program was very well funded, much of the research being conducted was average at best. Only a fraction of the scientists there took full advantage of having essentially unlimited funds to take risks and try new things.
At the same time I made this move to the NIH, biology began to change. Automated DNA-related technologies began to appear, and suddenly discussions of sequencing the entire human genome seemed realistic. Still, such an undertaking was hamstrung by limited thinking and the day's crude technology; there was no grand vision and, as yet, no "big science" in genomics.
In 1987 my team took some of the first steps toward the digitization of biology and the industrialization of data acquisition, beginning when we published the first gene sequences obtained with an automated DNA-sequencing machine. It became clear to me that getting more data faster was simple: I only needed a second DNA-sequencing machine. That required more room, so I moved off the main NIH campus to acquire space for an expanding team and more machines. The expressed sequence tag (EST) method I developed allowed us to rapidly find genes using mRNA—the messenger molecules copied from expressed DNA—taken directly from cells. Each additional sequencing machine produced a direct linear increase in the number of genes discovered. I used my expanding NIH budget to buy the biggest and latest computers to handle the ever-increasing data flow, and hired software engineers and computational experts to develop software for interpreting the new sequence data.
My team's success prompted two important developments. First, a major polemic-driven battle broke out when the NIH insisted on filing patents on the sequences we were discovering. Even bigger was the response from my academic colleagues to my EST method: they hated it. To them it threatened both the budget for the Human Genome Project, which was just getting under way at the NIH in 1990, and the way science was being conducted. Senior scientists at major academic centers were upset that the EST and genomic approaches could discover—at random—genes they had been seeking for decades. As with any rapid change or industrialization, an apparently innate conservatism led many to fight the new approach. At the same time, funding for genomic research like mine was being opposed because of its big-science, industrial character: every dollar that went to my big group meant less for all the smaller ones.
While these battles were going on, I received multiple offers from biotech and pharmaceutical companies to move my team to join them, as well as offers from venture capitalists looking to start new companies. I didn't like the strings attached in either case, so I proposed a new model of research funding: the investors would form a new biotech company that would, in turn, fund my independently established and managed not-for-profit research institute. The biotech company would provide long-term funding in exchange for rights to the intellectual property the institute developed. The result was the founding of Human Genome Sciences Inc. (HGS), which in 1992 made a $70 million, ten-year commitment to an organization I established called The Institute for Genomic Research (TIGR). The world's first large-scale DNA-sequencing facility was now funded. With TIGR—and with the efforts of Daniel Cohen and Jean Weissenbach in France in building Généthon, an industrial-scale yeast artificial chromosome mapping facility—biology began to enter the realm of "big science."
Bigger Faster Better
Originally appeared in Seed 19