Oh Goody -- Prizes for Genomes!

But seriously folks...it's good news that prizes are being posted for biological technologies.  A couple of weeks ago, the X Prize Foundation announced a $10 million prize for demonstration of "technology that can successfully map 100 human genomes in 10 days."  This is not the first such offer; Nicholas Wade notes in the New York Times that Craig Venter set up a $500,000 prize in 2003 for achieving the Thousand Dollar Genome.  Venter is now on the board of the X Prize Foundation and it appears his original prize has been expanded into the subject of the current announcement.  We definitely need new ways to fund development of biological technologies.

Here's more coverage, by Antonio Regalado in the Wall Street Journal.  It will be interesting to see if anyone can come up with a way to make a profit on the $10 million prize.

The prize requires sequencing roughly 500 billion bases in 10 days.  It isn't possible to compare the prize specs directly with my published numbers, since there is no specification of the number of people involved in the project.  If you throw a million lab monkeys running a million low-tech sequencers at the problem, you're set.  Except, of course, for all the repeats, inversions, and rearrangements that require expertise to map and sort out.

According to a news story by Erika Check in Nature, the performance numbers cited by 454 Life Sciences appear to be encouraging: "Using the 454 technique, one person using one machine could easily sequence the 3 billion base pairs in the human genome in a hundred days, [Founder and CEO Jonathan Rothberg] says," which works out to roughly 30 million bases per person per day.  And he is optimistic about progress in reducing costs:  "As the process gets faster, it gets less expensive. 'It's clear that we'll be able to do this much cheaper,' predicts Rothberg, who says that in the next few years scientists will be able to assemble a human genome for US$10,000."  At the present pace of improvement, this looks to be about 2015, though new technology could always get there sooner.
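For transparency, here is the back-of-the-envelope arithmetic behind those throughput comparisons, written out as a minimal sketch.  The coverage factor, the team size, and exactly how the prize rules will count bases are all unspecified, so the numbers below are assumptions for illustration only.

```python
# Rough throughput arithmetic: the prize pace versus the 454 figures quoted above.
# Coverage, team size, and the exact base count the rules will demand are assumptions.

GENOME_BP = 3e9        # haploid human genome; the ~500 billion total above presumably
                       # counts both chromosome copies and/or some sequencing redundancy
PRIZE_GENOMES = 100
PRIZE_DAYS = 10

prize_pace = GENOME_BP * PRIZE_GENOMES / PRIZE_DAYS   # finished bases per day
rothberg_pace = GENOME_BP / 100                       # one person, one machine, 100 days

print(f"Prize pace: {prize_pace:.1e} bases per day")                # ~3e10
print(f"454 pace:   {rothberg_pace:.1e} bases per person per day")  # ~3e7
print(f"Gap:        ~{prize_pace / rothberg_pace:.0f}x, before parallelization")
```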

There seems to be some divergence of expert opinion about where a winning technology will come from.  Writing in Science, Elizabeth Pennisi notes:

Charles Cantor, chief scientific officer of SEQUENOM Inc. in San Diego, California, predicts only groups already versed in sequencing DNA will have a chance at the prize. Others disagree. "I think it is unlikely" that the winner will come from the genome-sequencing community, says Leroy Hood, who invented the first automated DNA sequencer. And Venter predicts that the chance that someone will come out of the woodwork to scoop up the $10 million is "close to 100%." The starting gun has sounded. 

Indeed.  I had sworn off thinking about new sequencing technologies, but the prize has got even me to thinking...

Vaccine Development as Foreign Policy

I was fortunate to attend Sci Foo Camp last month, run by O'Reilly and Nature, at the Googleplex in Santa Clara.  The camp was full of remarkable people; I definitely felt like a small fish.  (I have a brief contribution to the Nature Podcast from Sci Foo; text, mp3.)  There were a great many big, new ideas floating around during the weekend.  Alas, because the meeting was held under the Chatham House Rule, I cannot share all the cool conversations I had.

However, at the airport on the way to San Jose I bumped into Greg Bear, who also attended Sci Foo, and our chat reminded me of an idea I've been meaning to write about.

In an essay published last year, Synthetic Biology 1.0, I touched briefly on the economic costs of disease as a motivation for developing cheaper drugs.  Building synthetic biological systems to produce those drugs is an excellent example of the potential rewards of improved biological technologies.

But a drug is a response to disease, whereas vaccines are widely recognized as "the most effective medical intervention" for preventing disease and reducing the cost and impacts of pathogens.  While an inexpensive drug for a disease like malaria would, of course, be a boon to affected countries, drugs do not provide lasting protection.  Immunization, in contrast, requires far less ongoing contact with the population to suppress a disease.  Inexpensive and effective vaccines, therefore, would provide even greater human and economic benefit.

How much benefit?  It is extremely hard to measure this sort of thing, because to calculate the economic effect of a disease on any given country you have to find a similar country free of the disease to use as a control.  A report released in 2000 by Harvard and the WHO found that, "malaria slows economic growth in Africa by up to 1.3% each year."  The cumulative effect of that hit to GDP growth is mind-blowing:

...Sub-Saharan Africa's GDP would be up to 32% greater this year if malaria had been eliminated 35 years ago. This would represent up to $100 billion added to sub-Saharan Africa's current GDP of $300 billion. This extra $100 billion would be, by comparison, nearly five times greater than all development aid provided to Africa last year.

The last sentence tells us all we need to know about the value of a malaria vaccine; it could advance the state of the population and economy so far as to swamp the effects of existing foreign aid.  And it would provide a lasting improvement to be built upon by future generations of healthy children.
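To make the compounding explicit, here is a minimal sketch of how a sustained hit to annual growth adds up over 35 years.  The penalty values are assumptions chosen for illustration; the Harvard/WHO report's own model certainly includes factors this toy calculation ignores.

```python
# Toy compounding illustration of a sustained growth penalty over 35 years.
# The penalty values below are illustrative assumptions, not figures from the report.

current_gdp = 300e9   # sub-Saharan Africa's current GDP, as cited above (dollars)
years = 35

for penalty in (0.005, 0.008, 0.013):
    counterfactual = current_gdp * (1 + penalty) ** years
    gain = counterfactual - current_gdp
    print(f"{penalty:.1%}/yr over {years} yrs -> ~${gain/1e9:.0f}B forgone "
          f"({counterfactual / current_gdp - 1:.0%} of current GDP)")

# A penalty of roughly 0.8%/yr compounds to something like the report's "up to 32%"
# and "$100 billion"; the full 1.3%/yr upper bound compounds to considerably more.
```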

The economic valuation of vaccines is fraught with uncertainty, but Rappuoli, et al., suggest in Science that if "policymakers were to include in the calculation the appropriate factors for avoiding disease altogether, the value currently attributed to vaccines would be seen to underestimate their contribution by a factor of 10 to 100."  This is, admittedly, a big uncertainty, but it all lies on the side of underestimation.  And the point is that some $20 Billion is spent annually on aid, a fraction of which might be better directed toward western vaccine manufacturers to produce long-term solutions.

Vaccine incentives are usually discussed in terms of guaranteeing a certain purchase volume (PDF warning for a long paper here discussing the relevant economics).  But I wonder if we shouldn't re-think government-sponsored prizes.  This strategy was recently used in the private sector to great effect and publicity with the X Prize, and its success has led to consideration of other applications of the prize incentive structure.

Alas, this isn't generally considered the best way to incentivize vaccine manufacturers.  The Wikipedia entry for "Vaccine" makes only passing reference to prizes for vaccine development.  A 2001 paper in the Bulletin of the World Health Organization, for which a number of experts and pharmaceutical companies were interviewed about ways to improve AIDS vaccine development, concluded, "It was felt that a prize for the development of an AIDS vaccine would have little impact. Pharmaceutical firms were in business to develop and sell products, not to win prizes."

But perhaps the problem is not that prizes are the wrong way to entice Big Pharma, but rather that Big Pharma may not be the right way to develop vaccines.  Perhaps we should find a way to encourage a business model that aims to produce a working, safe vaccine at a cost that maximizes profit given the prize value.

So how much would developing a vaccine cost?  According to a recent short article in Nature, funds devoted to developing a malaria vaccine amounted to a measly $65 million in 2003.  The authors go on to note that, "At current levels, however, if a candidate in phase II clinical trials demonstrated sufficient efficacy, there would be insufficient funding available to proceed to phase III trials."

It may be that The Gates Foundation, a major funder of the malaria work, would step in to provide sufficient funds, but this dependency doesn't strike me as a viable long-term strategy for developing vaccines.  (The Gates Foundation may not be around forever, but we can be certain that infectious disease will.)  Instead, governments, and perhaps large foundations like the Gates Foundation, should set aside funds to be paid as a prize.  What size prize?  Of the ~$1-1.5 Billion it supposedly costs to develop a new drug, ~$250 million goes to marketing.  Eliminating the need for marketing with a prize value of $1.5 Billion would provide a reasonable one-time windfall, with continued sales providing more profit down the road.

Setting aside as much as $200 million a year would be a small fraction of the U.S. foreign aid budget and would rapidly accumulate into a large cash payout.  Alternatively, we could set it up as a yearly payment to the winning organization.  Spread the $200 million over multiple governments (Europe, Japan, perhaps China), and suddenly it doesn't look so expensive.  In any event, we're talking about a big payoff in both saving lives and improving general quality of life, so a sizable prize is warranted.  I expect $2 Billion is probably the minimum to get international collaborations to seriously compete for the prize.
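As a sanity check on those numbers, here is the arithmetic for the set-aside, sketched out briefly.  The four-way split among contributors is purely hypothetical.

```python
# How the annual set-aside discussed above accumulates into a prize pool.
# The $200M/yr figure and the ~$2B target come from the text; the split is hypothetical.

annual_total = 200e6   # total set aside per year, all contributors combined
target = 2e9           # suggested minimum prize
contributors = 4       # e.g., US, Europe, Japan, China (illustrative only)

years_to_target = target / annual_total
per_contributor = annual_total / contributors

print(f"${annual_total / 1e6:.0f}M per year reaches ${target / 1e9:.0f}B in "
      f"{years_to_target:.0f} years")
print(f"Split {contributors} ways, each contributor pays ${per_contributor / 1e6:.0f}M per year")
# -> 10 years, at $50M per contributor per year
```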

The foreign policy aspects of this strategy fit perfectly with the goals of the U.S. Department of State to improve national security by reducing poverty abroad.  Here is Gen. Colin Powell, reprinted from Foreign Policy Magazine in 2005 ("No Country Left Behind"):

We see development, democracy, and security as inextricably linked. We recognize that poverty alleviation cannot succeed without sustained economic growth, which requires that policymakers take seriously the challenge of good governance. At the same time, new and often fragile democracies cannot be reliably sustained, and democratic values cannot be spread further, unless we work hard and wisely at economic development. And no nation, no matter how powerful, can assure the safety of its people as long as economic desperation and injustice can mingle with tyranny and fanaticism.

Development is not a "soft" policy issue, but a core national security issue. [emphasis added]  Although we see a link between terrorism and poverty, we do not believe that poverty directly causes terrorism. Few terrorists are poor. The leaders of the September 11 group were all well-educated men, far from the bottom rungs of their societies. Poverty breeds frustration and resentment, which ideological entrepreneurs can turn into support for--or acquiescence to--terrorism, particularly in those countries in which poverty is coupled with a lack of political rights and basic freedoms.

Dr. Condoleezza Rice, in opening remarks to the Senate Foreign Relations Committee (PDF warning) during her confirmation hearings, plainly stated, "...We will strengthen the community of democracies to fight the threats to our common security and alleviate the hopelessness that feeds terror."

Over any time period you might care to examine, it will probably cost vastly less to produce a working malaria vaccine than to continue dribbling out foreign aid.  Even just promoting the prize would bolster the U.S. image abroad in exactly those countries where we are hurting the most, and successful development would have profound consequences for national security through the elimination of human suffering.  Seems like a good bargain.  The longer we wait, the worse it gets.

Bedroom Biology in The Economist

I have yet to see the print version, but evidently I make an appearance in tomorrow's Economist in a Special Report on Synthetic Biology.  (Thanks for the heads-up, Bill.)  I wasn't actually interviewed for the piece, but I've no objections to the text.  There is an accompanying piece that forecasts the coming "Bedroom Biotech", a phrase they seem to prefer to "Garage Biology".  Personally, I prefer to keep my DNA bashing to the garage rather than the bedroom.  Well, okay, most but not all of my DNA bashing.

The story contains a figure showing data from 2002 on productivity changes in DNA sequencing and synthesis, redrawn from my 2003 paper, "The Pace and Proliferation of Biological Technologies", labeling them "Carlson Curves" once again.  Oh well.  The original paper was published in the journal Biosecurity and Bioterrorism (PDF from TMSI, html version at Kurzweilai.net).  It isn't so much that I disavow the name "Carlson Curve" as I want to assert that quantitatively predicting the course of biological technologies is a questionable thing to do.  As Moore made clear in his paper, what became his law is driven by the financing of expensive chip fabs -- banks require a certain payment schedule before they will loan another billion dollars for a new fab -- whereas biology is cheap and progress is much more likely to be governed by basic science and the total number of people participating in the endeavor.

Newer versions of figures from the 2003 paper, as well as additional metrics of progress in biological technologies, will be available in December with the release of "Genome Synthesis & Design Futures: Implications for the US Economy", written with my colleagues at Bio Economic Research Associates (bio-era), and funded by bio-era and the Department of Energy.

To close the circle, I should explain that the "Carlson Curves" were an attempt to figure out how fast biology is changing, an effort prompted by an essay I wrote for the inaugural Shell/Economist Writing Prize, "The World in 2050."  (Here is a PDF of the original essay, which was published in 2001 as "Open Source Biology and its Impact on Industry.")  I received a silver prize, rather than gold, and was always slightly miffed that The Economist only published the first place essay, but I suppose I can't complain about the outcome. 

The Impact of Biofuel Production on Water Supplies

In an earlier post I mentioned briefly that I am concerned that plans to grow crops for domestic biofuel production do not adequately consider how much water the project will require.  I am all for domestic production of biofuels, and have a small project going to examine the possibilities.  But in my experience the people who have already launched businesses to this end, and the venture capitalists who funded them, all evince surprise at the notion that water should be part of the engineering model for fuel production.

It seems I'm not the only one thinking along these lines, as Reuters today is reporting that, "biofuels could worsen water shortages".  The International Water Management Institute has just released a report that claims, "Conquering hunger and coping with an estimated 3 billion extra people by 2050 will result in an 80 percent increase in water use for agriculture on rainfed and irrigated lands."

The Western US is already stretched for water supplies; we mine aquifers for water faster than it can be replaced, and declining yearly snow packs are producing drought conditions in cities accustomed to profligate summer water usage.  Some improvement could be made in the way we transport and use water, by switching to drip irrigation and lining canals and irrigation ditches to prevent leakage, for example.  But, given the yields from soy or canola, producing sufficient plant matter to replace any significant fraction of petroleum fuels with biofuels could easily require as much water as we already use to grow food crops.  I'm not nearly as bullish on algae for biodiesel now, although we might still figure out how to make it work.

I don't see any sign of the IWMI report online yet, and I quail at reading something compiled by 700 people.  But I will probably have a look when it is available.  This is exactly the sort of thing we have to figure out if we are to produce carbon neutral biofuels at scale.

Comments on Mail Ordering Smallpox Genes

I've been debating whether to respond to James Randerson's recent front page story in The Guardian, "Revealed: the lax laws that could allow assembly of deadly virus DNA", about mail ordering genes for smallpox.  The bottom line is that the story as published is neither well-reported nor a particularly useful contribution to the discussion about emerging biological threats.

Years ago, I was fortunate to take a science writing class from the great science and war correspondent Malcolm Browne, who for many years provided exceptional science reporting at The New York Times.  Among his suggestions for an ideal (!) newspaper story is that it be no longer than a Haiku.  Of course, this makes all articles published in the history of the press less than ideal.  (No news there.)  Here is my version of the Guardian article:

Humans play with fire!
Newspaper sales are lagging!
Set our hair alight!

Alas, I've ignored most of the stylistic requirements for a Haiku (no mention of a season, or of nature), and the exclamation points are unforgivable.  Still, it captures the essence of Mr. Randerson's story.

Although the article does make one, albeit brief, nod to, "Legitimate reasons for researchers to buy lengths of DNA from pathogens, for example in developing treatments or vaccines against them," the majority of the text is simply alarmist and a rehash of arguments that have appeared previously (The New York Times, Wired, Technology Review; the list goes on).

The worst bit, from my perspective, is that Mr. Randerson promulgates the specious notion that producing a live, infectious 1918 pandemic influenza virus is as easy as ordering out the DNA from a gullible company.  I've written about this before, and refer readers to those posts (here, and here).  This isn't quibbling on my part.  The capabilities of the technology are central to evaluating the immediacy of the threat.

The Guardian article spends many inches (not an Internet concept, those newsprint inches) announcing the need for regulation without even mentioning the potential detrimental effects of limiting access to the technology.  Because the threat is not imminent, instituting regulations would serve only to reduce our capacity to learn who is employing the technology, and thus reduce our capacity to respond to any threats that do arise.  Again, arguments I have made extensively elsewhere (in Wired, at Future Brief, and in Biosecurity and Bioterrorism (via Kurzweilai.net), for example).

The short version of why regulation is bad is this: Because it is not physically possible to control access to the reagents or instrumentation used in DNA synthesis, our only defense in this situation is to keep track, as best we can, of who is doing what.  Our sole weapon is information, in other words.  The only thing regulation will do is cause people to be more secretive, whether they have a nefarious or an innocuous intent.  That is, regulation will restrict our ("we" being the good guys, of course) access to information.  Moreover, regulation in the U.K. and/or the U.S. will only limit activity in those countries.  You can order synthetic genes from any number of countries these days.

In a companion article, "Lax laws, virus DNA and potential for terror", Mr. Randerson introduces his readers to Synthetic Biology:

Edward Hammond, a biological weapons expert with the Sunshine Project, an NGO that campaigns against the development of biological weapons, said: "The most worrisome thing ... is that [the field of synthetic biology] is going to enable people to create potentially very dangerous diseases that don't otherwise exist or to recreate ones that have been wiped off the face of the earth."

Mr. Randerson makes no effort to explain that you don't need synthetic methods to create new, potentially dangerous organisms.  (Harder to sell newspapers if you don't stoke the fires, after all.)  Breeding and artificial selection can produce pathogens for you, and these tried and true techniques will do a much better job of it.  And if you want a nasty bug ready-made, you just need to visit a poultry farm here in the US, where due to all those fantastic "growth hormones" a soil sample will provide you with Cipro-resistant anthrax. 

Throughout the article, I was perplexed that no mention was made of Drew Endy's efforts to synthesize novel viruses for the sake of learning how they work.  In other correspondence with Drew, I learned that he had been approached by Mr. Randerson, but was so troubled by the very idea of the article and project that he declined to participate or be interviewed.  Here (PDF warning) is a log of their email exchange.

The most remarkable thing about the email is that it demonstrates Mr. Randerson is hell-bent on doing exactly what he warns against, namely letting loose in the world a sequence from a deadly pathogen that has been extinct in the wild for quite some time.  It doesn't matter that he introduced three small changes rendering the gene supposedly incapable of being used to produce a protein.  Those changes would be trivial for any college, and perhaps high school, student to remove (laborious, perhaps, but trivial), thus restoring the functionality of the smallpox gene.

By my reading, Randerson's correspondence with Drew clearly shows The Guardian reporter hasn't thought about the bigger context.  He had his teeth in a story and confined himself to his own ill-informed conclusions rather than carefully exploring what it will take to keep us safe from emerging threats.  He simply didn't do his homework.

I hope The Guardian can do better in the future.

(Not Quite Live From) Synthetic Biology 2.0, Part V :: Fin

First off, here is the link for Synthetic Biology 3.0, next year's meeting in Switzerland.

This year's meeting was impressive on many counts.  As I have noted already (Part II), there was a distinct change in the flavor of the presentations.  The first day started out with a Nobel Laureate, followed by a potential (probable?) future Laureate.  There was a significant amount of money in the room, from corporate representatives of synthesis companies to venture capitalist Vinod Khosla.  With respect to the technical presentations, the sheer diversity of systems and applications compared to two years ago was remarkable.  People are playing with more organisms and more parts (here's the meeting agenda).  The number of genes combined in several of the talks was itself impressive.  Medical applications are clearly coming down the pike.

Yet I found something lacking.  As in 2004, there was no mention this year of a critical set of tools required in any engineering field.  It may not be sexy, but test and measurement gear is what allows rapid comparison of prediction and experimental outcome.  Without sophisticated test gear, you have no Pentium, no 777, no Honda Element, no SpaceShipOne.  At the moment, while each experiment presented at SB 2.0 may be technically beautiful and impressive, they are primarily one-offs.  There is no common signal, and there is no common way to compare experiments in different organisms.  This will eventually be addressed through some sort of standardization, such as is being attempted with BioBricks.  Yet I have always found the common signal in the BioBricks standard to be confusing.  I forget what it was called originally, but now the input and output relationships of the parts are defined in Polymerases Per Second, or POPS, the number of polymerases running into, or out of, a genetic element in a second.

As I write this, I finally realize why I don't like POPS.  As Drew Endy describes it, POPS is a way to allow abstraction from the level of genes and specific proteins up to devices with a common reference.  I understand this story, and it makes sense to me given the constraints of the biological parts we have to work with.  But here's the thing: measuring POPS is presently exceptionally hard.  You can test each part in a framework that allows the measurement of POPS, probably using a fluorescent protein as an output signal, which is only vaguely quantitative.  It is also not a direct measure of POPS, as there is at least one layer of function between the number of RNA polymerases running down DNA and the number of proteins that get translated from RNA.  But it gets worse; how do you troubleshoot the entire circuit?  Where do you stick the multimeter probes on the fly to see why your circuit isn't behaving as expected?  You don't.  Instead, you resort to microarrays to check RNA expression levels or you use protein assays.  Until there is a magic "POPSometer", there won't be any way to examine a circuit in real time.  Fluorescent proteins will never adequately fill this role, 1) because of the time required to fold and produce a fluorescent signal and 2) because you have to build a new circuit every time you want to stick the test probe in a new spot.
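To make the point concrete, here is a toy steady-state model of why a fluorescent protein is only an indirect proxy for POPS.  The lumped transcription-translation-maturation structure and all of the rate constants are my own assumptions for illustration; nothing here is part of the BioBricks specification.

```python
# Toy steady-state model (an illustrative sketch, not part of any BioBricks standard) of why
# fluorescence is an indirect readout of POPS: mRNA turnover, translation, fluorophore
# maturation, and protein dilution all sit between the polymerase flux and the signal.

def steady_state_fluorescence(pops, k_tl=10.0, gamma_m=0.2, gamma_p=0.02, k_mat=0.05):
    """Mature-fluorophore level for a given POPS input (all rate constants are assumed).

    pops    : RNA polymerases passing the part per second (the quantity we actually want)
    k_tl    : proteins translated per mRNA per second
    gamma_m : mRNA degradation/dilution rate, 1/s
    gamma_p : protein degradation/dilution rate, 1/s
    k_mat   : fluorophore maturation rate, 1/s
    """
    mrna = pops / gamma_m                    # steady-state mRNA copies
    dark = k_tl * mrna / (k_mat + gamma_p)   # immature, non-fluorescent protein
    bright = k_mat * dark / gamma_p          # mature, fluorescent protein
    return bright

# The same POPS maps to a very different signal if any lumped rate constant shifts,
# which is exactly why a fluorescent reporter is not a "POPSometer".
for pops in (0.01, 0.05, 0.10):
    print(pops, round(steady_state_fluorescence(pops)))
```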

Moreover, tools presently in use provide the illusion to the uninitiated that the physical infrastructure of synthetic biology is already well developed.  It is fairly straightforward to get single cell fluorescence or behavior data at this point, but you have to presume the organism is running the program you wrote.  Separately, it is easy to sequence large amounts of DNA, generally purified from many individuals.  But you can't yet sequence a given bug behaving in a given way to make sure it is following the DNA you put into it.  And readily available sequencing technologies average over variation present in a population that may be critical to understanding function.

This technological mismatch extends to discussions of security.  We heard descriptions of various programs to monitor DNA synthesis efforts, which would tie into a surveillance network using a microbial background signal for the environment.  The latter would serve as a reference for efforts to detect novel, and perhaps threatening, organisms in real time.  But there isn't yet any technology that can provide that sort of environmental information, nor will one be available in short order from what I have seen.

In summary, we are still at the beginning of a very long road.  Before chemical engineering came synthetic chemistry, and before biological engineering will come synthetic biology.  I just wish the community had better perspective on how far we have to go.

Synthetic Biology 2.0, Part IV :: What's in a name?

The last session at Synthetic Biology 2.0 was full of hand-wringing about the very name of the thing.  "Synthetic" seems to conjure up too many bogeymen for the likes of many attendees.  The arguments against the name were all centered around the fact that "synthetic" is un-PC these days.  Never mind that we live in a world consisting entirely of synthetic food, clothes, houses, computers, solar panels, windmills, and liquid fuels.  Synthetic is just bad, evidently.

This debate is essentially about politics.  It seems the new field is scaring people just by its name.  So perhaps we should choose a new name in order to finesse the acceptance of the science and technology?  After all, why fight more battles than you need to?

Okay, fine.  Go ahead and try to rename it.  I'll just watch this time, thanks.  Besides, I think the present name is both appropriate and inevitable, but more on that in a moment.  We started with a different name, once upon a time, and that one didn't go over so very well either.  In 2000, while trying to describe the way biology was about to change (here is the PDF), or at least the way Drew Endy and I were conceiving of a new biological engineering, I floated the phrase "Intentional Biology."  The text on that web page was last modified in late 2000, but the story is basically the same today.  Through predictive design, biological systems should be both easier to understand and more useful.  These engineered systems would behave as intended, rather than displaying random and mystifying behaviors often encountered when genetically modified organisms are introduced into new environments or set loose in the wild; i.e., unintended behaviors.  Roger Brent, Drew, and I even organized a meeting to figure out how to make this happen.  "After the Genome 6, Achieving an Intentional Biology" was held in Tucson, AZ, in December of 2000.  Alas, that name had unintended consequences, namely that the biologists attending the meeting thought we were asserting that all prior molecular biology had been unintentional.  If rotten vegetables had been available, I'd have been pelted during my talk.

Not the best start.  Can't win them all.  A good lesson, too.

Fast forward to mid 2001 or so, when Drew and I are at a cocktail party in San Francisco thrown in celebration of the opening of the new local office for Nature.  We wind up in a conversation with Carlos Bustamante, who regales us with the origin of the field of Synthetic Chemistry, and how this gives us the name for Synthetic Biology.  Drew and I are convinced.  But, of course, it wasn't up to Drew and me to name a new field.  We were simply looking for a name to distinguish what we wanted to do from how things had been done previously.  The phrase "Synthetic Biology" certainly isn't new, and was emerging from other sources at the same time (Steven Benner, in particular, if memory serves).

Drew has flirted with other names in the last 5 years, among them "constructive biology" and "natural engineering".  Craig Venter insists on calling it Synthetic Genomics.  Frankly, these aren't any more compelling to me than Synthetic Biology, and they also seem to require even more explanation.  At this point, I don't really care what it is called.  The work is going to happen regardless, and there is no way to turn back.  The name is only a lightning rod for criticism because, as Oliver Morton and others have pointed out, the community keeps drawing attention to itself and all the bad things it might facilitate.  But where is the good news?  I have tried in this space to point out the connections between Synthetic Biology and vaccines, and the possibility that Synthetic Biology might be our best hope to beat a pandemic, but it appears most people want to focus on the negative aspects of rapid and distributed DNA synthesis.  The recent SB 2.0 meeting started with a focus on biological production of energy, another excellent beneficial application, but any subsequent optimism was lost by the third day.

Now onto why the name is inevitable.  What we are doing has been called Synthetic Biology for almost a century.  Here is some text from my book:


Ch 4: The Second Coming of Synthetic Biology

"I must tell you that I can prepare urea without requiring a kidney of an animal, either man or dog.” With these words, in 1828 Friedrich Wohler announced he had irreversibly changed the world.  In a letter to his former teacher Joens Jacob Berzelius, Wohler wrote that he had witnessed, “The great tragedy of science, the slaying of a beautiful hypothesis by an ugly fact.”  The beautiful idea to which he referred was vitalism, the notion that organic matter, exemplified in this case by urea, was animated and created by a vital force and that it could not be synthesized from inorganic components.  The ugly fact was a dish of urea crystals on his laboratory bench, produced by heating inorganic salts.  Thus was born the field of synthetic organic chemistry.

Around the dawn of the 19th century, chemistry was in revolution right along with the rest of the western world.  The study of chemical transformation, then still known as alchemy, was undergoing systematic quantification.  Rather than rely on vague and mysterious incantations, scientists such as Antoine Lavoisier wanted to create what historian of science and technology Bruce Hevly calls an “objective vocabulary” for chemistry.  Through careful measurement, a set of clear rules governing the synthesis of inorganic, non-living materials gradually emerged.

In contrast, in the early 1800s the study of organic molecules was primarily concerned with understanding how molecules already in existence were put together.  It was a study of chemical compositions and reactions.  Unlike the broader field of chemistry taking shape from alchemy, making new organic things was of lesser concern because it was thought by many that organic molecules were beyond synthesis.  Then, in 1828, Wohler synthesized urea.  Suddenly, with one experiment, the way scientists did organic chemistry changed. The ability to assemble organic molecules from inorganic components altered the way people viewed a large fraction of the natural world because they could conceive of building much of it from simpler pieces.  Building something from scratch, or modifying an existing system, requires understanding more details about the system than simply looking at it, poking it, and describing how it behaves.  This new approach to chemistry helped open the door to the world we live in today.  Products of synthetic organic chemistry dominate our environment, and the design of those products is possible only because understanding the process of novel assembly revealed new principles.

It was this step of moving to Synthetic Chemistry, and then to an engineering of chemistry, which radically changed the way people understood chemistry.  Chemists had to learn rules that weren’t apparent before.  In the same way that Chemical Engineering changed our understanding of nature, as we begin engineering biological systems we will learn considerably more about the way biological pieces work together.  Challenges will arise that aren’t obvious just from watching things happen.  With time, we will understand and address those challenges, and our use of biology will change dramatically in the process.  The analogy at this point should be clear; we are well on our way to developing Synthetic Biology. [Auth. note: Clear if you've read the first three chapters of the book, anyway.]

Before going further, it is worth noting that this is not the original incarnation of the phrase “synthetic biology”.  Whatever the reception this time around, the first time it was a flop.  In her history of the modern science of biology, Making Sense of Life, Evelyn Fox Keller recounts efforts at the turn of the 20th Century to discover the secret of life through construction of artificial, and synthetic, living systems; “To many authors writing in the early part of the [20th] century, the [path] seemed obvious: the question of what life is was to be answered not by induction but by production, not by analysis but by synthesis.”(Keller, p.18)  This offshoot of experimental biology reached its pinnacle, or nadir, depending on your point of view, in attempts by Stéphane Leduc to assemble purely physical and chemical systems that demonstrated behaviors reminiscent of biology.  As part of his program to demonstrate “the essential character of the living being”(ibid, p.28) at both the sub-cellular and cellular level, Leduc constructed chemical systems that he claimed displayed mitotic division, growth, development, and even cellular motility.  He described these patterns and forms in terms of the well-understood physical phenomena of diffusion and osmotic pressure.  It is important to note that these efforts to synthesize life-like forms relied as much on experiment as upon theory developed to describe the relevant physics and chemistry.  That is, this was a specific program to use physical principles to explain biological phenomena.  These efforts were described in a review paper at the time as “La Biologie synthetique”(ibid, p.31-32).

While the initial reception to this work was somewhat favorable, Leduc’s grandiose claims about the implications of his work, and a growing general appreciation for complicated biological mechanisms determined through experiments with living systems, led to something of a backlash against the approach of understanding biology through construction.  By 1913, one reviewer wrote, “The interpretations of M. Leduc are so fantastic…that it is impossible to take them seriously”(ibid, p.31).  Keller chronicles this episode within the broader historical debate over the role of construction and theory in biology.  The folks in the synthetic camp, along with those engaged in related efforts to build mathematical descriptions of biology, particularly in the area of growth and development, were poorly regarded by their peers.  Perhaps inspired by the contemporaneous advances in physics, it seems that the mathematical biologists and the synthetic biologists of the day pushed the interpretation of their work further than was warranted by available data.

In response to what he viewed as theory run rampant, Charles Davenport suggested in 1934 that, “What we require at the present time is more measurement and less theory…There is an unfortunate confusion at the present time between quantitative biology and bio-mathematics…Until quantitative measurement has provided us with more facts of biology, I prefer the former science to the latter”(ibid, p.86).  I think these remarks are still valid today.  Leduc, and the approach he espoused, failed because real biological parts are more complex, and obey different rules, than his simple chemical systems, however beautiful they were.  And it is quite clear that vast forests have been felled to publish theoretical papers that have little to do with the biology we see out the window.  But theory, drawn from physics, chemistry, and engineering, does have a role to play in describing biological systems.  Resistance to the tools of theory has been, in part, cultural.  There has always been a certain tension in biology over the utility of mathematical and physical approaches to the subject;

To put it simply, one could say that biologists do not accept the Kantian view of mathematics (or, rather, mathematization) as the measure of a true science; indeed, they have often actively and vociferously repudiated any such criterion.  Nor have practicing biologists shown much enthusiasm for the use of mathematics as a heuristic guide in their studies of biological problems.(Keller, p. 81)

Fortunately, this appears to be changing. Mathematical approaches are flourishing in biology, particularly in the interpretation of large data sets produced by genomic and proteomic studies.  Physicists and engineers are making fundamental contributions to the quantitative understanding of how individual proteins work in their biological context.  But I think it is important to acknowledge that not all biologists think a synthetic, bottom up, approach will yield truths applicable to complex systems that have evolved over billions of years.  Such concerns are not without merit, because as the quotation from Charles Davenport suggests, biology has traditionally had more success when driven by good data rather than theory.  The challenge today is to build quantitatively predictive design tools based on the measured device physics of real biological parts, and to implement designs within organisms in ways that work in the real world.


Thus the present project is truly different from the biology that has come before.  Synthetic Biology is based on an explicit reliance upon mathematical models.  My own particular bent here is in developing technology that enables better measurement of biological systems, so as to test and constrain models and also to provide required capabilities for biological engineering.  Without that, we are stuck with Charles Davenport's criticism of seventy years ago.

"Synthetic Biology" fits, both linguistically and historically.  Why are we stuck on this same damn topic two years after the first meeting?  We have better, and more important, things to worry about.  And lot's of work to do.  Synthetic Biology 3.0 will take place in Zurich, Switzerland, 24-27 June, 2007.

Live from Synthetic Biology 2.0, Part III

Wandering out into the lobby, I found Paul Rabinow and Oliver Morton chatting about the future of DNA synthesis companies.  Oliver is blogging the meeting at his site Mainly Martian.

Earlier I mentioned the new flavor of money at this meeting and the presence of competing DNA synthesis companies.  Oliver is hot onto the story that these companies are already struggling with the fact that large scale synthesis is becoming commoditized, and they may not all be around for long.  (By the way, we are wondering why John Mulligan, founder of Blue Heron and an early entrant into the commercial synthesis game, isn't at the meeting.  His is a conspicuous absence. (UPDATE:  John Mulligan was on a camping trip and is apparently on his way here now.  But we still missed him yesterday.))  It seems like there is already quite a lot of pressure for desktop DNA synthesizers.

David Baltimore is speaking now about engineering the immune system, which I should tune into.

Live from Synthetic Biology 2.0, Part II

This year's meeting has an interesting new flavor, namely that of money.  There are VCs here (yesterday's lunch gave us the interesting sight of Vinod Khosla and Craig Venter sitting off together in a corner, no doubt planning the future of Synthetic Biology), and the list of sponsors is heavy with corporate names.  This is all a great change from SB 1.0, which had a very academic feel.

Yesterday's "Synthesis Panel" was in fact a series of tag team marketing pitches from synthesis company executives, presumably in exchange for their sponsorship of the meeting.  The summary of that session, perhaps unintended, was that all four companies essentially gave quotations to the audience for synthesis jobs: "no more than four weeks, perfect synthesis, buck a base."  We also heard that they are expecting the cost curves to keep up the current pace, and that this time next year synthesis of genes will be $.50 a base.  We heard some discussion of changes in technology, but everybody is still essentially using the same chemistry, just different plumbing.  The presentation from Codon Devices included references to a bunch of interesting methods, including something I predicted/hoped would happen, namely the combination of the synthesis strategy published by Tian, et al., with the MutS purification scheme from Peter Carr at MIT.

The commercial (as opposed to governmental or foundation) money here is an indication that biological technologies are achieving recognition as a significant potential influence on the economy.  I still don't understand how to finesse the IP issues -- I've been working on a blog post and book chapter about "The State of Open Source Biology", or perhaps just "Open Biology", which just aren't ready for release yet.

Carolyn Bertozzi (UC Berkeley) is speaking now, which reveals another interesting thread to this meeting.  Prof. Bertozzi is presenting work on modifying extracellular sugar groups to better understand cell signaling and hopefully get at cancer diagnostics and therapeutics.  People are really waking up to the possibilities of combining powerful biochemistry with synthetic methods for building new pathways with exceptional power and flexibility.

Jack Szostak (Harvard) just stepped up to the microphone to speak about a "Model of Synthetic Protocell."  His protocell is a simple replicating vesicle with its own nucleic acid instruction set, but he doesn't want to use any preexisting biochemical machinery.  "All processes must be spontaneous".  That's ambitious.  He says he doesn't think the work has any particular practical application, but I suspect that is just a matter of time.

Live from Synthetic Biology 2.0

I'm sitting in Synthetic Biology 2.0 at UC Berkeley.  Talks started off with energy applications, which is interesting.  Evidently there was also a big VC meeting in the last few days that focused on SB applications to producing energy.

Steve Chu (Nobel Laureate in Physics and Director of LBL) led off the talks with proposals that "excess" crop land in the US could be used to "grow energy", by producing appropriate plants and methods via synthetic biology.  He mentioned that despite global population expansion by a factor of ~2.5 in the last 60 years, cultivated land has only increased by ~10-15% due to increases in productivity.  But Chu made no mention of the problem that we have trashed lots of crop land in the last 50 years, and it isn't obvious that we could use large amounts of land to grow energy given the state of the soil.  More importantly, he made no mention of where we would get all the water to grow those energy crops.

I am all for growing our energy sustainably, of course, but I don't think that terrestrial crops have a hope of being the right answer.  The best meme from Chu's talk: start with the most efficient engine design, figure out the best fuel for that engine, then design an organism to produce that fuel.  Cool.

Craig Venter is speaking now.  Lots on minimal genomes and looking at alternative pathways for directly producing energy.  Direct photosynthetic production of methane, etc.

More soon.