The Arrival of Nanopore Sequencing

(Update 1 March: Thanks to the anonymous commenter who pointed out the throughput estimates for existing instruments were too low.)

You may have heard a little bit of noise about nanopore sequencing in recent weeks.  After many years of development, Oxford Nanopore promises that by the end of the year we will be able to read DNA sequences by threading them through the eye of a very small needle.

How It Works: Directly Reading DNA

The basic idea is not new: as a long string of DNA passes through a small hole, its components -- the bases A, T, G, and C -- plug that hole to varying degrees.  As they pass through the hole, in this case an engineered pore protein derived from one found in nature, each base has slightly different interactions with the walls of the pore.  As a result, while passing through the pore each base lets a different number of salt ions through, which allows one to distinguish between the bases by measuring changes in electrical current.  Because this method is a direct physical interrogation of the chemical structure of each base, it is in principle much, much faster than any of the indirect sequencing technologies that have come before.
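(For the code-minded, here is a toy sketch of the principle, with entirely made-up current levels: if each base produced its own characteristic blockade current, base calling would amount to matching each measurement to the nearest reference level.)

```python
# Toy illustration only: nanopore base calling as nearest-level matching.
# The current values are made up; real signals are noisy and depend on
# several bases at once (see the note near the end of this post).
REFERENCE_LEVELS = {"A": 52.0, "C": 48.0, "G": 45.0, "T": 50.0}  # hypothetical picoamps

def call_base(measured_current):
    """Return the base whose reference blockade current is closest to the measurement."""
    return min(REFERENCE_LEVELS, key=lambda base: abs(REFERENCE_LEVELS[base] - measured_current))

if __name__ == "__main__":
    fake_trace = [51.8, 47.7, 45.3, 50.2, 52.1]
    print("".join(call_base(current) for current in fake_trace))  # -> ACGTA
```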

There have been a variety of hurdles to clear to get nanopore sequencing working.  First, you have to use a pore that is small enough to produce measurable changes in current.  Next, the speed of the DNA must be carefully controlled so that the signal-to-noise ratio is high enough.  The pore must also sit in an insulating membrane of some sort, surrounded by the necessary electrical circuitry, and to become a useful product the whole thing must be easily assembled in an industrial manner and be mechanically stable through shipping and use.

Oxford Nanopore claims to have solved all those problems.  They recently showed off a disposable version of their technology -- called the MinIon -- with 512 pores built into a USB stick.  This puts to shame the Lava Amp, my own experiment with building a USB peripheral for molecular biology.  Here is one part I find extremely impressive -- so impressive it is almost hard to believe: Oxford claims they have reduced the sample handling to a single (?) pipetting step.  Clive Brown, Oxford CTO, says "Your fluidics is a Gilson."  (A "Gilson" would be a brand of pipetter.)  That would be quite something.

I've spent a good deal of my career trying to develop simple ways of putting biological samples into microfluidic doo-dads of one kind or another.  It's never trivial, it's usually a pain in the ass, and sometimes it's a showstopper.  Blood, in particular, is very hard to work with.  If Oxford has made this part of the operation simple, then they have a winning technology just based on everyday ease of use -- what sometimes goes by the labels of "user experience" or "human factors".  Compared to the complexity of many other laboratory protocols, it would be like suddenly switching from MS DOS to OS X in one step.

How Well Does it Work?

The challenge for fast sequencing is to combine throughput (bases per hour) with read length (the number of contiguous bases read in one go).  Existing instruments have throughputs in the range of 10-55,000 megabases/day and read lengths from tens of bases to about 800 bases.  (See chart below.)  Nick Loman reports that using the MinIon Oxford has already run DNA of 5,000 to 100,000 bases (5 kb to 100 kb) at speeds of 120-1000 bases per minute per pore, though accuracy suffers above 500 bases per minute.  So a single USB stick can easily run at 150 megabases (Mb) per hour, which basically means you can sequence full-length eukaryotic chromosomes in about an hour.  Over the next year or so, Oxford will release the GridIon instrument, which will have 4 and then 16 times as many pores.  Presumably that means it will be up to 16 times as fast.  The long read lengths mean that processing the resulting sequence data, which usually takes longer than the actual sequencing itself, will be much, much faster.
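To make the arithmetic behind "about an hour" explicit, here is a minimal back-of-the-envelope calculator.  The 150 Mb/hour figure is the press-reported number quoted above, not a measured spec; the target sizes are approximate genome and chromosome lengths.

```python
# Back-of-the-envelope run times at the device-level throughput quoted above
# (150 megabases per hour for a single MinIon), assuming 1x raw coverage.
MINION_MB_PER_HOUR = 150  # press-reported figure, not a measured spec

targets_mb = {
    "E. coli genome": 4.6,
    "Yeast genome": 12,
    "Human chromosome 21": 47,
    "Human chromosome 1": 249,
}

for name, size_mb in targets_mb.items():
    minutes = size_mb / MINION_MB_PER_HOUR * 60
    print(f"{name}: ~{minutes:.0f} minutes for 1x raw coverage")

# A GridIon with 16 times as many pores should cut these times by roughly 16,
# assuming throughput scales linearly with pore count.
```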

This is so far beyond existing commercial instruments that it sounds like magic.  Writing in Forbes, Matthew Herper quotes Jonathan Rothberg, of sequencing competitor Ion Torrent, as saying "With no data release how do you know this is not cold fusion? ... I don't believe it."  Oxford CTO Clive Brown responded to Rothberg in the comments to Herper's post in a very reasonable fashion -- have a look.

Of course I want to see data as much as the next fellow, and I will have to hold one of those USB sequencers in my own hands before I truly believe it.  Rothberg would probably complain that I have already put Oxford on the "performance tradeoffs" chart before they've shipped any instruments.  But given what I know about building instruments, I think immediately putting Oxford in the same bin as cold fusion is unnecessary.

Below is a performance comparison of sequencing instruments originally published by Bio-era in Genome Synthesis and Design Futures in 2007.  (Click on it for a bigger version.)  I've hacked it up to include the approximate performance range of 2nd generation sequencers from Life, Illumina, etc, as well as for a single MinIon.  That's one USB stick, with what we're told is a few minutes worth of sample prep.  How many can you run at once?  Notice the scale on the x-axis, and the units on the y-axis.  If it works as promised, the MinIon is so vastly better than existing machines that the comparison is hard to make.  If I replotted that data with a log axis along the bottom then all the other technologies would be cramped up together way off to the left. (The data comes from my 2003 paper, The Pace and Proliferation of Biological Technologies (PDF), and from Service, 2006, The Race for the $1000 Genome).
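Purely as an illustration of that log-axis point, here is a minimal matplotlib sketch that re-plots only the coarse ranges quoted in this post; it is not a re-plot of the underlying datasets.

```python
# Illustrative only: re-plot the coarse performance ranges quoted in this post
# on log-log axes. These are not the underlying data from the 2003 paper or
# from Service (2006).
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle

fig, ax = plt.subplots()

# Existing instruments: read lengths ~10-800 bases, throughput ~10-55,000 Mb/day.
ax.add_patch(Rectangle((10, 10), 800 - 10, 55_000 - 10, alpha=0.3,
                       label="Existing instruments (range quoted above)"))

# MinIon: reads of 5,000-100,000 bases; ~150 Mb/hour quoted above is ~3,600 Mb/day.
ax.plot([5_000, 100_000], [3_600, 3_600], linewidth=4, label="MinIon (claimed)")

ax.set_xscale("log")
ax.set_yscale("log")
ax.set_xlim(1, 1e6)
ax.set_ylim(1, 1e6)
ax.set_xlabel("Read length (bases)")
ax.set_ylabel("Throughput (megabases per day)")
ax.legend()
plt.show()
```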
 
Carlson_sequencer_performanc_2012.png

The Broader Impact

Later this week I will try to add the new technologies to the productivity curve published in the 2003 paper.  Here's what it will show: biological technologies are improving at exceptional paces, leaving Moore's Law behind.  This is no surprise, because while biology is getting cheaper and faster, the density of transistors on chips is set by very long term trends in finance and by SEMATECH; designing and fabricating new semiconductors is crazy expensive and requires coordination across an entire industry. (See The Origin of Moore's Law and What it May (Not) Teach Us About Biological Technologies.)  In fact, we should expect biology to move much faster than semiconductors. 

Here are a few graphs from the 2003 paper:

...The long term distribution and development of biological technology is likely to be largely unconstrained by economic considerations. While Moore's Law is a forecast based on understandable large capital costs and projected improvements in existing technologies, which to a great extent determined its remarkably constant behavior, current progress in biology is exemplified by successive shifts to new technologies. These technologies share the common scientific inheritance of molecular biology, but in general their implementations as tools emerge independently and have independent scientific and economic impacts. For example, the advent of gene expression chips spawned a new industrial segment with significant market value. Recombinant DNA, gel and capillary sequencing, and monoclonal antibodies have produced similar results. And while the cost of chip fabs has reached upwards of one billion dollars per facility and is expected to increase [2012 update: it's now north of $6 billion], there is good reason to expect that the cost of biological manufacturing and sequencing will only decrease. [Update 2012: See "New Cost Curves" for DNA synthesis and sequencing.]

These trends--successive shifts to new technologies and increased capability at decreased cost--are likely to continue. In the fifteen years that commercial sequencers have been available, the technology has progressed ... from labor intensive gel slab based instruments, through highly automated capillary electrophoresis based machines, to the partially enzymatic Pyrosequencing process. These techniques are based on chemical analysis of many copies of a given sequence. New technologies under development are aimed at directly reading one copy at a time by directly measuring physical properties of molecules, with a goal of rapidly reading genomes of individual cells.  While physically-based sequencing techniques have historically faced technical difficulties inherent in working with individual molecules, an expanding variety of measurement techniques applied to biological systems will likely yield methods capable of rapid direct sequencing.

Cue nanopore sequencing. 

A few months ago I tweeted that I had seen single-strand DNA sequence data generated using a nanopore -- it wasn't from Oxford. (Drat, can't find the tweet now.)  I am certain there are other labs out there making similar progress.  On the commercial front, Illumina is an investor in Oxford, and Life has invested in Genia.  As best I can tell, once you get past the original pore sequencing IP, which appears to be licensed broadly, there are many measurement approaches, many pores, and many membranes that could be integrated into a device.  In other words, money and time will be the primary barriers to entry.

(For the instrumentation geeks out there, because the pore is larger than a single base, the instrument actually measures the current as three bases pass through the pore.  Thus you need to be able to distinguish 4^3=64 levels of current, which Oxford claims they can do.  The pore set-up I saw in person worked the same way, so I certainly believe this is feasible.  Better pores and better electronics might reduce the physical sampling to 1 or 2 bases eventually, which should result in faster instruments.)
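A toy version of that bookkeeping, with fabricated current levels, just to show where the 64 states come from and how overlapping triplet calls stitch back into a sequence:

```python
# Toy triplet readout: with 4 bases and 3 bases in the pore at once, there are
# 4**3 = 64 possible current states. The levels below are fabricated purely for
# illustration; a real instrument has to resolve 64 noisy, overlapping levels.
from itertools import product

BASES = "ACGT"
KMERS = ["".join(p) for p in product(BASES, repeat=3)]
assert len(KMERS) == 4 ** 3 == 64

# Give each 3-mer an arbitrary, distinct reference level (illustrative only).
LEVELS = {kmer: 40.0 + 0.5 * i for i, kmer in enumerate(KMERS)}

def call_kmer(measured):
    """Return the 3-mer whose reference level is nearest the measured current."""
    return min(LEVELS, key=lambda k: abs(LEVELS[k] - measured))

def decode(trace):
    """Stitch successive 3-mer calls (each overlapping the last by two bases) into a sequence."""
    kmers = [call_kmer(m) for m in trace]
    return kmers[0] + "".join(k[-1] for k in kmers[1:])

if __name__ == "__main__":
    # Measurements chosen to encode AAC, ACG, CGT -> the 5-base call "AACGT".
    print(decode([40.6, 42.9, 53.4]))
```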

It may be that Oxford will have a first mover advantage for nanopore instruments, and it may be that they have amassed sufficient additional IP to make it rough for competitors.  But, given the power of the technology, the size of the market, and the number of academic competitors, I can't see that over the long term this remains a one-company game.

Not every sequencing task has the same technical requirements, so instruments like the Ion Torrent won't be put to the curbside.  And other technologies will undoubtedly come along that perform better in some crucial way than Oxford's nanopores.  We really are just at the beginning of the revolution in biological technologies.  Recombinant DNA isn't even 40 years old, and the electronics necessary for nanopore measurements only became inexpensive and commonplace in the last few years.  However impressive nanopore sequencing seems today, the greatest change is yet to come.

Bumps for Biofuels and Growing Pains for the BioEconomy

I found this post, written in early 2008, for some reason sitting unpublished in my archives.  It is just as relevant today, now that we are through the worst of the economic meltdown, so I'll push the "publish" button in just a moment.  I updated the revenue numbers for the US, but otherwise it is unchanged.  I note that high farm prices are again putting pressure on the amount of land in the conservation reserve program.

---------------

Just as we reach the point where biological technologies can begin to economically replace the industrial chemistry we have relied on for the last two centuries, the price of raw materials is going through the roof.  As explored in my recent article, "Laying the Foundations for a Bio-Economy", the contribution of "genetically modified stuff" to the U.S. economy already amounts to the equivalent of more than 2% of GDP, or north of $300 billion.  [See the Biodesic 2011 Bioeconomy Update for the updated revenue numbers.]  About 80% of this total is from agriculture and industrial products, where revenues from the latter are growing 15-20% a year.  But as more products of industrial biotechnology hit the market, they will compete for more expensive feedstock resources.

The New York Times carried two stories on 9 April that illustrate some of the attendant issues.  In "Harnessing Biology, and Avoiding Oil, for Chemical Goods", Yudhijit Bhattacharjee gives a short summary of the shift from chemistry to biology for producing everything from plastic to fuel.  I've written here before about DuPont's success with Sorona, a plastic made using corn processed by engineered bacteria.  By itself, Sorona is already a billion-dollar product.  It seems DuPont has discovered additional uses for materials that are produced using biology:

The payoffs from developing biobased chemicals could be huge and unexpected, said Dr. John Pierce, DuPont's vice president for applied biosciences-technology. He pointed to DuPont's synthesis of propanediol, which was pushed along by the company's goal to use the chemical to make Sorona, a stain-resistant textile that does not lose color easily.

Soon DuPont scientists realized that bioderived propanediol could also be used as an ingredient in cosmetics and products for de-icing aircraft. The high-end grades that are now used in cosmetics are less irritating than traditional molecules, Dr. Pierce said, and the industrial grade used in de-icing products is biodegradable, which makes it better than other options.

DuPont is, of course, not the only one in this game.  Cathay Industrial Biotech, for example, ships many different polymers composed of long-chain dicarboxylic acids, which are derived from corn and used in anticorrosion products for cars.  Both firms are buying more corn just as prices for commodities are headed through the roof.  Higher prices are now leading U.S. farmers to pull land out of conservation programs for use in producing more crops, as described by David Streitfeld in "As Prices Rise, Farmers Spurn Conservation Program".  Corn, wheat, and soy prices are all up, but so are the prices of oil and fertilizer.

Ostensibly, the Conservation Reserve Program pays farmers to keep environmentally sensitive land out of production.   In the context of a grain surplus, this has the effect of reducing the total amount of land in production, thereby keeping prices a bit higher.  But the surplus of recent decades is over, due in large part to increases in demand in developing countries (see, for example, my post "China and Future Resource Demands"). 

The utility of keeping lands in conservation programs is debated intensely by a range of interested parties, including farmers, policy makers, conservationists, hunters, and even bakers.  From Streitfeld's article:

"We're in a crisis here. Do we want to eat, or do we want to worry about the birds?" asked JR Paterakis, a Baltimore baker who said he was so distressed at a meeting last month with Edward T. Schafer, the agriculture secretary, that he stood up and started speaking "vehemently."

The Paterakis bakery, H&S, produces a million loaves of rye bread a week. The baker said he could not find the rye flour he needed at any price.   

..."The pipeline for wheat is empty," said Michael Kalupa, a bakery owner in Tampa, Fla., who is president of the Retail Bakers of America. Mr. Kalupa said the price he paid for flour had doubled since October. He cannot afford to absorb the cost and he cannot afford to pass it on. Sales have been falling 16 percent to 20 percent a month since October. He has laid off three employees.

Among farmers, the notion of early releases from conservation contracts is prompting sharp disagreement and even anger. The American Soybean Association is in favor. "We need more food," said John Hoffman, the association's president.

The National Association of Wheat Growers is against, saying it believes "in the sanctity of contracts." It does not want more crops to be grown, because commodity prices might go down.

That is something many of its members say they cannot afford, even with wheat at a robust $9 a bushel. Their own costs have increased, with diesel fuel and fertilizer up sharply. "It would decrease my profit margin, which is slim," said Jeff Krehbiel of Hydro, Okla. "Let's hurt the farmer in order to shut the bakers up, is that what we're saying?"

Mr. Krehbiel said his break-even last year was $4 a bushel. This summer it will be $6.20; the next crop, $7.75.

That a baker in the U.S. can't even find the flour he needs is remarkable, though it may not actually be a harbinger of food shortages.  One reason that baker is having trouble is no doubt an increase in demand, and another, equally without doubt, is the shift in grain production priorities to accommodate increased use of biofuels.

Much in the news the last couple of months has been the assertion that production and use of biofuels is largely responsible for recent increases in food prices.  But how much of the price increase is due to shifting crops to fuel use?

Censoring Science is Detrimental to Security

Restricting access to science and technology in the name of security is historically a losing proposition.  Censorship of information that is known to exist incentivizes innovation and rediscovery. 

As most readers of this blog know, there has been quite a furor over new results demonstrating mutations in H5N1 influenza strains that are both deadly and highly contagious in mammals.  Two groups, led by Ron Fouchier in the Netherlands and Yoshihiro Kawaoka at The University of Wisconsin, have submitted papers to Nature and Science describing the results.  The National Science Advisory Board for Biosecurity (NSABB) has requested that some details, such as sequence information, be omitted from publication.  According to Nature, both journals are "reserving judgement about whether to censor the papers until the US government provides details of how it will allow genuine researchers to obtain redacted information".

For those looking to find more details about what happened, I suggest starting with Doreen Carvajal's interview with Fouchier in the New York Times, "Security in Flu Study Was Paramount, Scientist Says"; Katherine Harmon's firsthand account of what actually happened when the study was announced; and Heidi Ledford's post at Nature News about the NSABB's concerns.

If you want to go further, there is more good commentary, especially the conversation in the comments (including from a member of the NSABB), in "A bad day for science" by Vincent Racaniello.  See also Michael Eisen's post "Stop the presses! H5N1 Frankenflu is going to kill us all!", keeping in mind that Eisen used to work on the flu.

Writing at Foreign Policy, Laurie Garrett has done some nice reporting on these events in two posts, "The Bioterrorist Next Door" and "Flu Season".  She suggests that attempts to censor the results would be futile: "The genie is out of the bottle: Eager graduate students in virology departments from Boston to Bangkok have convened journal-review debates reckoning exactly how these viral Frankenstein efforts were carried out."

There is much I agree with in Ms. Garrett's posts.  However, I must object to her assertion that the work done by Fouchier and Kawaoka can be repeated easily using the tools of synthetic biology.  She writes "The Fouchier episode laid bare the emptiness of biological-weapons prevention programs on the global, national, and local levels.  Along with several older studies that are now garnering fresh attention, it has revealed that the political world is completely unprepared for the synthetic-biology revolution."   As I have already written a book that discusses this confusion (here is an excerpt about synthetic biology and the influenza virus), it is not actually what I want to write about today.  But I have to get this issue out of the way first.

As far as I understand from reading the press accounts, both groups used various means to create mutations in the flu genome and then selected viruses with properties they wanted to study.  To clarify, from what I have been able to glean from the sparse accounts thus far, DNA synthesis was not used in the work.  And as far as I understand from reading the literature and talking to people who build viruses for a living, it is still very hard to assemble a functioning, infectious influenza virus from scratch.   

If it were easy to write pathogen genomes -- particularly flu genomes -- from scratch, we would quite frankly be in deep shit. But, for the time being, it is hard.  And that is important.  Labs who do use synthetic biology to build influenza viruses, as with those who reconstructed the 1918 H1N1 influenza virus, fail most of the time despite great skill and funding.  Synthesizing flu viruses is simply not a garage activity.  And with that, I'll move on.

Regardless of how the results might be reproduced, many have suggested that the particular experiments described by Fouchier and Kawaoka should not have been allowed.  Fouchier himself acknowledges that selecting for airborne viruses was not the wisest experiment he could have done; it was, he says, "really, really stupid".  But the work is done, and people do know about it.  So the question of whether this work should have been done in the first place is beside the point.  If, as Michael Eisen suggests, "any decent molecular biologist" could repeat the work, then it was too late to censor the details as soon as the initial report came out. 

I am more interested in the consequences of trying to contain the results while somehow allowing access to vetted individuals.  Containing the results is as much about information security as it is biological security.  Once such information is created, the challenge is to protect it, to secure it.  Unfortunately, the proposal to allow secure access only by particular individuals is at least a decade (if not three decades) out of date.

Any attempt to secure the data would have to start with an assessment of how widely it is already distributed.  I have yet to meet an academic who regularly encrypts email, and my suspicion is that few avail themselves of the built-in encryption on their laptops.  So, in addition to the university computers and email servers where the science originated, the information is sitting in the computers of reviewers, on servers at Nature and Science, at the NSABB, and, depending on how the papers were distributed and discussed by members of the NSABB, possibly on their various email servers and individual computers as well.  And let's not forget the various unencrypted phones and tablets all of those reviewers now carry around.

But never mind that for a moment.  Let's assume that all these repositories of the relevant data are actually secure.  The next step is to arrange access for selected researchers.  That access would inevitably be electronic, requiring secure networks, passwords, etc.  In the last few days the news has brought word that the intelligence firm Stratfor and the computer security firm Symantec have evidently been hacked.  Such attacks are not uncommon.  Think back over the last couple of years: hacks at Google, various government agencies, universities.  Credit card numbers, identities, and supposedly secret DoD documents are all for sale on the web.  To that valuable information we can now add a certain list of influenza mutations.  If those mutations are truly a critical biosecurity risk -- as asserted publicly by various members of the NSABB -- then that data has value far beyond its utility in virology and vaccinology.

The behavior of various hackers (governments, individuals, and others) over the last few years makes clear that the discussion thus far has stuck a giant "HACK HERE" sign on the data.  Moreover, if Ms. Garrett is correct that students across the planet are busy reverse engineering the experiments because they don't have access to the original methods and data, then censorship is creating a perverse incentive for innovation.  Given today's widespread communication, restriction of access to data is an invitation, not a proscription.

This same fate awaits any concentration of valuable data.  It obviously isn't a problem limited to collections of sensitive genetic sequences or laboratory methods.  And there is certainly a case to be made for attempting to maintain confidential or secret caches of data, whether in the public or private interest.  In such instances, compartmentalization and encryption must be implemented at the earliest stages of communication in order to have any hope of maintaining security. 

However, in this case, if it is true that reverse engineering the results is straightforward, then restriction of access serves only to slow down the general process of science.  Moreover, censorship will slow the development of countermeasures.  It is unlikely that any collection of scientists identified by the NSABB or the government will be sufficient to develop all the technology we need to respond to natural pathogens, let alone any artificial ones.

As with most other examples of prohibition, these restrictions are doomed before they are even implemented.  Censorship of information that is known to exist incentivizes innovation and rediscovery.  As I explored in my book, prohibition in the name of security is historically a losing proposition.  Moreover, science is inherently a networked human activity that is fundamentally incompatible with constraints on communication, particularly of results that are already disclosed.  Any endeavor that relies upon science -- in this case, developing technologies to defend against natural and artificial pathogens -- is therefore also fundamentally incompatible with constraints on communication.  Censorship threatens not just science but also our security.

Further Thoughts on iGEM 2011

Following up on my post of several weeks ago (iGEM 2011: First Thoughts), here is a bit more on last year's Jamboree.  I remain very, very impressed by what the teams did this year.  And I think that watching iGEM from here on out will provide a sneak peek of the future of biological technologies.

I think the biggest change from last year is the choice of applications, which I will describe below.  And related to the choice of applications is a change of approach to follow a more complete design philosophy.  I'll get to the shift in design sensibility further on in the post.

The University of Washington: Make it or Break it

I described previously the nuts and bolts of the University of Washington's Grand Prize winning projects.  But, to understand the change in approach (or perhaps change in scope?) this project represents, you also have to understand a few details about problems in the real world.  And that is really the crux of the matter -- teams this year took on real world problems as never before, and may have produced real world solutions.

Recall that one of the UW projects was the design of an enzyme that digests gluten, with the goal of using that enzyme to treat gluten intolerance.  Candidate enzymes were identified through examining the literature, with the aim of finding something that works at low pH.  The team chose a particular starter molecule, and then used the "video game" Foldit to re-design the active site in silico so that it would chew up gluten (here is a very nice Youtube video on the Foldit story from Nature).  They then experimentally tested many of the potential improvements.  The team wound up with an enzyme that in a test tube is ~800 times better than one already in clinical trials.  While the new enzyme would of course itself face lengthy clinical trials, the team's achievement could have an enormous impact on people who suffer from celiac disease, among many other ailments.

From a story in last week's NYT Magazine ("Should We All Go Gluten-Free?"), here are some eye-opening stats on celiac disease, which can cause symptoms ranging from diarrhea to dramatic weight loss:

  • Prior to 2003, prevalence in the US was thought to be just 1 in 10,000: widespread testing revealed the actual rate was 1 in 133.
  • Current estimates are that 18 million Americans have some sort of gluten intolerance, which is about 5.8% of the population.
  • Based on analysis of old blood samples, young people were 5x more likely to have the disease in the 1990s than in the 1950s.
  • Prevalence is increasing not just in US, but also worldwide.

In other words, celiac disease is a serious metabolic issue that for some reason is affecting ever larger parts of the global population.  And as a summer project a team of undergraduates may have produced a (partial) treatment for the disease.  That eventual treatment would probably require tens of millions of dollars of further investment and testing before it reaches the market.  However, the market for gluten-free foods, as estimated in the Times, is north of $6 billion and growing rapidly.  So there is plenty of market potential to drive investment based on the iGEM project.

The other UW project is a demonstration of using E. coli to directly produce diesel fuel from sugar.  The undergraduates first reproduced work published last year from LS9 in which E. coli was modified to produce alkanes (components of diesel fuel -- here is the Science paper by Schirmer et al).  Briefly, the UW team produced biobricks -- the standard format used in iGEM -- of two genes that turn fatty acids into alkanes.  Those genes were assembled into a functional "Petrobrick".  The team then identified and added a novel gene to E. coli that builds fatty acids from 3 carbon seeds (rather than the native coli system that builds on 2 carbon seeds).  The resulting fatty acids then served as substrates for the Petrobrick, resulting in what appears to be the first report anywhere of even-chain alkane synthesis.  All three genes were packaged up into the "FabBrick", which contains all the components needed to let E. coli process sugar into a facsimile of diesel fuel.

The undergraduates managed to substantially increase the alkane yield by massaging the culture conditions, but the final yield is a long way from being useful to produce fuel at volume.  But again, not bad for a summer project.  This is a nice step toward turning first sugar, then eventually cellulose, directly into liquid fuels with little or no purification or post-processing required.  It is, potentially, also a step toward "Microbrewing the Bioeconomy".  For the skeptics in the peanut gallery, I will be the first to acknowledge that we are probably a long way from seeing people economically brew up diesel in their garage from sugar.  But, really, we are just getting started.  Just a couple of years ago people thought I was all wet forecasting that iGEM teams would contribute to technology useful for distributed biological manufacturing of fuels.  Now they are doing it.  For their summer projects.  Just wait a few more years.

Finally -- yes, there's more -- the UW team worked out ways to improve the cloning efficiency of so-called Gibson cloning.  They also packaged up as biobricks all the components necessary to produce magnetosomes in E. coli.  The last two projects didn't make it quite as far as the first two, but still made it further than many others I have seen in the last 5 years.

Before moving on, here is a thought about the mechanics of participating in iGEM.  I think the UW wiki is about the best I have seen.   I like very much the straightforward presentation of hypothesis, experiments, and results.  It was very easy to understand what they wanted to do, and how far they got.  Here is the "Advice to Future iGEM Teams" I posted a few years ago.  Aspiring iGEM teams should take note of the 2011 UW wiki -- clarity of communication is part of your job.

Lyon-INSA-ENS: Cobalt Buster

The team from Lyon took on a very small problem: cleaning up cooling water from nuclear reactors using genetically modified bacteria.  This was a nicely conceived project that involved identifying a problem, talking to stakeholders, and trying to provide a solution.  As I understand it, there are ongoing discussions with various sponsors about funding a start-up to build prototypes.  It isn't obvious that the approach is truly workable as a real world solution -- many questions remain -- but the progress already demonstrated indicates that dismissing this project would be premature.

Before continuing, I pause to reflect on the scope of Cobalt Buster.  One does wonder about the eventual pitch to regulators and the public: "Dear Europe, we are going to combine genetically modified organisms and radiation to solve a nuclear waste disposal problem!"  As the team writes on its Human Practices page: "In one project, we succeed to gather Nuclear Energy and GMOs. (emphasis in original)"  They then acknowledge the need to "focus on communication".  Indeed.

Here is the problem they were trying to solve: radioactive cobalt (Co) is a contaminant emitted during maintenance of nuclear reactors.  The Co is typically cleaned up with ion exchange resins, which are expensive and which, once used up, must be appropriately disposed of as nuclear waste.  By inserting a Co importer pump into E. coli, the Lyon team hopes to use bacteria to concentrate the Co and thereby clean up reactor cooling water.  That sounds cool, but the bonus here is that modelling of the system suggests that using E. coli as a biofilter in this way would result in substantially less waste.  The team reports that they expect 8000 kg of ion exchange resins could be replaced with 4 kg of modified bacteria.  That factor of 2000 reduction in waste would have a serious impact on disposal costs.  And the modified bug appears to work in the lab (with nonradioactive cobalt), so this story is not just marketing.
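A quick sanity check on that claim, with the disposal cost left as an explicitly hypothetical parameter:

```python
# Quick sanity check on the Lyon team's waste claim: 8000 kg of ion exchange
# resin replaced by 4 kg of engineered bacteria. The disposal cost per kilogram
# is a placeholder assumption, not a number from the team.
RESIN_KG = 8000
BACTERIA_KG = 4
DISPOSAL_COST_PER_KG = 100.0  # hypothetical, in arbitrary currency units

reduction_factor = RESIN_KG / BACTERIA_KG  # -> 2000x less material to dispose of
savings_per_change = (RESIN_KG - BACTERIA_KG) * DISPOSAL_COST_PER_KG

print(f"{reduction_factor:.0f}x reduction in waste mass per filter change")
print(f"~{savings_per_change:,.0f} currency units saved per change at the assumed disposal cost")
```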

The Lyon team also inserted a Co sensor into their E. coli strain.  The sensor then drove expression of a protein that forms amyloid fibers, causing the coli in turn to form a biofilm.  This biofilm would stabilize the biofilter in the presence of Co.  The filter would only be used for a few hours before being replaced, which would not give the strain enough time to lose this circuit via selection.

Imperial College London: Auxin

Last, but certainly not least, is the very well thought through Imperial College project to combat soil erosion by encouraging plant root growth.  I saved this one for last because, for me, the project beautifully reflects the team's intent to carefully consider the real-world implications of their work.  There are certainly skeptics out there who will frown on the extension of iGEM into plants, and who feel the project would never make it into the field due to the many regulatory barriers in Europe.  I think the skeptics are completely missing the point.

To begin, a summary of the project: the Imperial team's idea was to use bacteria as a soil treatment, applied in any number of ways, that would be a cost-effective means of boosting soil stability through root growth.  The team designed a system in which genetically modified bacteria would be attracted to plant roots, would then take up residence in those roots, and would subsequently produce a hormone that encourages root growth.

The Auxin system was conceived to combine existing components in very interesting ways.  Naturally-occurring bacteria have already been shown to infiltrate plant roots, and other soil-dwelling bacteria produce the same growth hormone that encourages root proliferation.

Finally, the team designed and built a novel (and very clever) system for preventing leakage of transgenes through horizontal gene transfer.  On the plasmid containing the root growth genes, the team also included genes that produce proteins toxic to bacteria.  But in the chromosome, they included an anti-toxin gene.  Thus if the plasmid were to leak out and be taken up by a bacterium without the anti-toxin gene, any gene expression from the plasmid would kill the recipient cell.

The team got many of these pieces working independently, but didn't quite get the whole system working together in time for the international finals.  I encourage those interested to have a look at the wiki, which is really very good.

The Shift to Thinking About Design

As impressive as Imperial's technical results were, I was also struck by the integration of "human practices" into the design process.  The team spoke to farmers, economists, Greenpeace -- the list goes on -- as part of both defining the problem and attempting to finesse a solution given the difficulty of fielding GMOs throughout the UK and Europe.  And these conversations very clearly impacted the rest of the team's activities.

One of the frustrations felt by iGEM teams and judges alike is that "human practices" has often felt like something tacked on to the science for the sake of placating potential critics.  There is something to that, as the Ethical, Legal, and Social Implications (ELSI) components of large federal projects such as The Human Genome Project and SynBERC appear to have been tacked on for just that reason.  Turning "human practices" into an appendix on the body of science is certainly not the wisest way to go forward, for reasons I'll get to in a moment, nor is it politically savvy in the long term.  But if the community is honest about it, tacking on ELSI to get funding has been a successful short-term political hack.

The Auxin project, along with a few other events during the finals, helped crystallize for me the disconnect between thinking about "human practices" as a mere appendix while spouting off about how synthetic biology will be the core of a new industrial revolution, as some of us tend to do.  Previous technological revolutions have taught us the importance of design, of thinking the whole project through at the outset in order to get as much right as possible, and to minimize the stuff we get wrong.  We should be bringing that focus on design to synthetic biology now.

I got started down this line of thought during a very thought-provoking conversation with Dr. Megan Palmer, the Deputy Director for Practices at SynBERC.  (Apologies to you, Megan, if I step on your toes in what follows -- I just wanted to get these thoughts on the page before heading out the door for the holidays.)  The gist of my chat with Megan was that the focus on safety and security as something else, as an activity separate from the engineering work of SB, is leading us astray.  The next morning, I happened to pass Pete Carr and Mac Cowell having a chat just as one of them was saying, "The name human practices sucks. We should really change the name."  And then my brain finally -- amidst the jet lag and 2.5 days of frenetic activity serving as a judge for iGEM -- put the pieces together.  The name does suck.  And the reason it sucks is that it doesn't really mean anything.

What the names "human practices" and "ELSI" are trying to get at is the notion that we shouldn't stumble into developing and using a powerful technology without considering the consequences.  In other fields, whether you are thinking about building a chair, a shoe, a building, an airplane, or a car, in addition to the shape you usually spend a great deal of time thinking about where the materials come from, how much the object costs to make, how it will be used, who will use it, and increasingly how it will be recycled at end of use.  That process is called design, and we should be practicing it as an integral part of manipulating biological systems.

When I first started as a judge for iGEM, I was confused by the kind of projects that wound up receiving the most recognition.  The prizes were going to nice projects, sure, but those projects were missing something from my perspective.  I seem to recall protesting at some point in that first year that "there is an E in iGEM, and it stands for Engineering."  I think part of that frustration was that the pool of judges was dominated for many years by professors funded by the NIH, the NRC, or the Wellcome Trust, for example -- scientists who were looking for the kind of scientific results that grace the pages of Science or Nature -- rather than engineers, hackers, or designers who were looking for examples of, you know, engineering.

My point is not that the process of science is deficient, nor that all lessons from engineering are good -- especially as for years my own work has fallen somewhere in between science and engineering.  Rather, I want to suggest that, given the potential impact of all the science and engineering effort going into manipulating biological systems, everyone involved should be engaging in design.  It isn't just about the data, nor just about shiny objects.  We are engaged in sorting out how to improve the human condition, which includes everything from uncovering nature's secrets to producing better fuels and drugs.  And it is imperative that as we improve the human condition we do not diminish the condition of the rest of the life on this planet, as we require that life to thrive in order that we may thrive.

Which brings me back to design.  It is clear that not every experiment in every lab that might move a gene from one organism to another must consider the fate of the planet as part of the experimental design.  Many such experiments have no chance of impacting anything outside the test tube in which they are performed.  But the practice of manipulating biological systems should be done in the context of thinking carefully about what we are doing -- much more carefully than we have been, generally speaking.  Many fields of human endeavor can contribute to this practice.  There is a good reason that ELSI has "ethical", "legal", and "social" in it.

There have been a few other steps toward the inclusion of design in iGEM over the years.  Perhaps the best example is the work designers James King and Daisy Ginsberg did with the 2009 Grand Prize Winning team from Cambridge (see iGEM 2009: Got Poo?).  That was lovely work, and was cleverly presented in the "Scatalog".  You might argue that the winners over the years have had increasingly polished presentations, and you might worry that style is edging out substance.  But I don't think that is happening.  The steps taken this year by Imperial, Lyon, and Washington toward solving real-world problems were quite substantive, even if those steps are just the beginning of a long path to get solutions into people's hands.  That is the way innovation works in the real world.

The National Bioeconomy Blueprint

Last week the White House Office of Science and Technology Policy (OSTP) closed a Request for Information for the National Bioeconomy Blueprint.  I previously submitted the Biodesic 2011 Bioeconomy Update as background information, and I then extended my comments with a proposal aimed at "Fostering Economic and Physical Security Through Public-Private Partnerships and a National Network of Community Labs" (PDF).  In short, I proposed that the U.S. government facilitate the founding and operation of community biotech labs as a means to improve the pace of innovation and reduce the attendant level of risk.

Garages are a critical component of technological innovation and job creation in the United States.  Over the last few years the Kauffman Foundation has published analyses of Census data that show start-ups under a year old are responsible for 100% of the net job creation in the U.S.; firms of all other ages are net job destroyers.  Moreover, as I made clear in my testimony before the Presidential Commission for the Study of Bioethical Issues, garages played a crucial role in developing many of the technologies we use on a daily basis.  Thus if we want to maintain a healthy pace of innovation in biological technologies, it makes sense that we will need to foster a profusion of garage biotech labs.

A biotech lab in every garage will make many a policy wonk uneasy.  What about safety and security?  I suggest that the emerging model of community labs (Genspace, Biocurious, etc.) is a good foundation to build on.  The FBI already has a program in place to engage these labs.  And as it turns out, the President has already signed a document that states garage biology is good and necessary for the future physical and economic security of the United States.  The USG could offer grants (financial, equipment, etc) to labs that sign on to follow educational and operational guidelines.  The existence of such labs would facilitate access to infrastructure for innovators and would also facilitate communication with those innovators by the USG.

I will admit that in my early conversations with the founders of Genspace and Biocurious I was skeptical the model would work.  More than a decade ago I put serious effort into figuring out if a commercial bio-incubator model could work, and I concluded that the numbers were nowhere near workable.  I also think it is too early to take real lessons away from the for-profit hackerspaces that are cropping up all over, because there isn't enough of a track record of success.  Anyway, and fortunately, the folks at Genspace and Biocurious ignored me.  And I am glad they did, because I was stuck thinking about the wrong kind of model.  Not-for-profit status and community engagement are definitely the way to go.  I think most medium to large U.S. cities could support at least one community biotech lab.

Where should we put these labs?  I suggest that, following the recent model of installing Fab Labs and Hackspaces in public libraries, the USG should encourage the inclusion within libraries and other underused public spaces of community biotech labs.  There are endless benefits to be had from following this strategy.

I could go on, but there's more in my submission to the OSTP: "Fostering Economic and Physical Security Through Public-Private Partnerships and a National Network of Community Labs" (PDF).

Diffusion of New Technologies

A Tweet and blog post from Christina Cacioppo about technological diffusion led me to dig out a relevant slide and text from my book.  Ms. Cacioppo, reflecting on a talk she just saw, asks "Are we really to believe there was no "new" technology diffusion between 1950 and 1990? I thought this was the US's Golden Age of Growth. (Should we include penicillin, nuclear power, or desktop computers on this chart?)".  There is such data out there, but it can be obscure.

As it happens, thanks to my work with bio-era, I am familiar with a 1997 Forbes piece by Peter Brimelow that explores what he called "The Silent Boom".  Have a look at the text (the accompanying chart is not available online), but basically the idea is that keeping track of the cost of a technology is less informative than tracking actual market penetration, which is sometimes called "technological diffusion".  The time between the introduction of a technology and widespread adoption is a "diffusion lag".  The interesting thing for me is that there appears to be a wide distribution of diffusion lags; that is, some technologies hit the market fast (which can still mean decades) while others can take many more decades.  There really isn't enough data to say anything concrete about how diffusion lags are changing over time, but I am willing to speculate that not only are the lags getting shorter (more rapid market adoption), but that the pace of adoption is getting faster (steeper slope).  Here is the version of the chart I use in my talks, followed by a snippet of related text from my book (I am sure there is a better data set out there, but I have not yet stumbled over it):

carlson_silent_boom.png
And from pg 60 of Biology is Technology:

Diffusion lags in acceptance appear frequently in the adoption of new technologies over the last several centuries. After the demonstration of electric motors, it took nearly two decades for penetration in U.S. manufacturing to reach 5% and another two decades to reach 50%. The time scale for market penetration is often decades[6] (see Figure 5.6). There is, however, anecdotal evidence that adoption of technologies may be speeding up; "Prices for electricity and motor vehicles fell tenfold over approximately seven decades following their introduction. Prices for computers have fallen nearly twice as rapidly, declining ten-fold over 35 years."[4]

Regardless of the time scale, technologies that offer fundamentally new ways of providing services or goods tend to crop up within contexts set by preceding revolutions. The interactions between the new and old can create unexpected dynamics, a topic I will return to in the final chapter. More directly relevant here is that looking at any given technology may not give sufficient clues as to the likely rate of market penetration. For example, while the VCR was invented in 1952, adoption remained minimal for several decades. Then, in the late 1970's the percentage of ownership soared. The key underlying change was not that consumers suddenly decided to spend more time in front of the television, but rather that a key component of VCRs, integrated circuits, themselves only a few decades old at the time, started falling spectacularly in price. That same price dynamic has helped push the role of integrated circuits into the background of our perception, and the technology now serves as a foundation for other "independent" technologies ranging from mobile phones, to computers, to media devices.
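To put numbers on "nearly twice as rapidly," here is the average annual price decline implied by each example in the excerpt:

```python
# Average annual price decline implied by the examples in the excerpt:
# a 10x drop over ~70 years (electricity, motor vehicles) versus a 10x drop
# over ~35 years (computers).
def annual_decline(fold_drop, years):
    """Average annual fractional price decline for a fold_drop spread over `years`."""
    return 1 - (1 / fold_drop) ** (1 / years)

print(f"10x over 70 years -> {annual_decline(10, 70):.1%} per year")  # ~3.2%
print(f"10x over 35 years -> {annual_decline(10, 35):.1%} per year")  # ~6.4%
```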

iGEM 2011: First Thoughts

Congratulations to the 2011 University of Washington iGEM team for being the first US team ever to win the Grand Prize.  The team also shared top honors for Best Poster (with Imperial College London) and for Best Food/Energy Project (with Yale).  The team also had (in my opinion) the clearest, and perhaps best overall, wiki describing the project that I have seen in 5 years as an iGEM judge.  I only have a few minutes in the airport to post this, but I will get back to it later in the week.

The UW team had an embarrassment of riches this year.  One of the team's projects demonstrated production of both odd and even chain alkanes in E. coli directly from sugar.  The odd-chain work reproduces the efforts of a Science paper published by LS9 last year, but the team also added an enzyme from B. subtilis to the pathway that builds alkanes starting from a 3-carbon seed rather than the normal 2-carbon seed in coli.  This latter step allowed them to make even-chain alkanes via a synthetic biological pathway, which has not been reported elsewhere.  So they wound up directly making diesel fuel from sugar.  The yields aren't all there yet to roll out this sort of thing more widely, but it's not so bad for a summer project.

And that's not all.

The other main project was an effort to produce an enzyme to digest gluten.  There is one such enzyme in clinical trials at the moment, intended for use as a therapeutic for gluten intolerance, which afflicts about 1% of the population.  However, that enzyme is not thermostable and has an optimum pH of 7.

The UW team found an enzyme in the literature that was not known to digest gluten, but which works at pH 4 (close to the human stomach) and is from a thermophilic organism.  They used Foldit to redesign the enzyme to process gluten, and then built a library of about 100 variants of that design.  One of those variants wound up working ~800 times better than the enzyme that is currently in clinical trials.  And the team thinks they can do even better by combining some of the mutants from the library.

Nice work.

I could go on and on about the competition this year.  The teams are all clearly working at a new level.  I recall that a couple of years ago at iGEM Drew Endy asked me, somewhat out of frustration, "Is this it?  Is this all there is?"  The answer: No.  There is a hell of a lot more.  And the students are just getting started.

Plenty of other teams deserve attention in this space, in particular Imperial College London, the runner up.  They built a system (called Auxin) in E. coli to encourage plant root growth, with the aim of stopping desertification.  And their project was an extremely good example of design, from the technical side through to conversations with customers (industry) and other stakeholders (Greenpeace) about what deployment would really be like.

More here later in the week.  Gotta run for the plane.

Biodesic 2011 Bioeconomy Update: U.S. Revenues from Genetically Modified Systems Now $300 Billion, or Greater than 2% of GDP.

Biodesic has released a short Technical Report on the size of the U.S. bioeconomy.  The Biodesic 2011 Bioeconomy Update (PDF) walks the reader through changes in revenues from GM crops, biologics, and industrial biotech.  The Technical Report updates the figures and analysis published in Biology is Technology: The Promise, Peril, and New Business of Engineering Life.

I continue to be surprised by the misreporting in major publications of revenues from GM crops.  Based on USDA statistics and average crop prices, the three main GM crops in the U.S. (corn, soy, and cotton) brought in farm-scale revenues of $100 billion in 2010.  As I noted in 2009 in Nature Biotechnology, many news outlets continue to report the $5.5 billion in revenues from U.S. GM seed sales as total sector revenues.

With U.S. biologics revenues of $75 billion, and industrial biotech revenues of $115 billion, total U.S. 2010 revenues from genetically modified systems were $300 billion, or the equivalent of more than 2% of GDP. 

Globally, biotech investment continues to accelerate, as do revenues (see table below).  China and India have made domestic biotech a priority for producing jobs and economic growth and as an independent source of fuels, food, and materials.  Malaysia has recently reported biotech constituted 2.5% of its 2010 GDP, up from zero in 2005.  Pakistan's biotech economy presently consists entirely of GM cotton, which the USDA estimates to now be 100% of the annual crop, and which until 2010 was entirely illegal.

Read more in the Biodesic 2011 Bioeconomy Update.

Country          2010 Biotech Revenues (% of GDP)    2010 Est. Growth    2020 Target Biotech Revenues (% of GDP)
Malaysia         2.5%                                25%                 10%
China            2.5%                                20%                 5-8%
United States    >2%                                 10-15%              NA
India            0.24-0.40%                          20%                 1.6% (2015)
Pakistan         1.6%                                <5%                 NA
Europe           <1.0%                               5%                  NA

Table 1.  Biotech Revenues as Share of GDP.  Source: Biodesic 2011 Bioeconomy Update.


Staying Sober about Science

The latest issue of The Hastings Center Report carries an essay of mine, "Staying Sober about Science" (free access after registration), about my thoughts on New Directions: The Ethics of Synthetic Biology and Emerging Technologies (PDF) from The Presidential Commission for the Study of Bioethical Issues.

Here is the first paragraph:

Biology, we are frequently told, is the science of the twenty-first century. Authority informs us that moving genes from one organism to another will provide new drugs, extend both the quantity and quality of life, and feed and fuel the world while reducing water consumption and greenhouse gas emissions. Authority also informs that novel genes will escape from genetically modified crops, thereby leading to herbicide-resistant weeds; that genetically modified crops are an evil privatization of the gene pool that will with certainty lead to the economic ruin of small farmers around the world; and that economic growth derived from biological technologies will cause more harm than good. In other words, we are told that biological technologies will provide benefits and will come with costs--with tales of both costs and benefits occasionally inflated--like every other technology humans have developed and deployed over all of recorded history.

And here are a couple of other selected bits:

Overall, in my opinion, the report is well considered. One must commend President Obama for showing leadership in so rapidly addressing what is seen in some quarters as a highly contentious issue. However, as noted by the commission itself, much of the hubbub is due to hype by both the press and certain parties interested in amplifying the importance of the Venter Institute's accomplishments. Certain scientists want to drive a stake into the heart of vitalism, and perhaps to undermine religious positions concerning the origin of life, while "civil society" groups stoke fears about Frankenstein and want a moratorium on research in synthetic biology. Notably, even when invited to comment by the commission, religious groups had little to say on the matter.

The commission avoided the trap of proscribing from on high the future course of a technology still emerging from the muck. Yet I cannot help the feeling that the report implicitly assumes that the technology can be guided or somehow controlled, as does most of the public discourse on synthetic biology. The broader history of technology, and of its regulation or restriction, suggests that directing its development would be no easy task.8 Often technologies that are encouraged and supported are also stunted, while technologies that face restriction or prohibition become widespread and indispensable.

...The commission's stance favors continued research in synthetic biology precisely because the threats of enormous societal and economic costs are vague and unsubstantiated. Moreover, there are practical implications of continued research that are critical to preparing for future challenges. The commission notes that "undue restriction may not only inhibit the distribution of new benefits, but it may also be counterproductive to security and safety by preventing researchers from developing effective safeguards."12 Continued pursuit of knowledge and capability is critical to our physical and economic security, an argument I have been attempting to inject into the conversation in Washington, D.C., for a decade. The commission firmly embraced a concept woven into the founding fabric of the United States. In the inaugural State of the Union Address in 1790, George Washington told Congress "there is nothing which can better deserve your patronage than the promotion of science and literature. Knowledge is in every country the surest basis of publick happiness."13

The pursuit of knowledge is every bit as important a foundation of the republic as explicit acknowledgment of the unalienable rights of life, liberty, and the pursuit of happiness. Science, literature, art, and technology have played obvious roles in the cultural, economic, and political development of the United States. More broadly, science and engineering are inextricably linked with human progress from a history of living in dirt, disease, and hunger to . . . today. One must of course acknowledge that today's world is imperfect; dirt, disease, and hunger remain part of the human experience. But these ills will always be part of the human experience. Overall, the pursuit of knowledge has vastly improved the human condition. Without scientific inquiry, technological development, and the economic incentive to refine innovations into useful and desirable products, we would still be scrabbling in the dirt, beset by countless diseases, often hungry, slowly losing our teeth.

There's more here.

References:

8. R. Carlson, Biology Is Technology: The Promise, Peril, and New Business of Engineering Life (Cambridge, Mass.: Harvard University Press, 2010).

12. Presidential Commission for the Study of Bioethical Issues, New Directions, 5.

13. G. Washington, "The First State of the Union Address," January 8, 1790, http://ahp.gatech.edu/first_state_union_1790.html.