"Stem-Cell Craze Spreads in Russia"

A tidbit from the AP today about quasi-legal stem cell treatments in Russia (via Wired News), "Stem-Cell Craze Spreads in Russia".  Evidently, injections putatively consisting of adult and/or embryonic stem cells are being used as treatments for everything from cosmetic adjustments to MS.  The treatments are totally unregulated and at best skirt the edge of what is legal in Russia.  It is unclear where the cells are coming from, or whether those performing the injections have the skills and equipment to isolate stem cells in the first place.  No studies are being performed to follow the patients, or to find out whether the treatments are causing harm.

This demonstrates the lengths to which people are willing to go to take advantage of new, unproven technologies.  It also suggests the extent of body modification we can expect when real treatments using stem cells are demonstrated, particularly cells that have been genetically modified or coaxed to differentiate into particular tissue types.  Feather goatees will be passé.

The Thousand Dollar Genome

I have once again been hearing noises about the "thousand dollar genome" (TDG).  That is, a human genome read de novo for USD 1000 or less.  Here (REVOLUTIONARY GENOME SEQUENCING TECHNOLOGIES -- THE $1000 GENOME), for example, is a request for proposals from the National Human Genome Research Institute to develop technology that would enable the TDG.

Based on my early efforts to quantify how the productivity and cost of sequencing were changing, Stewart Brand asked me back in 2002 when we would get the TDG.

Here is the plot I generated in response (click on the figure thumbnail for a full-sized version).

The cost hasn't changed dramatically recently, and at the current pace we won't get the TDG until sometime after 2020.  With 3 billion (3x10^9) bases in the human genome, we need to hit USD 0.3x10^-6 per base (that is, 0.3 microbucks, or 300 nanodollars, per base -- nanoeconomics, anyone?) to reach the Thousand Dollar Genome.  However, the numbers on the plot are primarily based on instruments that use slab gel electrophoresis and capillary electrophoresis.  Thus, as new technologies emerge, we could very well get to the TDG much more rapidly.
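
To make the target arithmetic explicit, here is a minimal Python sketch using only the genome size and price goal stated above:

    # Required cost per base for the Thousand Dollar Genome (TDG).
    genome_size_bases = 3e9    # ~3 billion bases in the human genome
    target_price_usd = 1000.0  # the "thousand dollar" goal

    cost_per_base = target_price_usd / genome_size_bases
    print(f"Required cost: {cost_per_base:.1e} USD per base")
    # -> 3.3e-07 USD per base, i.e. ~0.3 microdollars (300 nanodollars)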

Ch 5 of Learning to Fly is Online

Here is part of "The Second Coming of Synthetic Biology", the fifth chapter of my book, Learning to Fly: the past, present, and future of Biological Technology.  More at: www.BiologyIsTechnology.com.

Chapter 5.  The Second Coming of Synthetic Biology

"I must tell you that I can prepare urea without requiring a kidney of an animal, either man or dog.”   With these words, in 1828 Friedrich Wohler announced he had irreversibly changed the world.  In a letter to his former teacher Joens Jacob Berzelius, Wohler wrote that he had witnessed, “The great tragedy of science, the slaying of a beautiful hypothesis by an ugly fact.”  The beautiful idea to which he referred was vitalism, the notion that organic matter, exemplified in this case by urea, was animated and created by a vital force and that it could not be synthesized from inorganic components.  The ugly fact was a dish of urea crystals on his laboratory bench, produced by heating inorganic salts.  Thus was born the field of synthetic organic chemistry.

Around the dawn of the 19th century, chemistry was in revolution right along with the rest of the western world.  The study of chemical transformation, then still known as alchemy, was undergoing systematic quantification.  Rather than rely on vague and mysterious incantations, scientists such as Antoine Lavoisier wanted to create what historian of science and technology Bruce Hevly calls an “objective vocabulary” for chemistry.  Through careful measurement, a set of clear rules governing the synthesis of inorganic, non-living materials gradually emerged.

In contrast, in the early 1800s the study of organic molecules was primarily concerned with understanding how molecules already in existence were put together.  It was a study of chemical compositions and reactions.  Unlike the broader field of chemistry taking shape from alchemy, making new organic things was of lesser concern because it was thought by many that organic molecules were beyond synthesis.  Then, in 1828, Wöhler synthesized urea.  Suddenly, with one experiment, the way scientists did organic chemistry changed.  The ability to assemble organic molecules from inorganic components altered the way people viewed a large fraction of the natural world because they could conceive of building much of it from simpler pieces.  Building something from scratch, or modifying an existing system, requires understanding more details about the system than simply looking at it, poking it, and describing how it behaves.  This new approach to chemistry helped open the door to the world we live in today.  Products of synthetic organic chemistry dominate our environment, and the design of those products is possible only because understanding the process of novel assembly revealed new principles.

It was this step of moving to Synthetic Chemistry, and then to an engineering of chemistry, that radically changed the way people understood chemistry.  Chemists had to learn rules that weren't apparent before.  In the same way that Chemical Engineering changed our understanding of nature, as we begin engineering biological systems we will learn considerably more about the way biological pieces work together.  Challenges will arise that aren't obvious just from watching things happen.  With time, we will understand and address those challenges, and our use of biology will change dramatically in the process.  The analogy at this point should be clear: we are well on our way to developing Synthetic Biology.

Before going further, it is worth noting that this is not the original incarnation of the phrase "synthetic biology".  Whatever the reception this time around, the first time it was a flop.  In her history of the modern science of biology, Making Sense of Life, Evelyn Fox Keller recounts efforts at the turn of the 20th Century to discover the secret of life through construction of artificial, and synthetic, living systems; "To many authors writing in the early part of the [20th] century, the [path] seemed obvious: the question of what life is was to be answered not by induction but by production, not by analysis but by synthesis."(Keller, p.18)  This offshoot of experimental biology reached its pinnacle, or nadir, depending on your point of view, in attempts by Stéphane Leduc to assemble purely physical and chemical systems that demonstrated behaviors reminiscent of biology.  As part of his program to demonstrate "the essential character of the living being"(ibid, p.28) at both the sub-cellular and cellular levels, Leduc constructed chemical systems that he claimed displayed mitotic division, growth, development, and even cellular motility.  He described these patterns and forms in terms of the well-understood physical phenomena of diffusion and osmotic pressure.  It is important to note that these efforts to synthesize life-like forms relied as much on experiment as upon theory developed to describe the relevant physics and chemistry.  That is, this was a specific program to use physical principles to explain biological phenomena.  These efforts were described in a review paper at the time as "La biologie synthétique"(ibid, p.31-32).

While the initial reception to this work was somewhat favorable, Leduc's grandiose claims about the implications of his work, and a growing general appreciation for complicated biological mechanisms determined through experiments with living systems, led to something of a backlash against the approach of understanding biology through construction.  By 1913, one reviewer wrote, "The interpretations of M. Leduc are so fantastic…that it is impossible to take them seriously"(ibid, p.31).  Keller chronicles this episode within the broader historical debate over the role of construction and theory in biology.  The folks in the synthetic camp, along with those making related efforts to build mathematical descriptions of biology, particularly in the area of growth and development, were poorly regarded by their peers.  Perhaps inspired by the contemporaneous advances in physics, it seems that the mathematical biologists and the synthetic biologists of the day pushed the interpretation of their work further than was warranted by available data.

In response to what he viewed as theory run rampant, Charles Davenport suggested in 1934 that, "What we require at the present time is more measurement and less theory…There is an unfortunate confusion at the present time between quantitative biology and bio-mathematics…Until quantitative measurement has provided us with more facts of biology, I prefer the former science to the latter"(ibid, p.86).  I think these remarks are still valid today.  Leduc, and the approach he espoused, failed because real biological parts are more complex, and obey different rules, than his simple chemical systems, however beautiful those systems were.  And it is quite clear that vast forests have been felled to publish theory papers that have little to do with the biology we see out the window.  But theory, drawn from physics, chemistry, and engineering, does have a role to play in describing biological systems.  Resistance to the tools of theory has been, in part, cultural.  There has always been a certain tension in biology over the utility of mathematical and physical approaches to the subject:

To put it simply, one could say that biologists do not accept the Kantian view of mathematics (or, rather, mathematization) as the measure of a true science; indeed, they have often actively and vociferously repudiated any such criterion.  Nor have practicing biologists shown much enthusiasm for the use of mathematics as a heuristic guide in their studies of biological problems.(Keller, p. 81)

Fortunately, this appears to be changing.  Mathematical approaches are flourishing in biology, particularly in the interpretation of large data sets produced by genomic and proteomic studies.  Physicists and engineers are making fundamental contributions to the quantitative understanding of how individual proteins work in their biological context.  But I think it is important to acknowledge that not all biologists think a synthetic, bottom-up approach will yield truths applicable to complex systems that have evolved over billions of years.  Such concerns are not without merit because, as the quotation from Charles Davenport suggests, biology has traditionally had more success when driven by good data rather than theory.  The challenge today is to build quantitatively predictive design tools based on the measured device physics of real biological parts, and to implement designs within organisms in ways that work in the real world...

More at: www.BiologyIsTechnology.com.

WebTV in Paradise

I am stuck trying to post on a decrepit WebTV system from a hotel in Kauai.  Don't buy one of these things unless it is your last option for communicating with the world.  Forget trying to include links in a post.  Regarding Oliver Morton's recent op-ed in the Times, "Biology's New Forbidden Fruit" (11 Feb, 2005): there was an editing mistake that gave me sole credit for the recent Nature Methods paper on the use of Tadpoles for sensitive detection.  It seems this misprint will be corrected in a forthcoming issue.  Why oh why did I leave my Powerbook at home?

UPDATE (18.02.05, back in Seattle.  Brrr.):  Here is the column (NYTimes in exchange for your first born and all that), and here is a free copy at freerepublic.com, a site that I wouldn't ordinarily advertise, but they chose to violate copyright and take on the Times, which means I didn't have to.

UPDATE: Here is the correction in the Times, with an additional very odd addendum that seems to overplay the cautionary aspects of Oliver's op-ed.  Editors -- can't live with 'em, and can't live without 'em -- hmmm...

A Few Thoughts on the Tian et al. Nature paper and Nicholas Wade's NY Times article

In the 23 December 2004 issue of Nature, Jingdong Tian et al. describe a new method for "Accurate multiplex gene synthesis from programmable DNA microchips."  The name most frequently associated with the paper is that of George Church, a professor at Harvard Medical School.

The authors combine microfluidics, biochemistry, and molecular biology to produce a widget capable of rapid synthesis of long oligonucleotides (oligos).  The paper reports an integration of 1) a new way to elute completed oligos from arrays; 2) on-chip amplification of oligos; 3) error correction via "strict hybridization" conditions to remove mistakes; and 4) microfluidic multiplexing, to produce 14.5 kilobase (kb) fragments of DNA.  Slipped in at the end of the paper is the claim that they have already used this technology to successfully fabricate 95-382 kb oligos, assembling them into megabase (Mb) length sequences.  Although it may receive less press, when the paper describing the latter advance comes out it will mark a significant milestone in the human ability to manipulate biological systems.  Organismal length sequences will be well within reach.

Now for the press coverage of the paper.  Mr. Wade, in the 12 January 2005 edition of The New York Times, describes it thus:

Researchers have made an unexpectedly sudden advance in synthesizing long molecules of DNA, bringing them closer to the goal of redesigning genes and programming cells to make pharmaceuticals.

But the success also puts within reach the manufacture of small genomes, such as those of viruses and perhaps certain bacteria. Some biologists fear that the technique might be used to make the genome of the smallpox virus, one of the few pathogens that cannot easily be collected from the wild.

With all respect to George Church and his colleagues, and without diminishing the significance of their technical achievement, I have to say this actually isn't so much of a surprise.  It is true that I have been following this, and that I saw the chip on Erdogan Gulari's desk last winter.  In other words, I have had time to get over it.  But this sort of thing has been in the air for a while, and Drew Endy and I talked about something similar many years ago at tMSI.  I am certain we were not the first to do so.

More interesting is the reduction in cost per base of the synthesis, which Professor Gulari puts at about a penny a base for the long oligos.  This is news, and the cost falls completely off the curves I published in 2002.  The impact of the paper will only be felt when the technology becomes widely available, which is at least a couple of years out.  Unless I misunderstand the market and the state of the technology, the only people with access to synthesis at this scale and cost are the authors of the paper and their pals in academia.
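
For a rough sense of scale, here is a minimal sketch of what a penny a base implies for the 14.5 kb construct described above.  The $0.01 figure is Professor Gulari's estimate quoted earlier; the $1/base comparison point is my assumption for conventional column-based synthesis, not a number from the paper:

    # Back-of-the-envelope synthesis cost for the 14.5 kb Tian et al. construct.
    construct_length_bases = 14_500  # the 14.5 kb construct discussed above
    chip_cost_per_base = 0.01        # Gulari's ~penny-a-base estimate
    column_cost_per_base = 1.00      # assumed conventional synthesis price

    print(f"Chip-based synthesis: ${construct_length_bases * chip_cost_per_base:,.0f}")
    print(f"Conventional (assumed): ${construct_length_bases * column_cost_per_base:,.0f}")
    # -> $145 versus $14,500 for the same construct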

With respect to suggestions that oligo synthesizers should be regulated, my views are well known at this point.  In the NY Times piece, Professor Church suggests that registration of instruments could go a long way towards increasing security.  More information is, of course, better.  But we have too much experience forcing people "underground" when the things they want to pursue are restricted or made illegal.  I suspect we will be much better off encouraging an open community of people unafraid to talk about what they are up to in their garages.  Finally, even if instrument makers are willing to go along with registration, there will be a big hole in the registry due to the aftermarket, and I don't know how to enforce registration of homemade DNA synthesizers.  There are arguments that no one will want to build a synthesizer, or to play with what it enables, but I think the history of tinkering is a fairly decisive counterexample.  So the real question is: how do you stop people from playing?  I don't think you can.

As an advance in the technology, far more interesting to me is a paper by Peter Carr et al., from the Jacobson group at MIT, "Protein-mediated error correction for de novo DNA synthesis".  They use the DNA mismatch-binding protein MutS to identify mistakes, which are then removed from the synthesis pool.  One round of this procedure improves the error rate to ~1 in 4000 bases, which is a factor of three better than the Tian et al. work discussed above.  A second round of error correction reduces the error rate to ~1 in 10 kb.  This rate is so low that a single round of synthesis and cloning should be sufficient to produce multi-gene cassettes suitable for use in complicated genetic circuits.  The combination of the protein-mediated correction and the Tian et al. work would be impressive indeed.  Since George Church is thanked in the acknowledgments of the Carr paper, no doubt all the right people are considering the possibilities.
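
To see why these error rates matter at gene length, here is a minimal sketch of the expected fraction of error-free products, assuming errors are independent and uniformly distributed (a simplification).  The rates are the ones quoted above; the 1 kb gene length is chosen for illustration:

    # Fraction of full-length synthesis products expected to be error-free,
    # assuming independent errors at a fixed per-base rate.
    def error_free_fraction(length_bases: int, per_base_error_rate: float) -> float:
        return (1.0 - per_base_error_rate) ** length_bases

    gene_length = 1_000  # a typical ~1 kb gene, chosen for illustration
    for label, rate in [("one round of MutS (~1/4000)", 1 / 4000),
                        ("two rounds of MutS (~1/10000)", 1 / 10_000)]:
        print(f"{label}: {error_free_fraction(gene_length, rate):.0%} error-free")
    # -> roughly 78% and 90% of 1 kb products error-free, respectively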

Tadpoles Unleashed

The first paper describing sensitive, parallel quantitation of "just about anything" using Tadpoles is out.  "Using protein-DNA chimeras to detect and count small numbers of molecules" (abstract) is now available at Nature Methods.  The News and Views piece (subscription required), by Garry Nolan, a microbiology and immunology professor at Stanford, describes the paper thus:

What is important about the work is that [it] went well beyond the norm in providing proof of concept for a detection system. The modularity of [the] approach, the ease with which the recognition domains can be created and simply coupled to a DNA marker for multiplexed measurements, and the extraordinary sensitivity of the approach makes this an appealing system for researchers wanting a standardized high-throughput, and accurate, detection system for...just about anything.

It is gratifying to finally see this technology out in the world.  Ian Burbulis, in particular, did a tremendous job in grinding out the details of assembling the detector molecules and of making the assays work.  When Ian and I conceived this technology, the point was to enable multiplexed detection of proteins and other analytes from single cells.  While we have more work to do to implement the assay at the single cell level, the paper demonstrates we are well on our way.

Nolan also notes the commercial potential of the technology: "The authors [demonstrated] a more real-world, sensitive test of an important bacterial pathogen in whole blood sera.  I can already see the reagent vendors scrambling for their phones."  As one of the two inventors (here is the patent application), I will take this opportunity to blog about the tension between protecting inventions, to enable commercialization, and the philosophy and practice of Open Source.  I first discussed the potential of widespread access to biological technology in "Open Source Biology And Its Impact on Industry", published in IEEE Spectrum in 2001.  More on this in an upcoming post.

"Carlson Curves" and Synthetic Biology

(UPDATE, 1 September 06: Here is a note about the recent Synthetic Biology story in The Economist.)

(UPDATE, 20 Feb 06: If you came here from Paul Boutin's story "Biowar for Dummies", I've noted a few corrections HERE.)

Oliver Morton's Wired Magazine article about Synthetic Biology is here.  If you are looking for the "Carlson Curves", "The Pace and Proliferation of Biological Technologies" is published in the journal Biosecurity and Bioterrorism.  The paper is available in html at kurzweilai.net.

A note on the so-called "Carlson Curves" (Oliver Morton's phrase, not mine): The plots were meant to provide a sense of how changes in technology are bringing about improvements in productivity in the lab, rather than to provide a quantitative prediction of the future.  I am not suggesting there will be a "Moore's Law" for biological technologies.  Although it may be possible to extract doubling rates for some aspects of this technology, I don't know whether that analysis is very interesting.  I prefer to keep it simple.  As I explain in the paper, the time scale of changes in transistor density is set by planning and finance considerations for multi-billion dollar integrated circuit fabs.  That doubling time has a significant influence on many billions of dollars of investment.  Biology, on the other hand, is cheap, and change should come much faster.  Money should be less and less of an issue as time goes on, and my guess is that those curves provide a lower bound on changes in productivity.
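
For anyone who does want to extract a doubling time from this kind of productivity data, here is a minimal sketch of the standard approach, a least-squares fit on a log scale.  The data points below are placeholders for illustration, not values from the paper:

    import numpy as np

    # Placeholder (year, productivity) points, purely illustrative --
    # not data from the paper.
    years = np.array([1990.0, 1994.0, 1998.0, 2002.0])
    productivity = np.array([2.5e4, 2.0e5, 1.6e6, 1.3e7])  # e.g. bases/person/day

    # Fit log2(productivity) = a*year + b; the doubling time is 1/a years.
    a, b = np.polyfit(years, np.log2(productivity), 1)
    print(f"Doubling time: {1.0 / a:.2f} years")
    # -> ~1.33 years for these placeholder points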

I will try to have something tomorrow about George Church and Co.'s "unexpected improvement" in DNA synthesis capacity, as well as some comments about Nicholas Wade's New York Times story.