Synthetic Biology in China

Everyone interested in the future of biology should be paying attention to what Chinese students are up to.  The latest post on the Synthetic Biology discussion list, maintained by the MIT group, announces an upcoming workshop at Tianjin University on the International Genetically Engineered Machine (iGEM) competition.  I'll bet there will be an enormous demand from students to create teams for iGEM.  The resulting profusion of skills and new parts, which will presumably become Biobricks, is only going to make the future more interesting.  Interesting, as in, "May you live in interesting times," with all the promise and peril that implies.

Water Arithmetic for Biofuels

Some months ago, I set out to try to make a back-of-the-envelope calculation of how much water is available in the U.S. for growing crops destined for processing into biofuels.  Unfortunately, the more I learn, the larger the envelope seems to get.

My interest was piqued at Synthetic Biology 2.0, where Steve Chu, Nobel Laureate and Director of LBL, suggested there was plenty of water available for growing rain-fed crops on marginal agricultural land.  (I've written about this before: "The Impact of Biofuel Production on Water Supplies", and "Live from Synthetic Biology 2.0".)  I have spent most of my life living in Western states, and over the years the snowpack has gotten smaller, summer water shortages more frequent, and acrimony over water issues all the more intense.  So I am somewhat skeptical of the notion that we can somehow conjure up sufficient resources to simply farm our way into energy independence.

Because it seems very hard to sort out just how much water is available from rainfall, or from aquifers, I am going to punt on the calculation.  Perhaps someone else out there can figure out an easy way to make an estimate.  The simplest way to judge how much water can be used for growing biofuels may be to look at the broadest possible level and note how much effort Western states are putting into shoring up water rights, how many are building new pipelines, and how many are putting desalination plants into operation.  The New York Times has a nice story today on all of this, entitled "An Arid West No Longer Waits for Rain":

Some $2.5 billion in water projects are planned or under way in four states, the biggest expansion in the West's quest for water in decades.

..."What you are hearing about global warming, explosive growth -- combine with a real push to set aside extra water for environmental purpose -- means you got a perfect situation for a major tug-of-war contest," said Sid Wilson, the general manager of the Central Arizona Project, which brings Colorado River water to the Phoenix area.

New scientific evidence suggests that periodic long, severe droughts have become the norm in the Colorado River basin, undermining calculations of how much water the river can be expected to provide and intensifying pressures to find new solutions or sources.

..."The Western mountain states are by far more vulnerable to the kinds of change we've been talking about compared to the rest of the country, with the New England states coming in a relatively distant second," said Michael Dettinger, a research hydrologist at the United States Geological Survey who studies the relationships between water and climate.

Mr. Dettinger said higher temperatures had pushed the spring snowmelt and runoff to about 10 days earlier on average than in the past. Higher temperatures would mean more rain falling rather than snow, compounding issues of water storage and potentially affecting flooding.

Changes in rainfall are having very real consequences in the way state and regional planners think about how water is distributed in the West.  States are engaged in legal actions against each other to prevent new pipelines that might redistribute what water there is, and cities are paying for water now legally owned by farmers:

The great dams and reservoirs that were envisioned beginning in the 1800s were conceived with farmers in mind, and farmers still take about 90 percent of the Colorado River's flow. More and more,  [Robert W. Johnson, the Bureau of Reclamation commissioner], said, the cities will need that water.

An agreement reached a few years ago between farmers and the Metropolitan Water District of Southern California, the chief supplier of water to that region, is one model. Under the terms of the agreement, farmers would let their fields lie fallow and send water to urban areas in exchange for money to cover the crop losses.

"I definitely see that as the future," Mr. Johnson said.

Note that this means there will be less water available for crops presently grown as food.  Yet another complicating factor in figuring out how much water will be available for growing biofuels.  All across the globe, the demand for food crops has increased dramatically as corn is used to make ethanol for fuel.  This has produced mass protests in Mexico and prompted the Chinese government to curtail ethanol production; see, for example, "Biofuels eat into China's food stocks" in the 21 December 2006 Asia Times.

The story was more explicitly told in Red Herring a few months ago, "Corn Again: 3 Reasons Ethanol Will Be Back":

In more bad news, China on Wednesday halted the expansion of its ethanol industry, blaming it--and other industrial corn uses--for soaring grain prices, according to Xinhua, China's official news agency.

Here is a recent column from Bloomberg on water and biofuels, by Andy Mukherjee.  He focuses on the trade-offs and odd cost structures used to encourage biofuel production in China and India.  The piece has some interesting numbers and is basically a tale of woe.

Oddly, near the end of the column, Mukherjee throws down the statement that "The U.S. has plenty of water; the world as a whole doesn't."  Um, hasn't he heard the phrase "water wars"?  We have those today, every day, in the Western U.S., and they are only getting worse.  Food vs. electricity, waterborne commerce vs. fish?  Most of the fighting is done with words, but bullets and bulldozers come into play not infrequently.  The only place on the west coast really flush with water is Los Angeles -- witness all the green lawns during the desert summer -- but that's because they just steal it all from somewhere else.

The year-end issue of New Scientist carried an interesting centerfold entitled "The State of the Planet", which, alas, doesn't seem to be available online.  There is a small map of groundwater withdrawal by country.  The U.S. withdraws somewhere between 251 and 500 cubic meters (one cubic meter is 1,000 liters) per person per year, India between 101 and 250 cubic meters, and China less than 100 cubic meters.  Europe, Brazil, Russia, and Canada all fall between 100 and 250 cubic meters per person per year.  Interestingly, only the U.S., China, and India withdraw a total annual amount greater than what is recharged naturally.
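Just to put those per-capita figures in perspective, here is a rough sketch, in Python, of what they imply for total national withdrawals.  The population numbers are my own round figures for circa 2007, not something taken from the New Scientist map, so treat this strictly as back-of-the-envelope arithmetic.

    # Rough national totals implied by the New Scientist per-capita withdrawal ranges.
    # Populations are my own approximate 2007 figures (an assumption, not from the map).
    populations = {
        "U.S.":  300e6,
        "China": 1.3e9,
        "India": 1.1e9,
    }
    withdrawal_per_capita = {   # cubic meters per person per year, from the map
        "U.S.":  (251, 500),
        "China": (0, 100),
        "India": (101, 250),
    }

    for country, pop in populations.items():
        low, high = withdrawal_per_capita[country]
        low_km3, high_km3 = low * pop / 1e9, high * pop / 1e9   # 1 km^3 = 1e9 m^3
        print(f"{country}: roughly {low_km3:.0f}-{high_km3:.0f} cubic km of groundwater per year")

Sheer population is what pushes those three national totals so high, which helps explain why they are the ones outrunning natural recharge.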

Thus we are already operating at a significant, perhaps severe, water deficit, and I just don't see how we can avoid pushing further into negative territory by using yet more water for growing plants used as fuel.

Below are a few resources that may be of use in sorting out how much water is actually available for growing biofuels.

Here is a 1976 report suggesting the total annual precipitation in the US is 5759 cubic kilometers, which is 5759 billion cubic meters, and here is a page from Purdue University stating that:

The U.S. receives enough annual precipitation to cover the entire country to a depth of 30 inches...  Most of this precipitation returns to the water cycle through evapotranspiration. Of the 30 inches of rainfall, 21 inches returns to the atmosphere in this manner. Water loss by plants, the transpiration portion of evapotranspiration, is most significant. One tree transpires approximately 50 gallons of water a day. Approximately 8.9 inches of annual precipitation flows over the land in rivers and returns to the ocean. Only 0.1 of an inch of precipitation infiltrates into the ground water zone by gravity percolation.

A recent OECD report puts US water consumption at ~518 billion cubic meters annually, or ~1730 cubic meters per capita annually.
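For what it's worth, those numbers roughly hang together.  Here is a minimal sketch of the arithmetic in Python; the land area of the lower 48 (about 8 million square kilometers) is my own approximation, while the depths and the consumption figure come from the sources above.

    # Back-of-the-envelope check on the precipitation and consumption figures above.
    # The ~8 million km^2 land area for the contiguous U.S. is my own approximation.
    INCH_TO_M = 0.0254
    area_m2 = 8.0e6 * 1e6                    # square meters

    precip_m3 = 30 * INCH_TO_M * area_m2     # Purdue: ~30 inches of precipitation per year
    runoff_m3 = 8.9 * INCH_TO_M * area_m2    # Purdue: ~8.9 inches returns to the ocean as runoff
    consumption_m3 = 518e9                   # OECD: ~518 billion cubic meters consumed annually

    print(f"Precipitation: ~{precip_m3 / 1e9:.0f} billion m^3/yr")        # ~6,100; cf. 5,759 above
    print(f"Runoff:        ~{runoff_m3 / 1e9:.0f} billion m^3/yr")        # ~1,800
    print(f"Consumption:   ~{consumption_m3 / runoff_m3:.0%} of runoff")  # ~29%

So current consumption is already a sizable fraction of the water that actually runs off each year, which is the portion that is even arguably available for new uses.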

If you prefer thinking in old-fashioned gallons, here is a report from the EPA entitled, "How We Use Water in These United States."

Here is the USGS Groundwater Atlas of the United States, and Estimated Use of Water in the US in 2000.
 

Enzymatic Production of O-Type Blood

Everyone knows that blood for transfusions is always in short supply.  Qiyong Liu and colleagues report in Nature Biotechnology the conversion of type A and B blood to type O.  Liu et al. screened 2500 fungal and bacterial samples to find exoglycosidases that efficiently cleave carbohydrate groups from donor erythrocytes, thereby providing a route to "universal red blood cells".

In a short news piece by Peter Aldhous, The New Scientist notes that:

The A and B antigens, which give blood groups their name, are sugars carried on the surface of red blood cells. Human red blood cells can carry one of these antigens, both, or neither; giving four blood groups: A, B, AB and O, respectively. Receiving mismatched blood can cause a life-threatening reaction, and errors are made in 1 in every 15,000 transfusions, on average.

From Liu et al. (jargon warning):

The enzymes are expressed with high yields in E. coli and because they have similar properties, a single common conversion buffer system and process can be used to remove A and B antigens and produce ECO RBCs from A, B and AB RBCs that type as blood group O with routine licensed typing reagents and methods. Extensive FACS and biochemical analyses confirm the efficient removal of the immunodominant A and B antigens and exposure of the underlying H antigens. The current process, which is performed manually at neutral pH, is scalable to automated full-unit conversions, and ECO cells produced by this method are predicted to survive and function in a manner equivalent to native group O RBCs in non-ABO matched individuals as reported previously for B-ECO RBCs. The process has a projected consumption of approximately 60 mg (A-ECO) and 2 mg (B-ECO) recombinant enzyme with 60-min enzyme treatment per unit RBCs. This is approximately 30- (A-ECO) and 1,000-fold less (B-ECO) enzyme than the conversion protocol developed for group B RBCs with the coffee bean alpha-galactosidase. Accordingly, we believe that automated cost-effective processes can be developed for practical use in transfusion medicine.

...Preferred properties of an exoglycosidase suitable for enzymatic conversion of RBCs include the following parameters: (i) high substrate specificity for the blood group antigens to restrict the reaction to the immunodominant blood group A and B antigens; (ii) reaction conditions suitable for maintenance of RBC integrity and functions; (iii) high efficiency in cleavage of antigens on the RBC surface to minimize residual antigens and enzyme consumption; and (iv) properties to facilitate enzyme removal from the RBCs by routine cell-washing techniques. The glycosidases presented in this study offer all of these characteristics.

The two enzyme families described here perform efficiently in conversion of RBCs; ECO cells type as group O with all licensed reagents; and sensitive FACS and glycolipid analyses confirm efficient removal of A and B antigens. Finally, enzymes from both families are slightly basic and associate with the negatively charged RBCs through ionic interactions, thereby enabling efficient removal with isotonic buffer solutions, such as PBS, used for cell washing.

The availability of enzymes from these glycosidase families has resulted in the development of a simple and efficient process for producing universal RBCs that type as blood group O. Clinical translation of this approach may allow improvement of the blood supply and enhancement of patient safety in transfusion medicine.

In principle, this is an excellent set of new parts for the toolbox, to be sure, though the New Scientist reports that the technology, with obvious monetary value, is being commercialized by ZymeQuest.

Given that mammalian erythrocytes, notably from cows, can be used in emergency transfusions, I wonder if there is a set of enzymes that could be more generally used to strip carbohydrates from animal blood cells, thereby providing an even bigger pool of universal donor cells.

 

The Million Dollar Genome

This week's Science carries a short news piece by Eliot Marshall on "Project Jim", the effort by 454 Life Sciences to sequence James Watson's genome.  Eliot writes that:

When the project began, 454's equipment wasn't up to the task...But improved technology made it possible to sequence 10 billion bases in multiple overlapping fragments of Watson's DNA "in a space of a few weeks."...The project cost is "about $1 million."

That puts it a bit ahead of my original estimates of the exponential pace of cost decreases, described, for example, here.  I dropped by Bob Waterston's office a few weeks ago, and he put the rough cost of sequencing a human genome at a few million dollars, which more or less corroborates 454's number.

Where Should We Look For Biofuels?

The Press is full of reports describing the investment boom in biofuels.  So much hoopla.  The problem is that not all biofuels are the same, and some of them will apparently do more harm than good. 

Bio Economic Research Associates has studied the alternatives quite intensely, and now that "Genome Synthesis and Design Futures" is published we are examining more closely where Synthetic Biology fits into the biofuels picture.  More broadly, we are now exploring not just the technological angles, but also the economic and social costs built into choices about what crops to use for biofuels, where and how to grow those plants, and what happens to carbon emissions under the various options.

Vinod Khosla laid out his views in Wired last fall with "My Big Biofuels Bet", describing a plan to reduce carbon emissions and reduce reliance on imported oil, with all the best of intentions.  And here is a story from the AP (via Wired) "Betting on a Green Future", that appeared "way back" in April, 2006.  It's easy to find articles on biofuels in every major (and minor) news outlet, in big and small scientific journals, and of course in blogs.  Money is chasing opportunities in ethanol fermented from corn and straw, biodiesel from soy and palm, various liquid fuels produced from animal and plant biomass via Fischer-Tropsch or similar processes, methane from manure and garbage heaps, all the way through genetically modified plants that either directly produce fuels or are easier to process into fuels, to direct production of liquid biofuels using microbes modified with the tools of Synthetic Biology.  Venture Capitalists were as prominent as biologists and engineers at Synthetic Biology 2.0 last year in Berkeley.

Very interesting and promising stuff indeed.  But perhaps not so well thought through as it needs to be.  For example, the last couple of days have seen a profusion of articles on carbon release from land in Indonesia and Malaysia cleared for growing oil palms destined for use as biodiesel.  Here is an excellent story from the AP, via the IHT, that carries the title, "Energy companies rethink palm oil as biofuel":

A report late last year by a Netherlands-based research group claimed some plantations produce far more carbon dioxide than they save. Seeded on drained peat swamps, they unleash a warehouse of carbon from decomposed plants and animals that had been locked in the bogs for hundreds of millions of years, which one biologist described as "buried sunshine."

"As a biofuel, it's a failure," said Marcel Silvius, a climate change expert for Wetlands International, the institute that led the research team.

The story does note that, "Wetlands' figures could not be independently verified by the U.N. Climate Change Secretariat in Bonn, Germany, by the World Resources Institute in Washington, D.C., nor by academic experts. But all said the research appeared credible."

Companies that produce and consume palm oil are hoping that a trusted trading scheme can be set up to ensure oil comes from sustainable sources:

With concerns mounting over sourcing, plantation owners joined forces with processors, investors and environmentalists three years ago to form the Roundtable on Sustainable Palm Oil with the aim of monitoring the industry and drawing up criteria for socially responsible trade. But the RSPO has yet to create a foolproof system to verify the supply chain.

I have serious doubts about whether any such system is possible.  Given the fungibility of palm oil, just as with petroleum, I wonder whether it will be possible to keep track of sources, particularly if the oil is consolidated or mixed during harvesting, processing, and shipping.  It only gets worse once the raw palm oil is converted into higher value diesel fuel.

The size of the potential carbon release from peat and rain forest cleared for growing biofuels is so large that biodiesel use could easily run afoul of carbon caps being considered in Europe, Japan, Canada, and perhaps eventually the U.S.  Given how lucrative the plant oil market is becoming, there will be plenty of incentives for cheating on the supply side, as is now happening with sugar cane production in Brazil.  I don't see an easy technological fix for tracking sugar, ethanol, palm oil, or biodiesel, so I don't see what sort of lever could be used to suppress the emergence of a black market as plants become fuel.  It seems to me that there could be significant costs associated with verification, tracking, and perhaps certification, of sources, and I suspect this will have a big effect on plans for importing and processing oil.  Not only are the direct economic costs something to consider, but the social and public relations impacts could be substantial.  Indeed, the latter are affecting decision making already.  From the IHT:

"We spent more than a year investigating the sustainability issues with palm oil," said Leon Flexman, of RWE npower, Britain's largest electricity supplier. The company decided against palm oil because it could not verify all its supplies would be free of the taint of destroyed rain forest or peat bogs, he said.

Beyond the effects on carbon emissions, converting crops into biofuels fundamentally impacts food supplies.  Not to mention all the water that will be required to irrigate crops grown using modern farming methods.  George Monbiot, writing at The Guardian, makes a surprisingly good (for The Guardian) argument for a moratorium on governmental targets and incentives for biofuel use.  Monbiot cites all sorts of gloomy facts and figures regarding the climate effects and market impacts of using food crops as fuel, and of clearing rain forest to grow sugar cane or oil palms.

An altogether different set of problems arises when you start examining the prospects for biofuels produced from genetically modified versions of food crops.  While leakage of genes from GM crops into their unmodified cousins is still a hypothetical danger, there is a very real and immediate possibility of governmental regulations that limit planting.  Here, for example, is an interesting collection of stories about GM crops, leakage, and policy from gepolicyalliance.com.  With recent examples of pharmaceutically-modified rice and corn finding their way into the food supply, some farm-state congressmen are wondering aloud about legislation to limit the planting of such crops.

So it makes sense to think ahead about the effects on biofuels.  In a long and detailed letter published last month in Nature Biotechnology under the title "Biofuels and biocontainment", C. Neal Stewart Jr. of the University of Tennessee writes:

It is difficult to imagine that transgenic technologies will not be pivotal in transforming the process of going from grass to gas, in particular enhancing the production of lignocellulosic-based plant feedstock and its conversion into ethanol or biodiesel. Although biotech has an opportunity to increase yields and efficiency of bioenergy crop production as well as aid the conversion of complex carbohydrates and plant oils to fuels, unless modifications are performed with an eye to meet future regulatory and consumer issues, these potential benefits might never be realized.

...On the regulatory side, history has shown that it is nearly impossible to prevent industrial or pharmaceutical crops from entering the human food chain or feed when grown in proximity to one another. Low levels of adventitious presence of agronomic traits have been tolerated to some degree, but there is less tolerance for pharmaceutical and industrial transgene adventitious presence in the food chain.

...Because large tracts of land will likely be planted in bioenergy crops, there are important ecological considerations for sustainability. We need to prepare now to detour obvious roadblocks on the road to biofuels sustainability. One enduring lesson from agricultural biotech is that it is a huge mistake to underestimate biosafety concerns. A corollary is that Nature will always find a way; Murphy's law implies that no matter how unlikely it seems that genes will flow, they eventually will.

Stewart explores the various options for GM food crops and non-food crops, and the rest of the letter is well worth reading.  Given the recent decision by a federal judge that the USDA was negligent in approving GM alfalfa without greater study (here is the press release from the Center for Food Safety), it is clear that open planting of GM crops may not be so easy in the future.  But there are other possibilities for high-yield biofuel production from plants.

One potentially less controversial source of biofuels, at least for North America, is to use non-GM, native grasses as the raw material.  David Tilman and his colleagues published a paper last December in Science arguing that restored native grasslands could be used as a source of biomass for producing liquid fuels.  More significantly, using existing technology, it appears that the resulting fuel production infrastructure would be carbon negative, that is, storing more carbon than emitted during harvesting, processing, and use as fuel.  Tilman, an ecologist at the University of Minnesota and a member of the NAS, lays out his plan with research associate Jason Hill in an essay on checkbiotech.org,  originally carried in The Washington Post on 25 March.  Tilman and Hill summarize the paper in Science as follows:

In a 10-year experiment reported in Science magazine in December, we explored how much bioenergy could be produced by 18 different native prairie plant species grown on highly degraded and infertile soil. We planted 172 plots in central Minnesota with various combinations of these species, randomly chosen. We found, on this highly degraded land, that the plots planted with mixtures of many native prairie perennial species yielded 238 percent more bioenergy than those planted with single species. High plant diversity led to high productivity, and little fertilizer or chemical weed or pest killers was required.

The prairie "hay" harvested from these plots can be used to create high-value energy sources. For instance, it can be mixed with coal and burned for electricity generation. It can be "gasified," then chemically combined to make ethanol or synthetic gasoline. Or it can be burned in a turbine engine to make electricity. A technique that is undergoing rapid development involves bioengineering enzymes that digest parts of plants (the cellulose) into sugars that are then fermented into ethanol.

Whether converted into electricity, ethanol or synthetic gasoline, the high-diversity hay from infertile land produced as much or more new usable energy per acre as corn for ethanol on fertile land. And it could be harvested year after year.

Even more surprising were the greenhouse gas benefits. When high-diversity mixtures of native plants are grown on degraded soils, they remove carbon dioxide from the air. Much of this carbon ends up stored in the soil. In essence, mixtures of native plants gradually restore the carbon levels that degraded soils had before being cleared and farmed. This benefit lasts for about a century.

Across the full process of growing high-diversity prairie hay, converting it into an energy source and using that energy, we found a net removal and storage of about a ton and a half of atmospheric carbon dioxide per acre. The net effect is that ethanol or synthetic gasoline produced from this grass on degraded land can provide energy that actually reduces atmospheric levels of carbon dioxide.

All in all, an exceptionally interesting proposal.  Tilman was a co-author on a Science paper earlier in 2006 showing that high-diversity grasslands produce considerably more biomass per acre than monocultures of either grass or corn, and that healthy prairie full of perennial grasses serves as habitat for all kinds of other wildlife, suggesting this approach could be a big win in many different ways.

But you still have to turn the raw biomass into fuel, and that is where Synthetic Biology will probably play a role.  Not in open fields, but in contained vats where microbes, first with modified enzymes, then later with altogether new pathways, will eat the harvested grasses and turn them into fuels.  This is an explicit focus of the new biofuels institute at UC Berkeley/LBL and the University of Illinois, funded to the tune of $500 million by BP (story in Nature, from BP, and UCB).  And start-ups like LS9 and Amyris are pouring effort into building microbes that directly produce fuels from simple feedstocks.

While this seems like a relatively straightforward path to producing significant amounts of ethanol, biodiesel, and eventually butanol, it will probably take 5-10 years before anything hits the market.  Then again, much of this is more a matter of money and organization than science.  We could get significant supplies of biofuels soon depending on our choices.

Chip Fab Now Costs US$2.5 Billion

Chip fabs just keep getting more expensive.  The AP, via Wired, reports that Intel is investing US$2.5 billion in a new factory in China.  The facility will churn out chips only for the Chinese market, evidently, and using old technology.  U.S. export rules require that Intel restrict the fab to using 90-nm processing, whereas chips made and sold in the U.S. will soon use a 45-nm process.

And biology just keeps getting cheaper.

Update on Public Access to "Genome Synthesis and Design Futures"

Due to confusion about access to "Genome Synthesis and Design Futures", I would like to make a clarification.  The original order page was not as clear as it could have been.  The report is publicly available, either as a free PDF or via a print-on-demand service for $95.  There are no restrictions on obtaining a copy, unless you are shy or are obviously misrepresenting yourself.  While the report does not contain sensitive material, Bio-era is requiring registration to receive a copy in an effort to both track interest and be a responsible public citizen.  I think it is rather ironic that the decision to require registration has been the target of public criticism by people who have made a business of making noise about restricting access to, and progress in, biological technologies.

Here is the new, clearer, web page.

Thoughts on Open Biology

A story at LinuxDevices last year on a report from the Committee for Economic Development (CED), recommending government use of "open source" and "open research", prompted me to collect the following thoughts on Open Biology.

I've changed the entry in my category list for this blog from "Open Source Biology" to "Open Biology".  Despite unleashing the phrase "Open Source Biology" on the world six years ago, at this point I no longer know what Open Source Biology might be.  Perhaps Drew Endy still has a  useful definition in mind, but as I try to understand how to maintain progress, improve safety, and keep the door open for economic growth, I think the analogy between software and biology just doesn't go far enough.  Biology isn't software, and DNA isn't code.  As I study the historical development of railroads, electricity, aviation, computer hardware, computer software, and of the availability of computation itself (distributed, to desktop, and back to distributed; or ARPANet to Microsoft Office to Google Apps), I am still trying to sort out what lessons can be applied to biological technologies.  I have only limited conclusions about how any such lessons will help us plan for the future of biology.

When I first heard Drew Endy utter the phrase "Open Source Biology", it was within the broader context of living in Berkeley, trying to understand the future of biology as technology, and working in an environment (the then embryonic Molecular Sciences Institute) that encouraged thinking anything was possible.  It was also within the context of Microsoft's domination of the OS market, the general technology boom in the San Francisco Bay area, the skyrocketing cost of drug development coupled to a stagnation of investment return on those dollars, and the obvious gap in our capabilities in designing and building biological systems.  OSB seemed the right strategy to get to where I thought we ought to be in the future, which is to create the ability to tinker effectively,  perhaps someday even to engineer biology, and to employ biology as technology for solving some of the many problems humans face, and that humans have created.

As in 2000, I remain today most interested in maintaining, and enhancing, the ability to innovate.  In particular, I feel that safe and secure innovation is likely to be best achieved through distributed research and through distributed biological manufacturing.  By "Open Biology" I mean access to the tools and skills necessary to participate in that innovation and distributed economy.

"Open source biology" and "open source biotechnology" are catchy phrases, but they have little if any content for the moment.  As various non-profits get up and running (e.g., CAMBIA and the BioBrick Foundation), some of the vagaries will be defined, and at least we will have some structure to talk about and test in the real world.  When there is a real license a la the GPL, or the Lesser License, and when it is finally tested in court we will have some sense of how this will all work out.

I am by no means saying work should stop on OSB, or on figuring out the licenses, just that I don't understand how it fits into helping innovation at the moment.  A great deal of the innovation we need to see will not come from academia or existing corporations, but from people noodling around in their garages or in start-ups yet to be founded.  These are the customers for Biobricks; these are the people who want the ability to build biological systems without needing an NIH grant.

But Drew Endy (Biobricks) and Richard Jefferson (CAMBIA) have as primary customers not corporations, hobbyists, or tinkerers, but large foundations and governments.  The marketplace in which Biobricks and CAMBIA compete for funding values innovation and the promise of changing the world.  At present, they do not derive the majority of their funding from actually selling parts or licenses on the open market, and thus do not rely on sales to fund their work.  Nor should they.  But the rest of our economy operates on exchanges of money for goods and services.  Synthetic Biology will get there some day, too, but the transition is still a bit murky for me.  The Bio-era research report, "Genome Synthesis and Design Futures: Implications for the U.S. Economy", of which I am a co-author, points to the utility of Synthetic Biology and Biobricks in producing biofuels, vaccines, and new materials.  However, the implementation of the new technological framework of genome design, enabled by large scale gene synthesis and composable parts with defined properties, is still in the offing.

Janet Hope has made an initial study of the state of Open Source Biotechnology in her Ph.D. dissertation at the Australian National University.  Janet gives the following definition for her project:

"Open Source Biotechnology" refers to the possibility of extending the principles of commerce-friendly, commons-based peer production exemplified by Open Source software development to the development of research tools in biomedical and agricultural biotechnology.

This project examines the feasibility of Open Source Biotechnology in the current industry environment. In particular, it explores:       

1. Whether it would be possible to run a viable biotechnology business on Open Source principles, and

2. What such a business might look like, including the application of specific Open Source-style licences to particular classes of biotechnology research tools.

Janet's book on the subject is due out later this year from Harvard University Press.  My book on all of this stuff is, um, not finished.

The CED report  "concludes that openness should be promoted as a matter of public policy, in order to foster innovation and economic growth in the U.S. and world economies."  I think this bit, in particular, is very interesting (quoting from the LinuxDevices story):

  • Open Innovation (such as 'peer production' systems like Wikipedia and eBay user ratings)

    • To foster open innovation, federally funded, non-classified research should be widely disseminated, following the example of the NIH (National Institutes of Health)
    • "Any legislation or regulation regarding intellectual property rights [should be] weighed with a presumption against the granting of new rights ... because of the benefits to society of further innovation through greater access to technology."
    • The NSF (National Science Foundation) should fund research into "alternative compensation methods, similar to those created to facilitate the growth of radio, to reward creators of digital information products"

The first point is a bit off, since most NIH-sponsored research is, as a practical matter, available only through subscriptions to the journals in which it is published.  This will slowly get fixed, however, with increasing publication via the Public Library of Science and similar efforts.  The second point, embodied in patent reform, will probably take forever and will be hobbled by vested interests.  The third may not produce useful results for many years.

So here we sit, needing a great deal of fast innovation in biological technologies in order to produce carbon neutral fuels, improve human health, and deal with emerging threats such as SARS and pandemic influenza.  Open Biology is part of that, somehow, but I still don't see a clear path to implementing the ideas within the context of the real economic system we live in every day.

Farewell PEAR Lab -- You were always overripe.

News in the last few weeks that the Princeton Engineering Anomalies Research Lab -- the PEAR Lab -- is shutting down.  The PEAR Lab, run by Dr. Robert Jahn, the former Dean of Engineering, was by no means celebrated at Princeton.  I spent four years there in graduate school and only heard of the Lab during my last year, in Malcolm Browne's science writing class no less, rather than during all those many hours in Jadwin Hall.

Philip Ball had a nice retrospective on the Lab in last week's Nature entitled, "When research goes PEAR-shaped."  Ball quotes Will Happer, a professor in the Princeton Physics Department and a member of JASON, as saying, "I don't believe in anything [Jahn] is doing, but I support his right to do it."  That's pretty charitable, actually, compared with many of the things said about the lab.  Nature continues to pile it on this week, with another piece: "The lab that asked the wrong questions," by Lucy Odling-Smee.

This is the crux of what was wrong with the PEAR Lab.  In that science writing class, Malcolm Browne occasionally brought in people to be "interviewed" by the class, and one day we had someone in from the Lab.  (My recollection is that it was Jahn himself.)  Can't say I was impressed.  But data is data, and they certainly may actually have measured something interesting, however unlikely that may be.  There are many things we can't yet explain about the universe, and maybe Jahn was on to something.

What I found unfortunate, even unpleasant, was the context in which the data was presented.  Jahn was represented to us not just as an expert in aeronautics, but also in a whole host of other fields, including quantum mechanics.  And we were offered a physical theory "explaining" one experiment, supposedly a quantum mechanical theory.  Here's the problem: that theory, by its very nature, is wrong.  It is inconsistent in its conception and structure with all the rest of quantum mechanics.  The folks in the PEAR Lab were definitely asking the wrong questions, in a very deep physical sense, by which I mean that everything about the way they tried to explain the data I saw was contradicted by modern physics in fundamental ways.

According to Dr. Jahn, a random process seems to be the vital ingredient for anomalous interactions between consciousness and machines -- coins flipping, balls dropping through a forest of pegs, even electronic random number generators -- which is what led him to speculate about connections between his data and a successful theory in which measurements are probabilistic: quantum mechanics.  In some interpretations of quantum mechanics, the observer and the system observed are both part of a larger closed system.  Indeed, Dr. Jahn and his colleagues believe that quantum mechanics may be just a part of a larger theory that includes phenomena studied in the PEAR Lab.  If this is so, then one would expect the structure of the two theories to be similar.

The theory we were told about was purported to explain how an observer could, by thinking "slower" or "faster", change the period of a large pendulum, something like 2 meters in length, if I recall correctly.  A brief refresher on the relevant classical physics: the period of an ideal pendulum is determined only by its length and the strength of the force of gravity, at least in the case when oscillation amplitudes are small, and not by its mass, or the kind of bearing it is suspended from, or any other factor.  Friction will eventually damp a real pendulum, but by reducing its amplitude, not by changing its period.
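For the record, the textbook result is T = 2*pi*sqrt(L/g).  Here is a minimal sketch, using the roughly 2-meter length recalled above; note that mass, bearing, and observer appear nowhere in the formula.

    import math

    # Period of an ideal, small-amplitude pendulum: T = 2*pi*sqrt(L/g).
    g = 9.81        # m/s^2, standard gravity
    length = 2.0    # meters, the rough length recalled above

    period = 2 * math.pi * math.sqrt(length / g)
    print(f"Period of a {length} m pendulum: {period:.2f} s")   # ~2.84 s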

The mechanism by which human consciousness might change the period is not easy to imagine.  The human observer states the intention either to increase or decrease the period, and as the pendulum interrupts photodiodes on each swing the time is recorded.  But whereas a quantum mechanical model requires a probability distribution for the observer to alter, here the observer is trying to change the period directly.

Before I go on (and on), you must be asking "Why spend so much time on this?"  Why bother to debunk bad science at all?  Because the universe is full of strange and wonderful things, and we don't yet understand them all.  That's what makes life interesting.  Besides, I like thinking about quantum mechanics.  Back to the story.

Dr. Jahn claimed his data is consistent with the human subject affecting the damping of the pendulum's oscillation.  Microscopically, friction might be changed by heating or cooling the bearings of the pendulum (which could be tested by carefully measuring the temperature of the bearing during an experiment), causing the atoms in the bearing to move around more or less, a phenomenon well understood in statistical mechanics -- and in fact a probabilistic effect.  However, since the operator was not trying to influence this probability distribution, it is not clear how his or her binary intention of changing the period of the pendulum was converted into changing the amount of friction.  Or perhaps the observer was changing the length of the pendulum, or the overall strength of gravity, or even the local coupling of the earth's mass to that of the pendulum.  Still no obvious connection to any distribution.
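To get a feel for how weak a lever friction would be on the period, here is a small sketch of the standard lightly damped oscillator result; the damping rates below are purely illustrative assumptions on my part, not PEAR Lab numbers.

    import math

    # Lightly damped pendulum: omega_d = sqrt(omega0^2 - gamma^2), where gamma is
    # the amplitude decay rate.  The gamma values are illustrative assumptions.
    g, length = 9.81, 2.0
    omega0 = math.sqrt(g / length)            # undamped angular frequency, rad/s

    for gamma in (0.001, 0.01):               # per second; a tenfold change in friction
        omega_d = math.sqrt(omega0**2 - gamma**2)
        shift = omega0 / omega_d - 1          # fractional change in the period
        print(f"gamma = {gamma}/s: fractional period shift ~ {shift:.1e}")

Even a tenfold change in the damping moves the period by only about ten parts per million; damping eats amplitude, not period.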

When asking a question of a quantum mechanical system, or a quantum mechanical question in the parlance of physicists, it must be one which can be phrased in terms of what is called an "operator."  Energy, momentum, and position are all operators and as such provide tools for asking quantum mechanical questions.  The energy operator, for instance, would be used to ask about the average energy of the atoms in the bearing.  To find an analogy to the pendulum we must look in quantum mechanics to something called a harmonic oscillator, which can be imagined as a ball rolling back and forth at the bottom of a parabolic bowl.  Two operators used in asking questions about such a system are the raising and lowering operators, which, as their names suggest, raise or lower the energy of the particle in discrete steps.

So, for the sake of argument, let's give the PEAR Lab a quantum mechanical operator that works on a macroscopic pendulum.  It might be imagined that a human consciousness is utilizing some sort of raising and lowering operator by intending to increase or decrease the period of oscillation of the pendulum.  Yet the data is fit by assuming the friction in the bearing is changing.  It is simply not consistent with the structure of quantum mechanics to ask one valid question and get the answer to a different valid question.  Furthermore, it is hard to imagine how a more general theory, one subsuming quantum mechanics -- oh, what the hell, let's just call it "magic" -- could account for asking a question of the period of the pendulum with an operator belonging to the "magic" theory but get an answer which is the result of asking a question with the well known and well loved energy operator of quantum mechanics and which could only describe the microscopic state of the bearing.  So there.

Then there is that little thing called the Correspondence Principle, proven correct time and time again, which says that quantum mechanics works for small numbers of atoms.  As the number grows, save in very special, very strange circumstances like Bose-Einstein condensates, your theory must reduce to classical physics.  Which brings us back to the classical model that the period of the pendulum depends only on its length.  Nothing about the bearing, nothing about the observer.  Moreover, the pendulum is big, and the human subject is big.  Many, many atoms.  No quantum mechanics.  Wrong question!

Did you follow all that?  Does your head hurt?  Sometimes quantum mechanics does that, I assure you.  But I suppose "magic" could account for your headache, too.  We must allow for that.  Somehow.  See the PEAR Lab.

Sometimes the exploration of something that seems silly results in important insights, and the rest of the time it is important to keep the human participants of science honest.  That's the way science works.  And science always wins.