A few notes from Nature Biotech

I am catching up on past issues of Nature Biotech.  Here are a few things that caught my eye:

(Feb 09) Cuba is launching a domestically produced GM corn.  The strain (which looks from the name to contain Bt) is to be used in animal feed.  Another sign that developing countries view biotech as an important national initiative, and that they can push the technology forward on their own.

(Feb 09) Researchers in Belgium got fed up with efforts to get their field trial for GM poplars approved in country, and are taking the trial to the Netherlands.  So much for uniformly applying laws on planting GM crops in Europe.  (Mar 09) Local environment ministers voted to overturn the European Commission's initiative to force member states to lift national bans.

(April 09) Malaysia has dropped several billions of dollars on biotech as part of their stimulus package.  More on this when I dig into it.

The Origin of Moore's Law and What it May (Not) Teach Us About Biological Technologies

While writing a proposal for a new project, I've had occasion to dig back into Moore's Law and its origins.  I wonder, now, whether I peeled back enough of the layers of the phenomenon in my book.  We so often hear about how more powerful computers are changing everything.  Usually the progress demonstrated by the semiconductor industry (and now, more generally, IT) is described as the result of some sort of technological determinism instead of as the result of a bunch of choices -- by people -- that produce the world we live in.  This is on my mind as I continue to ponder the recent failure of Codon Devices as a commercial enterprise.  In any event, here are a few notes and resources that I found compelling as I went back to reexamine Moore's Law.

What is Moore's Law?

First up is a 2003 article from Ars Technica that does a very nice job of explaining the whys and wherefores: "Understanding Moore's Law".  The crispest statement within the original 1965 paper is "The number of transistors per chip that yields the minimum cost per transistor has increased at a rate of roughly a factor of two per year."  At its very origin, Moore's Law emerged from a statement about cost, and economics, rather than strictly about technology.

I like this summary from the Ars Technica piece quite a lot:

Ultimately, the number of transistors per chip that makes up the low point of any year's curve is a combination of a few major factors (in order of decreasing impact):

  1. The maximum number of transistors per square inch, (or, alternately put, the size of the smallest transistor that our equipment can etch),
  2. The size of the wafer
  3. The average number of defects per square inch,
  4. The costs associated with producing multiple components (i.e. packaging costs, the costs of integrating multiple components onto a PCB, etc.)

In other words, it's complicated.  Notably, the article does not touch on any market-associated factors, such as demand and the financing of new fabs.
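For the curious, the interplay of those factors can be sketched in a few lines of code.  This is a cartoon with invented numbers (wafer cost, die sizes, defect density are all made up for illustration), not real fab economics, but it shows how wafer size, defect rate, and transistor density combine into a cost per transistor:

```python
# Toy cost-per-transistor model reflecting the factors listed above.
# All numbers are invented for illustration; real fab economics are far
# messier (edge-die loss, binning, packaging, multi-step yield models).

import math

def cost_per_transistor(wafer_cost, wafer_diameter_mm, die_area_mm2,
                        transistors_per_die, defects_per_cm2):
    """Rough cost per good transistor for one wafer."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    dies_per_wafer = int(wafer_area / die_area_mm2)   # ignores edge loss
    # Simple Poisson yield model: fraction of dies with zero defects.
    defects_per_mm2 = defects_per_cm2 / 100.0
    yield_fraction = math.exp(-defects_per_mm2 * die_area_mm2)
    good_dies = dies_per_wafer * yield_fraction
    return wafer_cost / (good_dies * transistors_per_die)

# Same transistor density, bigger die: more transistors per die, but the
# defect penalty grows exponentially with die area, so cost per good
# transistor goes up, not down.
small_die = cost_per_transistor(5000, 300, 100, 1e8, 0.5)
large_die = cost_per_transistor(5000, 300, 400, 4e8, 0.5)
print(small_die < large_die)
```

The point of the exercise is that the "low point of the curve" in any given year is an economic optimum across all of these knobs at once, not a purely technical limit.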

The Wikipedia entry on Moore's Law has some good information, but isn't very nuanced.

Next, here is an excerpt from an interview Moore did with Charlie Rose in 2005:

Charlie Rose:     ...It is said, and tell me if it's right, that this was part of the assumptions built into the way Intel made its projections. And therefore, because Intel did that, everybody else in the Silicon Valley, everybody else in the business did the same thing. So it achieved a power that was pervasive.

Gordon Moore:   That's true. It happened fairly gradually. It was generally recognized that these things were growing exponentially like that. Even the Semiconductor Industry Association put out a roadmap for the technology for the industry that took into account these exponential growths to see what research had to be done to make sure we could stay on that curve. So it's kind of become a self-fulfilling prophecy.

Semiconductor technology has the peculiar characteristic that the next generation always makes things higher performance and cheaper - both. So if you're a generation behind the leading edge technology, you have both a cost disadvantage and a performance disadvantage. So it's a very non-competitive situation. So the companies all recognize they have to stay on this curve or get a little ahead of it.

Keeping up with 'the Law' is as much about the business model of the semiconductor industry as about anything else.  Growth for the sake of growth is an axiom of western capitalism, but it is actually a fundamental requirement for chipmakers.  Because the cost per transistor is expected to fall exponentially over time, you have to produce exponentially more transistors to maintain your margins and satisfy your investors.  Therefore, Intel set growth as a primary goal early on.  Everyone else had to follow, or be left by the wayside.  The following is from the recent Briefing in The Economist on the semiconductor industry:
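The arithmetic behind that treadmill is simple enough to sketch.  A toy calculation (the starting price and volume below are made up) shows that if the price per transistor halves every two years, unit volume has to double every two years just to hold revenue flat:

```python
# The treadmill in numbers: if the price per transistor halves every
# two years, shipping the same revenue requires doubling unit volume.
# Starting figures are invented for illustration.

price0 = 1.0      # dollars per million transistors in year 0 (made up)
volume0 = 1e12    # transistors shipped in year 0 (made up)
revenue0 = price0 * volume0 / 1e6

for year in range(0, 9, 2):
    price = price0 * 0.5 ** (year // 2)      # halves every two years
    volume_needed = revenue0 * 1e6 / price   # volume that holds revenue flat
    print(year, volume_needed / volume0)     # 1.0, 2.0, 4.0, 8.0, 16.0
```

And that is just to stand still; growing revenue, as investors demand, means shipping transistors even faster than the price falls.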

...Even the biggest chipmakers must keep expanding. Intel today accounts for 82% of global microprocessor revenue and has annual revenues of $37.6 billion because it understood this long ago. In the early 1980s, when Intel was a $700m company--pretty big for the time--Andy Grove, once Intel's boss, notorious for his paranoia, was not satisfied. "He would run around and tell everybody that we have to get to $1 billion," recalls Andy Bryant, the firm's chief administrative officer. "He knew that you had to have a certain size to stay in business."

Grow, grow, grow

Intel still appears to stick to this mantra, and is using the crisis to outgrow its competitors. In February Paul Otellini, its chief executive, said it would speed up plans to move many of its fabs to a new, 32-nanometre process at a cost of $7 billion over the next two years. This, he said, would preserve about 7,000 high-wage jobs in America. The investment (as well as Nehalem, Intel's new superfast chip for servers, which was released on March 30th) will also make life even harder for AMD, Intel's biggest remaining rival in the market for PC-type processors.

AMD got out of the atoms business earlier this year by selling its fab operations to a sovereign wealth fund run by Abu Dhabi.  We shall see how they fare as a bits-only design firm, having sacrificed the ability to push (and rely on) scale themselves.

Where is Moore's Law Taking Us?

Here are a few other tidbits I found interesting:

Re the oft-forecast end of Moore's Law, here is Michael Kanellos at CNET grinning through his prose: "In a bit of magazine performance art, Red Herring ran a cover story on the death of Moore's Law in February--and subsequently went out of business."

And here is somebody's term paper (no disrespect there -- it is actually quite good, and is archived at Microsoft Research) quoting an interview with Carver Mead:

Carver Mead (now Gordon and Betty Moore Professor of Engineering and Applied Science at Caltech) states that Moore's Law "is really about people's belief system, it's not a law of physics, it's about human belief, and when people believe in something, they'll put energy behind it to make it come to pass." Mead offers a retrospective, yet philosophical explanation of how Moore's Law has been reinforced within the semiconductor community through "living it":

After it's [Moore's Law] happened long enough, people begin to talk about it in retrospect, and in retrospect it's really a curve that goes through some points and so it looks like a physical law and people talk about it that way. But actually if you're living it, which I am, then it doesn't feel like a physical law. It's really a thing about human activity, it's about vision, it's about what you're allowed to believe. Because people are really limited by their beliefs, they limit themselves by what they allow themselves to believe what is possible. So here's an example where Gordon [Moore], when he made this observation early on, he really gave us permission to believe that it would keep going. And so some of us went off and did some calculations about it and said, 'Yes, it can keep going'. And that then gave other people permission to believe it could keep going. And [after believing it] for the last two or three generations, 'maybe I can believe it for a couple more, even though I can't see how to get there'. . . The wonderful thing about [Moore's Law] is that it is not a static law, it forces everyone to live in a dynamic, evolving world.

So the actual pace of Moore's Law is about expectations, human behavior, and, not least, economics, but has relatively little to do with the cutting edge of technology or with technological limits.  Moore's Law as encapsulated by The Economist is about the scale necessary to stay alive in the semiconductor manufacturing business.  To bring this back to biological technologies, what does Moore's Law teach us about playing with DNA and proteins?  Peeling back the veneer of technological determinism enables us (forces us?) to examine how we got where we are today. 

A Few Meandering Thoughts About Biology

Intel makes chips because customers buy chips.  According to The Economist, a new chip fab now costs north of $6 billion.  Similarly, companies make stuff out of, and using, biology because people buy that stuff.  But nothing in biology, and certainly not a manufacturing plant, costs $6 billion.

Even a blockbuster drug, which could bring revenues in the range of $50-100 billion during its commercial lifetime, costs less than $1 billion to develop.  Scale wins in drug manufacturing because drugs require lots of testing, and require verifiable quality control during manufacturing, which costs serious money.

Scale wins in farming because you need...a farm.  Okay, that one is pretty obvious.  Commodities have low margins, and unless you can hitch your wagon to "eat local" or "organic" labels, you need scale (volume) to compete and survive.

But otherwise, it isn't obvious that there are substantial barriers to participating in the bio-economy.  Recalling that this is a hypothesis rather than an assertion, I'll venture back into biofuels to make more progress here.

Scale wins in the oil business because petroleum costs serious money to extract from the ground, because the costs of transporting that oil are reduced by playing a surface-to-volume game, and because thermodynamics dictates that big refineries are more efficient refineries.  It's all about "steel in the ground", as the oil executives say -- and in the deserts of the Middle East, and in the Strait of Malacca, etc.  But here is something interesting to ponder: oil production may have maxed out at about 90 million barrels a day (see this 2007 article in the FT, "Total chief warns on oil output").  There may be lots of oil in the ground around the world, but our ability to move it to market may be limited.  Last year's report from Bio-era, "The Big Squeeze", observed that since about 2006, the petroleum market has in fact relied on biofuels to supply volumes above the ~90 million barrel per day mark.  This leads to an important consequence for distributed biofuel production that only recently penetrated my thick skull.

Below the 90 million barrel per day threshold, oil prices fall because supply will generally exceed demand (modulo games played by OPEC, Hugo Chavez, and speculators).  In that environment, biofuels have to compete against the scale of the petroleum markets, and margins on biofuels get squeezed as the price of oil falls.  However, above the 90 million barrel per day threshold, prices start to rise rapidly (perhaps contributing to the recent spike, in addition to the actions of speculators).  In that environment, biofuels are competing not with petroleum, but with other biofuels.  What I mean is that large-scale biofuels operations may have an advantage when oil prices are low because large-scale producers -- particularly those making first-generation biofuels, like corn-based ethanol, that require lots of energy input -- can eke out a bit more margin through surface-to-volume issues and thermodynamics.  But as prices rise, both the energy to make those fuels and the energy to move those fuels to market get more expensive.  When the price of oil is high, smaller scale producers -- particularly those with lower capital requirements, as might come with direct production of fuels in microbes -- gain an advantage because they can be more flexible and have lower transportation costs (being closer to the consumer).  In this price-volume regime, petroleum production is maxed out and small scale biofuels producers are competing against other biofuels producers since they are the only source of additional supply (for materials, as well as fuels).
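To make the argument concrete, here is a cartoon of the two regimes.  Every parameter below is invented purely for illustration -- the point is only that a producer whose input costs track the oil price fares differently at $40 and $150 a barrel than one whose costs are mostly fixed capital:

```python
# Cartoon of the two price regimes sketched above.  All numbers are
# invented; this encodes the qualitative argument, not real data.

def margin(oil_price, capital_cost, input_fraction):
    """Per-barrel margin for a producer selling at the prevailing oil
    price, where energy and transport inputs scale with that price."""
    return oil_price * (1 - input_fraction) - capital_cost

# Big first-generation plant: low unit capital (scale), but energy and
# shipping inputs that track the oil price.  Small distributed producer:
# higher unit capital, but low input costs and short transport.
for oil in (40, 150):
    big = margin(oil, capital_cost=5, input_fraction=0.7)
    small = margin(oil, capital_cost=25, input_fraction=0.25)
    print(oil, round(big, 1), round(small, 1))
# At $40 the big plant ekes out the better margin; at $150 the
# flexible small producer pulls ahead.
```

The crossover price depends entirely on the made-up parameters, of course; the structural point is that input costs indexed to oil flip the advantage between regimes.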

This is getting a bit far from Moore's Law -- the section heading does contain the phrase "meandering thoughts" -- I'll try to bring it back.  Whatever the origin of the trends, biological technologies appear to be the same sort of exponential driver for the economy as are semiconductors.  Chips, software, DNA sequencing and synthesis: all are infrastructure that contribute to increases in productivity and capability further along the value chain in the economy.  The cost of production for chips (especially the capital required for a fab) is rising.  The cost of production for biology is falling (even if that progress is uneven, as I observed in the post about Codon Devices).  It is generally becoming harder to participate in the chip business, and it is generally becoming easier to participate in the biology business.  Paraphrasing Carver Mead, Moore's Law became an organizing principle of an industry, and a driver of our economy, through human behavior rather than through technological predestination.  Biology, too, will only become a truly powerful and influential technology through human choices to develop and deploy that technology.  But access to both design tools and working systems will be much more distributed in biology than in hardware.  It is another matter whether we can learn to use synthetic biological systems to improve the human condition to the extent we have through relying on Moore's Law.

On the Demise of Codon Devices

Nature is carrying a short news piece by Erica Check and Heidi Ledford on the end of Codon Devices, "The Constructive Biology Company".  I am briefly quoted in the discussion of what might have gone wrong.  I would add here that I don't think it means much of anything for the field as a whole.  It was just one company.  Here is last week's initial reporting by Todd Wallack at the Boston Globe.

I've been pondering this a bit more, and the following analogy occurred to me after I was interviewed for the Nature piece.  Codon, as described to me by various people directly involved, was imagined as a full-service engineering firm -- synthetic genes and genomes, design services, the elusive "bio-fab" that would enable one-stop conversion of design information into functional molecules and living systems.  Essentially, it seems to me that the founders wanted to spin up an HP of biology, except that they tried to jump into the fully developed HP of 1980 or 1990 rather than the garage HP of 1939.  Codon was founded with on the order of $50 million, with no actual products ready to go.  HP was founded with ~$500 (albeit 1939 dollars) and immediately started selling a single product, a frequency standard, for which there was a large and growing market.  HP then grew, along with its customers, organically over decades.  Moreover, the company was started within the context of an already large market for electronics.

The synthetic biology market -- the ecology of companies that produce and consume products and services related to building genes and genomes -- still isn't very big.  A very generous estimate would put that market at $100 million.  This means the revenues for any given firm are (very optimistically) probably no more than a few tens of millions.  (The market around "old style" recombinant DNA is, of course, orders of magnitude larger.)  Labor, rather than reagents and materials, is still likely to be the biggest cost for most companies in the field.  And even when they do produce an organism, or a genetic circuit, with value, companies are likely to try to capture all the value of the learning that went into the design and engineering process. 

This leads to an important question that I am not sure is asked often enough by those who hope to make a living off of emerging biological technologies: Where is the value?  Is it in the design (bits), or in the objects (atoms)?  The answer is a bit complicated.

Given that the maximum possible profit margin on synthetic genes is falling exponentially, it would seem that finding value in those particular atoms is going to get harder and harder.  DNA is cheap, and getting cheaper; the design of genetic circuits (resulting in bits) definitely costs more (in labor, etc.) than obtaining the physical sequence by FedEx. That is the market that Codon leapt into.  If all of the value is in the design process, and in the learning associated with producing a new design, not many companies are going to outsource that value creation to a contractor.  If Codon had a particular design expertise, they could have made a go with that as a business model, as do electronics firms that have niche businesses in power electronics or ASICs.  There are certainly very large firms that design, but do not build, electronics (the new AMD, for example), but they didn't get that way overnight.  They have emerged after a very long (and brutal) process of competition that has resulted in the separation of design and manufacturing.  Intel is the only integrated firm left standing, in part because they set their sights on maintaining scale from day one (see the recent Economist article on the semiconductor industry for a nice summary of where the market is, and where it may be headed). 

In another area of synthetic biology, I can testify with an uncomfortably high degree of expertise that costs in the market for proteins (a very different beast than DNA) are much higher for atoms than for bits.  It is relatively easy for me to design (update: perhaps better phraseology would be "specify the sequence of") a new protein for Biodesic and have Blue Heron synthesize the corresponding gene.  It is rather less easy for me to get the actual protein made at scale by a third party (and it would be even harder to do it myself).  Whereas gene synthesis appears to be a commodity business, contract protein manufacturing is definitely not.  Expression and purification require knowledge (art).  Even if a company has loads of expertise in protein expression, in my experience they will only offer an estimate of the likelihood of success for any given job.  And even if they can make a particular protein, without a fairly large investment of time and money they may not be able to make very much of the protein or ship it at a sufficiently high purity.  Unlike silicon processing and chip manufacturing, it isn't clear that anyone can (yet) be a generalist in protein expression.  Once you get a protein manufacturing process sorted out, the costs quickly fall and the margins are excellent.  Until then: ouch.

So, for DNA bits are expensive and atoms are cheap.  For proteins, bits are cheap and atoms are initially very expensive.  Who knows how much of this was clear to the founders of Codon several years ago; I have only been able to articulate these ideas myself relatively recently.  It is still very early in the development of synthetic biology as a market, and as a sector of the economy. 

Mood Hacking at The World Economic Forum

(Update: see "Revisiting Mood Hacking with Scents", 3 December 2009.)

We are all familiar with the aromas used by stores in the hopes of motivating consumer frenzy.  Walk into some establishments and you may feel as if you have been smacked with a fragrant bunch of flowers.  Or possibly a fragrant leather shoe.  Maybe this actually encourages people to spend money.  It usually just makes me sneeze.

But what if the general strategy of behavior modification via perfumes of one kind or another really does work?  At the 2008 World Economic Forum in Davos, there was an explicit attempt to influence discussions through the use of custom scents designed for the occasion.

Here is a short excerpt from "Davos Aromas Deodorize Subprime Stench, Charm Dimon, Kissinger", by A. Craig Copetas (Bloomberg News):

"I know a lot of people think this is foolish," says Toshiko Mori, chairwoman of Harvard University's architecture department and one of the WEF delegates who initiated the perfume project. "But the global economy is in dire straits and we must improve the quality of human spirits. Perfuming is a powerful tool in a much broader discourse. The fragrances will help us reach economic and political solutions at Davos."

Here is CNN's take: "Smelly Davos unveils new world odor."  Ha.

The reader might imagine a room full of national security professionals debating the merits and ethics of this "technology".  We see two camps emerge.  The first group is shocked -- shocked! -- that biochemical warfare is being brought indoors to induce in captains of industry and policy makers a mood of compromise.  The second group notes that all it took to hack the mood of Boris Yeltsin was an open bottle of vodka.  The latter strategy has, of course, been used for millennia.

Hacking the mood of an entire room full of people at once is an interesting twist, though.  What happens when someone modifies airborne rhinoviruses to express neuroactive peptides?  (See my post on iGEM 2008: "Surprise -- the Future is Here Already".)  Science fiction gave us the answer long ago.  Isaac Asimov had his characters wearing anti-viral filters in their nostrils even in his early stories.  Seems like filters with sufficiently small pores might make it hard to breathe.  And what happens if you sneeze?  "Ouch!" or "Ewww", I imagine.

Anyway, how would we even know that mood hacking was occurring?  Aside from simply noting changes in behavior, or getting, um, wind of the threat via human intelligence, we would have to measure any chemical or biological weapon directly.  But before pulling out the Tricorder and identifying a threat, we would first have to be constantly monitoring our environment in order to get a baseline of environmental signals.  So, we have already struck out.  No such monitoring is really happening.  We are just cherry picking a few things that are easy to see.  Oh, and still no Tricorder.

If the mood altering mechanism was delivered via a virus, we would have to not just monitor the number of viruses of any given species in the air, but also be sequencing all of them, all the time.  Again, we are striking out.

I have a hard time imagining that viral mood hacking threats are going to show up any time soon, but then we have no means of knowing either way.  Perhaps such things are already about.  How can you be sure you aren't part of "The Giving Plague"?

"The New Biofactories"

(Update: McKinsey seems to have pulled the whole issue from the web, which is too bad because there was a lot of good stuff in it.  The text of my contribution can be found below.)

I have a short essay in a special edition of the McKinsey Quarterly, What Matters.  My piece is waaaay back at the end of the printed volume, and all the preceding articles are well worth a look.  Other essayists include Steven Chu, Hal Varian, Nicholas Stern, Kim Stanley Robinson, Yochai Benkler, Vinod Khosla, Arianna Huffington, Joseph Nye, and many more.  Good company.

Here is the essay: "The New Biofactories" (PDF), Robert Carlson, What Matters, McKinsey & Company, 2009.

Well, that's it, then.

Finally, the book is done.  Aside from reviewing the proofs in a couple of months, and writing an afterword, it is at last out of my hands.

The title, finally, will be "Biology is Technology: The Promise, Peril, and Business of Engineering Life".  It will be in the Fall 2009 Catalog from Harvard University Press, with atoms showing up at approximately New Year's.  I'll get around to updating the web site text eventually.

My brain is presently mush.  I haven't blogged in so long I'd forgotten the user name and password for my account.  I have a couple of posts in mind that I hope to get up over the weekend.

Otherwise, I can't wait to get back to actually doing science.  What a concept.

First: sleep.  No -- second sleep.  First: beer.

Advice for Future iGEM Teams

I'm giving a short talk to the University of Washington iGEM interest group tonight based on my experience watching the competition from the beginning and as a judge for the last couple of years.

The judges are given a long list of criteria for the various medals and awards.  The list has grown longer and more involved -- if the trend holds next year I expect it to be even more complicated.  There are many more teams than judges, so each of us sees only a small fraction of the teams in person on the first day of the Jamboree.  The only way we can keep things fair (and keep the teams straight in our heads) is to follow the judging criteria very closely.  We have a checklist.

It is important to remember in what follows that my academic training is in experimental physics, and I spend most of my time today trying to build stuff out of DNA.  I don't have anything against elegant and cool models; I simply groove more on elegant and cool atoms.  I speak only for myself and not for any other of the judges or organizers.

Here is what I plan to say this evening:

  1. You need to make it easy for the judges to understand your objective and your design.
  2. Web pages can be too cool.  A rough rule of thumb: the cooler the web page is, the harder it is to understand.  A cool web page may be full of information, but as a judge it is the baud rate I care about.
  3. Fun is good.  Demonstrating actual learning is better.  Data trumps everything.
  4. In my experience, the more equations in your model, the less likely you will produce experimental data.  I find complexity as distracting in my own work as I do when I have something like 15 minutes to figure out the theoretical details of an iGEM project.  Keep it simple!
  5. Find a mentor to help tailor your story to your customers, namely the judges.  This past year the judges were a mixture of academics and industry types -- biologists, engineers, computer scientists, physicists; theorists, experimentalists, hackers.  All probably have PhDs in something or other, which means we are used to rapidly parsing stories that are packaged more like papers in Science and Nature than like facespace/mybook/twitterwikirama/whatever.  Those things may be the future of science for all I know, but your customers (the judges) don't play that game -- we are fogeys as far as you are concerned.  You have to market to us.
  6. Follow the directions!  Follow the checklist.  Make sure your DNA is to spec (e.g. meets the Biobrick(TM) standards).  Make sure it is in the Registry.  Get everything in on time.  Sometimes the organizers and judges screw up this part -- the way to resolve complaints is with reason and your own checklist.  No whinging.
  7. Here is a suggestion I made to the organizers after the last competition.  Even if they don't implement it, you should.  Everyone in the competition has completed some sort of laboratory course requiring basic experimental write-ups.  Make sure your web page has a basic lab write-up, no clicking or hunting required. You will do better if the judges don't have to spend even thirty seconds trying to figure out if you have actual data and where it might be hiding on your wiki, especially if other pages are better designed and easier to read.  If I recall from my student days, those write-ups go something like this, mostly in this order: "1. Here is what we wanted to do and why.  2. Here is what we did.  3. Model.  4. Data.  5. Conclusion."  Bonus: if it didn't work, why not?  iGEM and the BioBricks Foundation both need a failure archive.

Good luck next year!

Tamiflu-resistant Influenza Strains

(Update, 30 April 2009: I see from the server logs that this post is getting a lot of traffic today.  Please note that the contents of the post discuss the annual influenza strains in the US, not the "H1N1 Influenza A" strain, which at this time is susceptible to Tamiflu.)

The IHT is carrying a great article by Donald McNeil on the sudden emergence of antiviral resistance in this year's circulating influenza viruses.  The title says it all: "Flu in U.S. found resistant to main antiviral drug".

Virtually all the flu in the United States this season is resistant to the leading antiviral drug Tamiflu...  The problem is not yet a public health crisis because this has been a below-average flu season so far and the chief strain circulating is still susceptible to other drugs.

There are two important points in this story.  First, the resistance seems to derive from a spontaneous mutation rather than having emerged from overuse of the drug:

"It's quite shocking," said Dr. Kent Sepkowitz, director of infection control at Memorial Sloan-Kettering Cancer Center in New York. "We've never lost an antimicrobial this fast. It blew me away."

The mutation appears to have arisen in Norway, a country that the article suggests does not even use Tamiflu. Second, while the CDC is recommending that hospitals test all flu cases to find out whether patients are carrying the resistant subtype, this capability is still not widespread:

"We're a fancy hospital, and we can't even do the ... test in a timely fashion," Sepkowitz said. "I have no idea what a doctor in an unfancy office without that lab backup can do."

I haven't written very much about the flu for a couple of years, but it is clear that the threat is still quite present.

The article ends with this bit of speculation:

And while seasonal flu is relatively mild, the Tamiflu resistance could transfer onto the H5N1 bird flu circulating in Asia and Egypt, which has killed millions of birds and about 250 people since 2003. Although H5N1 has not turned into a pandemic strain, as many experts recently feared it would, it still could -- and Tamiflu resistance in that case would be a disaster.

I'm not so sure that the resistance gene "could easily transfer onto the H5N1 bird flu".  It sounds like Mr. McNeil may be giving more weight here to Henry Niman (who is quoted extensively in the article on other specific topics) than the rest of the community might.  This is not to say that such a transfer is unlikely -- this is the sort of thing that I fear we know so little about that we could make poor assumptions leading to even worse policy.  The mechanisms for recombination and reassortment of genes in the flu are still disputed in the literature.  But it's damn scary, either way, even if the probability of such a transfer is small.

In the end, if nothing else, what this demonstrates is that our technological base for both detecting and responding to infectious disease is still poorly developed.

Carl Zimmer on Synthetic Biology for Biofuels

Carl Zimmer has a nice piece in Yale Environment 360 on continued efforts to build bugs that produce fuel, "The High-Tech Search For A Cleaner Biofuel Alternative".  The article extensively quotes Steve Aldrich, President of Bio-era, on the trade-offs of using sugar cane as a source material.

Craig Venter makes an appearance arguing that the best long-term bet is to build photosynthetic bugs that use atmospheric CO2 to directly produce fuel.  Maybe.  This would require containment facilities for culturing engineered bugs, where those facilities also must capture sunlight and CO2 to feed the bugs.  The costs for this infrastructure are not insignificant, and this is exactly what is presently standing in the way of large scale algal biodiesel production.

Here is the question I keep asking in these circles: why not just use naturally occurring algae, which can be grown at extremely high yield in a wide variety of conditions, as food for bugs hacked to eat cellulose?  If there is no algae to be had, just throw in another source of cellulose or other biomass.  There would be minimal concern over growing modified organisms that might escape into the wild.  The processing of biomass into fuel would also take place under conditions that are easier to optimize and control.

I'm not suggesting this is the only answer, but rather that it appears to balance 1) the costs of infrastructure, 2) concerns over environmental release of genetically modified organisms, and 3) the need for an efficient processing infrastructure that can use a wide variety of feedstocks.