Grand challenges in biological engineering

Kunal Mehta
3 September 2012

I believe there are three "grand challenges" in biological engineering which must be overcome to transform it from a more or less ad-hoc discipline to a viable technological platform.
1. The Automation Problem
Most of the low-level work of biological engineering in the lab consists of moving small volumes of liquid between containers. Right now, nearly all of this is done by hand, more or less one sample at a time. Companies can and do invest in automated systems that handle some of these tasks, but the systems are expensive and not robust (at one company I've visited, for example, each liquid handler has a networked digital camera trained on it so that an operator can constantly monitor whether it is working properly). Academic labs don't have even this basic functionality. We need to reach a level of sophistication that allows an engineer to write a "program" like this:
Start 100 cultures of cells. Once the number of cells in each culture reaches a certain level, add varying amounts of chemicals X, Y, and Z. While the cultures grow, measure parameters A, B, and C at regular intervals. When the cultures have stopped growing, stop and process them.
A system like this would enable an engineer to spend his time on the design of a project, rather than the execution of experiments.
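To make the idea concrete, here is a minimal sketch in Python of what such a program might look like. Everything about the interface is invented: the platform object and its methods do not correspond to any real product; they only illustrate the level of abstraction an engineer should be able to work at.

    # A hypothetical high-level lab-automation "program". The `platform`
    # object and all of its methods are invented for illustration; no such
    # standard API exists today.
    def run_experiment(platform):
        # Start 100 cultures of cells.
        cultures = [platform.start_culture() for _ in range(100)]

        # Once each culture reaches a target density, add varying amounts
        # of chemicals X, Y, and Z (here, a simple gradient across cultures).
        for i, culture in enumerate(cultures):
            platform.wait_until(culture, min_density=0.6)
            platform.add(culture, chemical="X", amount=1.0 + 0.1 * i)
            platform.add(culture, chemical="Y", amount=2.0)
            platform.add(culture, chemical="Z", amount=0.5 * i)

        # While the cultures grow, measure parameters A, B, and C at
        # regular intervals.
        platform.measure_periodically(cultures, ["A", "B", "C"], interval_min=30)

        # When the cultures have stopped growing, stop and process them.
        platform.wait_until_stationary(cultures)
        platform.stop_and_process(cultures)

The point is not this particular syntax but that the engineer specifies intent at this level and the hardware handles the execution.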
2. The Measurement Problem
Engineering depends on measurement: we need to know what is going on inside a system in order to improve it. In biology, this means being able to profile the performance of an engineered strain across many "sectors" of its physiology, especially metabolism and transcription. We can do at least one piece of this right now: microarrays, for example, allow transcription to be profiled across the entire genome. Metabolic profiling can be done by 13C flux analysis, but that is a technology that hasn't changed in decades. The ideal biological measurement tool could take one of two forms: a "metabolic microarray" that allows profiling across the entire metabolic network, or a "biological voltmeter" that allows random-access measurement of a specific reaction or pathway.
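To make the "biological voltmeter" idea concrete, here is a hypothetical interface sketch in Python. The class and method names are invented; no such instrument exists, which is exactly the problem.

    # A hypothetical "biological voltmeter": point it at one reaction or
    # pathway in a living culture and read back its current activity, the
    # way a voltmeter reads one node of a circuit. Entirely invented.
    class BiologicalVoltmeter:
        def __init__(self, culture):
            self.culture = culture

        def read_flux(self, reaction_id):
            """Instantaneous flux (mmol/gDW/h) through a single reaction."""
            raise NotImplementedError("no such instrument exists yet")

        def read_pathway(self, reaction_ids):
            """Fluxes through every reaction in a pathway of interest."""
            return {r: self.read_flux(r) for r in reaction_ids}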
3. The Prediction Problem
The standard "shotgun" approach to strain engineering consists of multiple rounds of random mutagenesis (usually targeted to specific genes or parts of the genome) and screening for the desired behavior. This is in contrast to rational or "forward" engineering, where constructions are built up from parts whose individual functions are (reasonably well) known in advance. Mutagenesis and screening does work, and the mutagenesis in particular is much easier thanks to recent advances in DNA biochemistry. But it is cumbersome. The real advance would be a computational model that could predict the performance of a design in advance, and thus focus the engineering along a rational path. This model would need to (1) incorporate all sectors of physiology, including metabolism and signaling, and (2) make accurate quantitative predictions of useful parameters. There's reason to be optimistic: airplanes are some of the most complex engineered products we currently build, and we have models that can predict flight performance of designs to within 10% of the actual values. When Boeing designed the 777 (which was launched in 1995), CAD tools could design individual parts so precisely that engineers did not need to build expensive mock-ups to test the fit, as had been done for every aircraft till then. No model is comprehensive: even airplanes and computer chips need to be tested. But in those fields, relatively speaking, the point at which testing is really required -- when the designs become too complicated for the models to predict how they will behave -- comes far later than in biology.
These three challenges are not independent: having good models depends on being able to collect good data, and collecting that data in non-galactic amounts of time will almost certainly require robust automation.