A chip off Mother Nature's own hard drive

November 9, 2006

In using DNA to build nanocomputers, synthetic biologists are posing questions about the very notion of computation and of life itself, says Martyn Amos.

Possibly the most unusual reviewing assignment I have ever accepted came in 2002, when Guinness World Records asked me to help validate a claim made by a group of Israeli scientists to have built the "world's smallest computer". What made this machine radically different was not just its incredibly miniaturised state but its basic construction material. Rather than piecing together transistors on a silicon surface, Ehud Shapiro and his team at the Weizmann Institute had fabricated their device out of the very stuff of life itself - DNA.

Three trillion copies of their machine could fit into a single teardrop. This miracle of miniaturisation was achieved not through traditional technology but through a breakthrough in the emerging field of molecular computing. The team used strands of DNA to fuel these nanomachines, their latent energy freed by enzymatic "spark plugs". These were not computers in any traditional sense. Their computational capabilities were rudimentary and, rather than using the familiar zeros and ones of binary code, their "software" was written in the vocabulary of the genes - strings of As, Gs, Cs and Ts.

A primary motivation for shrinking traditional computer chips is to extract the maximum amount of computational power from a tiny space. By placing ever smaller features on the silicon real estate of processors, chip manufacturers such as Intel try to keep in step with Moore's law - that computer power roughly doubles every 18 months.

Shapiro's computer was never going to win any prizes for mathematical muscle. All it could do was analyse a sequence of letters and determine whether or not it contained an even number of a specific character.
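In computational terms, that task is an even-parity check, which a two-state finite automaton can perform. Here is a minimal Python sketch of the same logic (the state names and input alphabet are illustrative, not Shapiro's actual molecular encoding):

```python
def even_count_automaton(sequence, symbol="a"):
    """Two-state finite automaton: accept iff `symbol`
    occurs an even number of times in `sequence`."""
    state = "even"  # start state, and the sole accepting state
    for ch in sequence:
        if ch == symbol:
            # flip state on every occurrence of the target symbol
            state = "odd" if state == "even" else "even"
    return state == "even"
```

The molecular version encoded states and transitions as DNA fragments and executed each transition with a restriction enzyme, but the underlying logic is exactly this simple.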

Nevertheless, it represented the state of the art in a scientific field that had been in practical existence for less than a decade.

In 1994, Len Adleman (one of the co-inventors of the main internet encryption scheme, and the man who gave a name to what we now know as computer viruses) stunned the computing world by demonstrating the feasibility of performing computations using molecules of DNA.

Rather than representing information as electronic bits inside a silicon chip, Adleman showed how to solve a problem using data encoded as sequences of bases on DNA molecules. One of his motivations lay in the storage capacity of DNA; nature has data compression down to a fine art. Every living cell in your body contains a copy of your unique three-billion-base genome, the data equivalent of 200 copies of the Manhattan telephone directory. Adleman wanted to use the nature of chemical reactions to perform massively parallel computations on this molecular memory.

Each test tube in such an experiment could contain trillions of individual DNA strands, and each molecule could encode a possible answer to a particular problem. The idea was to exploit the fact that enzymes and other biological tools act on every strand in a tube at the same time, quickly weeding out bad solutions and giving the potential for parallel processing on a previously unimagined scale.
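That generate-and-filter strategy can be mimicked in ordinary software. The sketch below brute-forces a path search on a toy directed graph in the spirit of Adleman's 1994 experiment, with a list comprehension standing in for the enzymatic filtering that, in the test tube, is applied to every strand simultaneously (the graph is a made-up example, not Adleman's original instance):

```python
from itertools import permutations

# Toy directed graph given as a set of edges (a made-up example).
edges = {(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)}
n = 4  # number of vertices

# Step 1, "generate": every candidate ordering of the vertices,
# analogous to the random soup of DNA strands.
candidates = permutations(range(n))

# Step 2, "filter": discard any ordering with a missing edge, the way
# each laboratory step culls all failing strands in parallel.
hamiltonian = [p for p in candidates
               if all((p[i], p[i + 1]) in edges for i in range(n - 1))]

print(hamiltonian)  # orderings that traverse the graph edge by edge
```

In the chemical version, step 1 comes almost for free - random ligation generates the candidate strands in parallel - and each filtering pass is a single laboratory operation regardless of how many trillions of strands it touches.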

Adleman's initial paper led to the emergence of a fully fledged field. A rash of papers appeared, describing proposals to use DNA to crack government encryption schemes or to build real "wet" memories more capacious than the human brain. After this flurry of untamed optimism - when some seriously thought that molecular machines could give traditional computers a run for their money - DNA computing matured into a more thoughtful discipline. Scientists no longer talk seriously about taking on silicon machines and are instead seeking niche markets for their molecular machines, areas such as medical diagnostics and drug delivery, where traditional devices and methods are too large, invasive or prone to error.

Shapiro's simple computer was one example of such an application; a small step towards eventual "on-site" diagnosis and treatment of diseases such as cancer. A later version of his machine was capable (in a test tube, at least) of identifying the molecules that signal the presence of prostate cancer and then releasing a therapeutic molecule to kill the malevolent cells. Shapiro and his team have spoken about their aim of creating a "doctor in a cell", a reprogrammed human cell that could roam the body sniffing out and destroying disease. As physicist Richard Jones explains in his book Soft Machines, the Fantastic Voyage scenario of humans in a miniaturised submarine is "quite preposterous", but that does not rule out serious work into trying to engineer existing living systems to act as "medibots" able to detect and control disease at its source.

A growing band of experts is slowly coming together to form a vanguard at the frontiers of science, where boundaries between biology, chemistry, engineering and computing become fluid and ever-changing. This is the new world of synthetic biology. "We want to do for biology what Intel does for electronics," says George Church, professor of genetics at Harvard University. Tom Knight of the Massachusetts Institute of Technology is even more blunt: "Biology is the nanotechnology that works."

DNA is so much more than an incredibly compact data storage medium. As physicist Richard Feynman explained: "Biology is not simply writing information; it is doing something about it." Floating inside its natural environment - the cell - DNA carries meaning, used to generate signals, make decisions, switch things on and off, like a program that controls its own execution. DNA, and the cellular machinery that operates on it, is the original reprogrammable computer, predating our efforts by billions of years. By re-engineering the code of life, we may finally be able to take full advantage of the biological "wetware" that evolution has refined over that vast span of time.

We are dismantling living organisms and rebuilding them - this time according to a preplanned design. It is the ultimate scrapheap challenge.

As pioneers such as Alan Turing and John von Neumann discovered, there are direct parallels between the operation of computers and the gurglings of living "stuff" - molecules and cells. Of course, the operation of organic bio-logic is more noisy, messy and complex than the relatively clear-cut execution of computer instructions. But rather than shying away from this complexity, a new generation of synthetic biologists is seeking to harness the diversity of behaviour that nature offers instead of trying to control or eliminate it. By building devices that use this richness of behaviour at their very core, we are ushering in a new era in practical devices and applications, and in how we view the very notion of computation and of life itself.

The questions that drive this research include the following: Does nature "compute", and if so, how? What does it mean to say that a bacterium is "computing"? Can we rewrite living cells' genetic programs to make them do our bidding? How can humanity benefit from this potentially revolutionary new technology? What are the dangers? Could building computers with living components put us at risk from our own creations? What are the ethical implications of tinkering with nature's circuits? How do we (indeed, should we) reprogram the logic of life?

The dominant science of the new millennium may well prove to be at the intersection of biology and computing. As biologist Roger Brent argues: "I think synthetic biology will be as important to the 21st century as (the) ability to manipulate bits was to the 20th." This is not tinkering around the edges, it is blue-skies research - the sort of high-risk work that could change the world or crash and burn. I took a huge risk in the 1990s when I gambled on DNA computing as the topic of my PhD research - a field with a literature base, at the time, of a single article.

It is exhilarating stuff, and it has the potential to change for ever our definition of a "computer". But most researchers are wary of promising too much, preferring to combine quiet optimism with grounded realism. As researcher Drew Endy explains: "It'll be cool if we can pull it off. We might fail completely. But at least we're trying."

Martyn Amos is a senior lecturer in computing, Manchester Metropolitan University. His book Genesis Machines is published this month by Atlantic Books. He is speaking at the ICA on November 14. Details: www.ica.org.uk
