It takes only a few minutes for the stumbling animated figures displayed on Torsten Reil's computer screen to learn how to walk. But by the time one has gained a confident gait, it will have gone through many generations of evolution, writes Steve Farrar.
The team led by the Oxford University zoologist has devised a new approach to animation that combines artificial nervous systems and evolution to produce realistic, interactive motion.
The spin-off company founded by Mr Reil and colleagues Colm Massey and David Raubenheimer - NaturalMotion - is developing and commercialising the approach for use in the computer games, film and visual-simulation industries.
The technology emerged from research into the neuro-control of human and animal locomotion. It involves software that creates a physical simulation of a body based on biomechanical data.
The body's movements are controlled by an artificial nervous system of interconnected neurons. The configuration of this neural network is not determined in advance but randomly generated in a host of different combinations.
Each body attempts to walk according to physical parameters such as the dimensions and mass distribution of the character to be animated, but only a few manage more than a single footstep before falling.
The most successful are then selected to produce the next generation, with each offspring differing from its parent by a few random mutations. The whole process is repeated until one neural network evolves the ability to keep the body on its feet.
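The loop described above - random neural controllers, a physics-based fitness test, selection of the best, and mutated offspring - can be sketched in miniature. The example below is an illustrative toy only: it evolves a tiny two-neuron controller to balance a simulated pendulum rather than a full biomechanical body, and the network size, physics and mutation settings are assumptions, not NaturalMotion's actual system.

```python
import math
import random

random.seed(0)

def controller(weights, angle, velocity):
    """A tiny fixed-topology neural net: two inputs, two tanh neurons, one torque output."""
    h1 = math.tanh(weights[0] * angle + weights[1] * velocity + weights[2])
    h2 = math.tanh(weights[3] * angle + weights[4] * velocity + weights[5])
    return weights[6] * h1 + weights[7] * h2

def fitness(weights, steps=200, dt=0.02):
    """Toy stand-in for the walking test: count time steps the pendulum stays upright."""
    angle, velocity = 0.1, 0.0
    for t in range(steps):
        torque = controller(weights, angle, velocity)
        velocity += (math.sin(angle) * 9.8 + torque) * dt  # gravity plus control torque
        angle += velocity * dt
        if abs(angle) > math.pi / 2:  # the body has fallen over
            return t
    return steps

def evolve(pop_size=30, generations=40, mutation=0.2):
    """Select the fittest controllers, breed mutated offspring, repeat."""
    # Start from randomly generated networks, as in the article.
    pop = [[random.gauss(0, 1) for _ in range(8)] for _ in range(pop_size)]
    history = []
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        history.append(fitness(pop[0]))
        parents = pop[:pop_size // 5]  # the most successful fifth survive
        # Each offspring differs from its parent by small random mutations.
        pop = [[w + random.gauss(0, mutation) for w in random.choice(parents)]
               for _ in range(pop_size)]
        pop[:len(parents)] = parents  # elitism: keep the parents unchanged
    return max(pop, key=fitness), history

best, history = evolve()
```

Because the best controllers are carried over unchanged each generation, the top fitness score can only rise or hold steady as the process repeats - the same ratchet that eventually yields a network able to keep the body on its feet.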
Mr Reil said: "We end up with a fully simulated, interactive human or animal that appears very natural. What you see on the screen is not just a computer graphic of the character, it is the character."
The active-character technology is also being applied to other, less challenging movements that a figure might be asked to perform.
It will be unveiled at Siggraph 2002 - the world's premier computer graphics meeting - in Texas in July.