LIVING creatures took millions of years to evolve from amphibians to four-legged mammals - with larger, more complex brains to match. Now an evolving robot has performed a similar trick in hours, thanks to a software "brain" that automatically grows in size and complexity as its physical body develops.
Existing robots cannot usually cope with physical changes - the addition of a sensor or new type of limb, say - without a complete redesign of their control software, which can be time-consuming and expensive.
So artificial intelligence engineer Christopher MacLeod and his colleagues at the Robert Gordon University in Aberdeen, UK, created a robot that adapts to such changes by mimicking biological evolution. "If we want to make really complex humanoid robots with ever more sensors and more complex behaviours, it is critical that they are able to grow in complexity over time - just like biological creatures did," he says.
As animals evolved, the addition of small groups of neurons on top of existing neural structures is thought to have allowed brain complexity to increase steadily, he says, keeping pace with the development of new limbs and senses. In the same way, MacLeod's robot's brain assigns new clusters of "neurons" to adapt to new additions to its body.
The robot is controlled by a neural network - software that mimics the brain's learning process. This comprises a set of interconnected processing nodes that can be trained to produce desired actions. For example, if the goal is to remain balanced and the robot's sensors report that it is tipping over, it will move its limbs in an attempt to right itself. Such actions are shaped by adjusting the importance, or weighting, of the input signals to each node. Certain combinations of these sensor inputs cause a node to fire a signal - to drive a motor, for example. If this action works, the combination is kept. If it fails and the robot falls over, the weightings are adjusted and a different combination is tried next time.
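In rough terms, one such node might be sketched as below. This is only an illustration of the weighted-input, threshold-firing idea: the class name, threshold value and weight-adjustment rule are assumptions, not details of MacLeod's actual controller.

```python
import random

class Node:
    """A minimal sketch of one processing node: weighted sensor inputs
    that make the node fire once they cross a threshold."""

    def __init__(self, n_inputs, threshold=0.5):
        # Start with random weightings for each sensor input.
        self.weights = [random.uniform(-1.0, 1.0) for _ in range(n_inputs)]
        self.threshold = threshold

    def fire(self, sensor_inputs):
        # Sum the weighted sensor inputs; fire (output 1) above the threshold.
        activation = sum(w * x for w, x in zip(self.weights, sensor_inputs))
        return 1 if activation > self.threshold else 0

    def adjust(self, amount=0.1):
        # If the last action failed, nudge the weightings and try again.
        self.weights = [w + random.uniform(-amount, amount) for w in self.weights]
```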
Finding the best combinations is not easy - so roboticists often use an evolutionary algorithm (EA) to "evolve" the optimal control system. The EA randomly creates large numbers of control "genomes" for the robot. These behaviour patterns are tested in training sessions, and the most successful genomes are "bred" together to create still better versions - until the best control system is arrived at.
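A bare-bones version of that loop might look like the sketch below. The population size, selection scheme and the `random_genome`, `fitness` and `breed` functions are placeholders, not the team's actual settings.

```python
import random

def evolve(random_genome, fitness, breed, population_size=50, generations=100):
    """Bare-bones evolutionary loop: create random control genomes, test them,
    and breed the most successful together to make still better versions."""
    population = [random_genome() for _ in range(population_size)]
    for _ in range(generations):
        # Rank genomes by how well they performed in a training session.
        ranked = sorted(population, key=fitness, reverse=True)
        survivors = ranked[:population_size // 2]
        # Breed pairs of successful genomes to refill the population.
        children = [breed(random.choice(survivors), random.choice(survivors))
                    for _ in range(population_size - len(survivors))]
        population = survivors + children
    return max(population, key=fitness)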
MacLeod's team took this idea a step further, however, and developed an incremental evolutionary algorithm (IEA) capable of adding new parts to its robot brain over time.
The team started with a simple robot the size of a paperback book, with two rotatable pegs for legs that could be turned by motors through 180 degrees. They then gave the robot's six-neuron control system its primary command - to travel as far as possible in 1000 seconds. The software then set to work evolving the fastest form of locomotion to fulfil this task.
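The primary command translates directly into a fitness score for the evolutionary algorithm - roughly along these lines, where `simulate` is a purely hypothetical stand-in for a training run on the robot:

```python
def fitness(genome, simulate, duration=1000):
    # Score a genome by how far the robot travels under its control
    # within the 1000-second budget set by the primary command.
    distance_travelled = simulate(genome, duration)
    return distance_travelled
```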
"It fell over mostly, in a puppyish kind of way," says MacLeod. "But then it started moving forward and not falling over straight away - and then it got better and better until it could eventually hop along the bench like a mudskipper."
When the IEA finds that further evolution is no longer improving the robot's speed, it freezes the neural network it has evolved, preventing it from changing any further. That network knows how to work the peg legs - and it will continue to do so.
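In outline, that freezing step might work something like this; the stagnation test, the `patience` counter and the `frozen` flag are assumptions made for illustration rather than details of the published algorithm.

```python
def evolve_until_stagnant(network, evolve_step, fitness, patience=20):
    """Keep evolving a network until its fitness stops improving,
    then freeze it so later evolution cannot disturb what it has learned."""
    best = fitness(network)
    stale = 0
    while stale < patience:
        candidate = evolve_step(network)
        score = fitness(candidate)
        if score > best:
            network, best, stale = candidate, score, 0
        else:
            stale += 1
    network.frozen = True  # the peg-leg controller is locked in and kept as-is
    return network
```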
At this point, it is just like any other evolved robot: it would be unable to cope with the addition of knee-like joints, say, or more legs. But unlike conventional EAs, the IEA is sensitive to a sudden inability to live up to its primary command. So when the team fixed jointed legs to their robot's pegs, the software "realised" that it had to learn how to walk all over again. To do this, it automatically assigned itself fresh neurons to learn how to control the new legs.
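A sketch of that incremental step, under the assumption that the brain is kept as a list of separately evolved modules, might look like this (the function names and sizes are invented for illustration):

```python
import random

def add_limb_controller(frozen_modules, evolve_module, n_new_neurons=6, n_inputs=4):
    """When performance collapses after a body change, bolt a fresh, trainable
    cluster of neurons onto the frozen ones instead of re-evolving the whole brain."""
    # Fresh, randomly weighted neurons for the new jointed legs.
    new_module = [[random.uniform(-1.0, 1.0) for _ in range(n_inputs)]
                  for _ in range(n_new_neurons)]
    # Only the new module is evolved; the frozen peg-leg "hip" network is
    # left untouched and keeps doing its job.
    trained_module = evolve_module(new_module, frozen_modules)
    return frozen_modules + [trained_module]
```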
As the IEA runs again, the leg below the "knee" is initially wobbly, but the existing peg-leg "hip" is already trained. "So it flops about, but with more purpose to it," says MacLeod. "Eventually the knee joint works and the robot evolves a salamander-like motion."