Research reveals how we could design robots to think like bees
Honey bees have to balance effort, risk and reward, making rapid and accurate assessments of which flowers are most likely to offer food for their hive. Research published today in the journal eLife reveals how millions of years of evolution have engineered honey bees to make fast decisions and reduce risk.
The study enhances our understanding of insect brains, how our own brains evolved, and how to design better robots.
The paper presents a model of decision-making in bees and outlines the paths in their brains that enable fast decision-making. The study was led by Professor Andrew Barron from Macquarie University in Sydney, and Dr HaDi MaBouDi, Neville Dearden and Professor James Marshall from the University of Sheffield.
“Decision-making is at the core of cognition,” says Professor Barron. “It’s the result of an evaluation of possible outcomes, and animal lives are full of decisions. A honey bee has a brain smaller than a sesame seed. And yet she can make decisions faster and more accurately than we can. A robot programmed to do a bee’s job would need the backup of a supercomputer.
“Today’s autonomous robots largely work with the support of remote computing,” Professor Barron continues. “Drones are relatively brainless; they have to be in wireless communication with a data centre. This technology path will never allow a drone to truly explore Mars solo – NASA’s amazing rovers on Mars have travelled about 75 kilometres in years of exploration.”
Bees need to work quickly and efficiently, finding nectar and returning it to the hive, while avoiding predators. They need to make decisions. Which flower will have nectar? While they’re flying, they’re vulnerable only to aerial attack. When they land to feed, they’re vulnerable to spiders and other predators, some of which use camouflage to look like flowers.
“We trained 20 bees to recognise five different coloured ‘flower disks’. Blue flowers always had sugar syrup,” says Dr MaBouDi. “Green flowers always had quinine [tonic water] with a bitter taste for bees. Other colours sometimes had glucose.”
“Then we introduced each bee to a ‘garden’ where the ‘flowers’ just had distilled water. We filmed each bee then watched more than 40 hours of video, tracking the path of the bees and timing how long it took them to make a decision.
“If the bees were confident that a flower would have food, then they quickly decided to land on it, taking an average of 0.6 seconds,” says Dr MaBouDi. “If they were confident that a flower would not have food, they made a decision just as quickly.”
If they were unsure, then they took much more time – on average 1.4 seconds – and the time reflected the probability that a flower had food.
The team then built a computer model from first principles, aiming to replicate the bees’ decision-making process. They found that the structure of their computer model looked very similar to the physical layout of a bee brain.
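The pattern described above – fast decisions when evidence is strong, slow ones when it is ambiguous – is the hallmark of sequential-sampling (evidence-accumulation) models of decision-making. The sketch below is a generic toy of that idea, not the model from the eLife paper: noisy evidence accumulates toward an “accept” or “reject” threshold, and all parameter names and values (`drift_scale`, `threshold`, the noise level) are illustrative assumptions.

```python
import random

def decide(p_reward, threshold=1.0, drift_scale=2.0, dt=0.1, max_time=5.0):
    """Toy evidence-accumulation decision sketch (illustrative only).

    Evidence drifts toward 'accept' when the reward likelihood is high
    and toward 'reject' when it is low; noisy samples accumulate until
    a threshold is crossed. Stronger evidence crosses sooner, so clear
    flowers get fast decisions and ambiguous ones get slow decisions.
    """
    evidence, t = 0.0, 0.0
    drift = drift_scale * (p_reward - 0.5)  # signed evidence strength
    while t < max_time:
        evidence += drift * dt + random.gauss(0, 0.05)  # drift + noise
        t += dt
        if evidence >= threshold:
            return "accept", t
        if evidence <= -threshold:
            return "reject", t
    return "no decision", t  # ambiguous evidence may never cross
```

With `p_reward` near 1 or 0 the drift is strong and a decision arrives in a few steps; near 0.5 the drift vanishes and the walk wanders, lengthening decision times – qualitatively mirroring the 0.6-second versus 1.4-second pattern the bees showed.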
“Our study has demonstrated complex autonomous decision-making with minimal neural circuitry,” says Professor Marshall. “Now that we know how bees make such smart decisions, we are studying how they are so fast at gathering and sampling information. We think bees are using their flight movements to enhance their visual system to make them better at detecting the best flowers.”
AI researchers can learn much from insects and other ‘simple’ animals. Millions of years of evolution have led to incredibly efficient brains with very low power requirements. The future of AI in industry will be inspired by biology, says Professor Marshall, who co-founded Opteran, a company that reverse-engineers insect brain algorithms to enable machines to move autonomously, like nature.
For Macquarie University:
Annette Adamsas, Annette.firstname.lastname@example.org, +61 417 489 903
Niall Byrne, email@example.com, +61 417 131 977
Cairbre Sugrue, Sugrue Communications, firstname.lastname@example.org
Bees are available to film at Macquarie University and the University of Sheffield, video overlay and graphics available.
Journal paper DOI: https://doi.org/10.7554/eLife.86176
(Backup) prepub version of paper at DOI: https://doi.org/10.1101/2023.01.02.522517