What’s all the buzz about? Bee-inspired robots are coming, and they aren’t just for pollinating plants.
A number of researchers are investigating the development of artificial robotic bees, others are developing artificial bee brains and bee-based algorithms, and meanwhile a standard three-dimensional atlas of the honey bee brain has been developed. Artificial insect minds may be the first true Artificial General Intelligences (AGIs) available for commercial use.
The recently announced Green Brain Project, a collaboration between the Universities of Sheffield and Sussex in the United Kingdom, is building models of the systems in the brain that govern a honey bee’s vision and sense of smell. The project’s aim is to create the first flying robot able to sense and act as autonomously as an insect. If successful, the project will demonstrate that an artificial brain can perform complex tasks in an open environment as well as the brain of an animal, and the resulting hardware and software should be useful in a wide range of possibly quite surprising applications.
Tasks the Green Brain robot will be expected to perform include finding the source of particular scents (actually chemical vapors or gases; see below) in the same way that a bee can identify particular flowers. The robot will be expected to operate in real time in the real world; that is, it will be a working example of artificial embodied cognition. The Green Brain project is partially supported with hardware donated by NVIDIA Corporation, a leading developer of parallel GPUs (Graphics Processing Units), advanced chips originally developed to accelerate video games but increasingly used for supercomputing as well. See http://www.hpcwire.com/hpcwire/2012-08-09/nvidia_s_supercomputing_kepler_gpu_lands_in_workstations.html for more information on the latest advances in GPU-based computing systems from NVIDIA. [Editor’s note: there are recent rumors of a new mystery chip from NVIDIA as well, the so-called Project Boulder.]
Meanwhile, researchers at the Institut für Biologie-Neurobiologie Berlin, Mercury Computer Systems GmbH, SRI International, and the Zuse-Institut Berlin have developed a three-dimensional average-shape atlas of the honeybee brain. This atlas could allow the development of a full robotic insect capable of imitating the full social capabilities of an insect operating in its natural environment. That would open the door to entirely new ways of studying insects: a robotic insect could enter the hive, participate in its activity, and transmit information back to human scientists in real time. Robotic insects could also help endangered biological insect colonies locate food sources more efficiently, or offer enhanced protection against predators, disease, and other threats, for example rescuing specific bee colonies in danger of collapse.
An important point to understand here is that the honey bee brain is a general-purpose bio-information processor capable of complex spatial perception and of recognizing visual and olfactory stimuli in open environments. The bee brain performs real-world pattern recognition and categorization in the presence of noise and interfering or competing signals, and it provides goal-based guidance and control of a winged flying body. Moreover, bees operate in groups, demonstrating collective goal-based behaviors such as directed and optimal search through social communication and “swarm” organization.
It has been known for some time that bees perform fairly complex computational tasks, and recently it was shown that bees can outperform the best existing computational methods at some of these tasks, e.g. solving the traveling salesman problem. The widespread availability of bee-based AGI would allow such problems to be solved the same way, providing better solutions for numerous problems in routing, path planning, and resource allocation. As it turns out, insect minds are just about the right scale for modern computers to model properly. This is an advantage over other connectome-based approaches (e.g. the full human or feline connectome projects), which require massive supercomputing resources not generally commercially available.
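To give a flavor of the route-optimization problem involved, here is a minimal sketch of a classic 2-opt improvement heuristic applied to a tour of flower locations. This is a generic textbook heuristic chosen for illustration, not the bees’ actual strategy or any of the researchers’ models, and the flower coordinates are invented.

```python
import itertools
import math

def tour_length(points, order):
    """Total length of a closed tour visiting points in the given order."""
    return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def two_opt(points, order):
    """Repeatedly reverse tour segments while doing so shortens the tour."""
    order = list(order)
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(order)), 2):
            candidate = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
            if tour_length(points, candidate) < tour_length(points, order):
                order, improved = candidate, True
    return order

# Hypothetical flower locations; the heuristic never lengthens the tour.
flowers = [(0, 0), (5, 1), (1, 4), (6, 5), (2, 2)]
route = two_opt(flowers, range(len(flowers)))
```

Like a bee readjusting its trapline after discovering a new feeding site, the heuristic makes local changes and keeps any that shorten the overall circuit.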
The idea of using insect-based models as a general computing paradigm isn’t new. For example, back in 1993 I gave a presentation on this subject at the 3rd Annual Meckler Virtual Reality Conference in San Jose, California. The presentation was entitled “Dynamic Agents in Virtual Reality,” and we demonstrated a simple swarm optimization system based on a very abstract “bee brain.” Our system used a “potential field” model to produce an interactive and dynamic 3D virtual world from a very simple set of programming constructs. Most relevant here, we showed that with this field model and a simple inter-agent communication scheme (modeling the so-called “dance of the honey bee”), well-understood search strategies emerged on their own: the virtual colony allocated more bees to resource-rich areas of the world and exploited them in a “depth-first” fashion, which, as it turns out, is what real bees do.
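The emergent-allocation effect can be sketched in a few lines. The toy model below is my own illustration, not the original 1993 system: each round, a forager either sticks with its current source or follows a “dance” advertising a source, with follow probability proportional to source quality, so richer patches end up with proportionally more foragers.

```python
import random

def recruit(sources, n_bees, rounds=200, seed=1):
    """Toy waggle-dance recruitment. `sources` maps source name to quality.
    Each round, every bee keeps its current source with probability 0.7,
    otherwise it follows a dance chosen with probability proportional to
    the advertised source's quality."""
    rng = random.Random(seed)
    names = list(sources)
    qualities = [sources[name] for name in names]
    bees = [rng.choice(names) for _ in range(n_bees)]
    for _ in range(rounds):
        bees = [b if rng.random() < 0.7
                else rng.choices(names, weights=qualities)[0]
                for b in bees]
    return {name: bees.count(name) for name in names}

# A patch three times as rich attracts roughly three times the foragers.
counts = recruit({"clover": 1.0, "lavender": 3.0}, n_bees=100)
```

Even this crude scheme reproduces the qualitative behavior described above: allocation of effort tracks resource richness without any central coordinator.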
The neuronal coding the bee brain uses to represent odors has been largely mapped and understood. This means it should be possible to train bee brains, or to redesign or modify this neuronal system in silico, to perform similar tasks: recognizing chemical compounds of interest, but also possibly recognizing people, finding objects, and so on. Robotic bees will be “programmed” to pollinate specific plants at industrial scales. And yes, these will also make great spybots; it might even be possible to develop a bee robot specifically modified to identify and track an individual by scent. Bees generate spatial maps of odors in their brains, and specific patterns of activity correspond to different substances. Bee robots will be useful for locating and tracking environmental polluters, clandestine drug labs, and much more. Insect bots could even be armed with poison stings or aerosol-based chemical agents.
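The pattern-matching idea behind odor recognition can be illustrated with a toy classifier. The “glomerulus” activity vectors and odor names below are entirely invented for illustration, not the real bee code: an unknown activity pattern is simply labeled with the stored template it most resembles.

```python
import math

# Hypothetical activity patterns across five glomeruli (invented values).
TEMPLATES = {
    "rose":    [0.9, 0.1, 0.4, 0.0, 0.2],
    "clover":  [0.1, 0.8, 0.2, 0.7, 0.0],
    "solvent": [0.0, 0.2, 0.9, 0.1, 0.8],
}

def cosine(a, b):
    """Cosine similarity between two activity vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def classify(pattern):
    """Label a measured pattern with the most similar stored odor template."""
    return max(TEMPLATES, key=lambda name: cosine(pattern, TEMPLATES[name]))

label = classify([0.8, 0.2, 0.3, 0.1, 0.2])   # a noisy, rose-like sample
```

The point is only that substances correspond to distinguishable spatial patterns of activity, so recognition tolerates noise as long as the measured pattern stays closer to the right template than to any other.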
Another interesting bee-inspired project is based at Harvard’s School of Engineering and Applied Sciences. The RoboBees project, being developed with Centeye Corporation, promises a robotic insect for applications such as:
- autonomously pollinating a field of crops;
- search and rescue (e.g., in the aftermath of a natural disaster);
- hazardous environment exploration;
- military surveillance;
- high resolution weather and climate mapping; and
- traffic monitoring.
Perhaps most interesting is the fabrication method of the Harvard “MoBees,” which enables mass production of winged microrobots.
Beyond this, the RoboBees project is developing a comprehensive and producible architecture for a winged micro-robotic device.
The Honey Bee Standard Brain could be used to develop a programmable neuromorphic processing unit with a variety of applications beyond robotic insects. Such devices have already been demonstrated to run, for example, 10,000 times faster than biological neurons, and therefore have a wide range of applications in large-scale data mining and in general machine learning and machine vision. Programmable or trainable bee brains could be used to pilot vehicles such as aircraft or cars. Your future PC or mobile device might include a bee-inspired processing unit of this sort.
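For a sense of the basic building block such a neuromorphic unit simulates, here is a minimal leaky integrate-and-fire neuron. The parameters are illustrative only and are not drawn from the bee brain model.

```python
def lif_spikes(inputs, tau=10.0, threshold=1.0, dt=1.0):
    """Leaky integrate-and-fire neuron: integrate an input current trace,
    emit a spike (True) when the membrane potential crosses threshold,
    then reset the potential to zero."""
    v, spikes = 0.0, []
    for i in inputs:
        v += dt * (-v / tau + i)      # leaky integration step
        if v >= threshold:
            spikes.append(True)
            v = 0.0                   # reset after spiking
        else:
            spikes.append(False)
    return spikes

train = lif_spikes([0.3] * 20)        # constant drive gives regular spiking
```

A hardware implementation evaluates many such update rules in parallel at electronic speeds, which is where the large speedups over biological neurons come from.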
Art Credit: Queen Wasp, Naoto Hattori http://naotohattori.com/prints/queen_wasp.html
Need more of the buzz? See below.
The ABC Algorithm
Mathieu Lihoreau, Lars Chittka, and Nigel E. Raine, “Travel Optimization by Foraging Bumblebees through Readjustments of Traplines after Discovery of New Feeding Locations”
NMR Imaging of the honeybee brain
The Virtual Atlas of the Honeybee Brain
Harvard RoboBees Project