Bees get their ‘fix’ in virtual reality
How does a small insect, with its tiny eyes and brain, see the world it moves through? Obviously an insect can’t answer our question directly, so scientists have developed other means to uncover the answers. One approach is to show the insect visual cues in virtual reality and measure how it responds. If the measured response is used to update the scene the insect sees, the experiment is said to be closed-loop, enabling investigations into how the insect moves through a controlled environment. Yet a problem can arise in the measurement step: being tiny, an insect only needs to produce tiny forces and movements to propel itself around. These small actions can be difficult to record accurately, particularly as experimental sensors are often repurposed from other, less demanding fields. As in any experiment, the accuracy of the results is limited by the accuracy with which the original measurements are made. We wondered whether inaccurate sensor measurements might also lead an insect to change its behaviour during a closed-loop experiment.
To address this question we conducted a study in which honeybees, which are known to be exceptionally capable of learning to complete challenging tasks, were tethered above a ball they could control with their legs. The movement of the ball, measured by a feedback sensor, in turn controlled the position of a vertical green bar displayed on an LED panel in front of the bees. This was a closed-loop experiment in which the bees could interact with this restricted visual environment. Left to their own devices, the bees would position the green bar to their front, a behaviour called fixation.
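The closed-loop idea can be sketched as a simple simulation. This is a minimal illustration, not the authors’ actual apparatus: all function names, the one-dimensional world, and the perfect-sensor assumption are hypothetical, and real setups involve ball tracking, display hardware, and noise.

```python
import random

def read_sensor(true_turn, noise_sd=0.0):
    """Return a (possibly noisy) measurement of the insect's turn, in degrees."""
    return true_turn + random.gauss(0.0, noise_sd)

def run_closed_loop(turns, bar_angle=90.0, noise_sd=0.0, seed=0):
    """Simulate a closed-loop trial: each measured turn shifts the bar in the
    opposite direction, as if the insect were rotating relative to a fixed bar.
    Returns the bar's angular position (0 = straight ahead) after each step."""
    random.seed(seed)
    trajectory = []
    for turn in turns:
        measured = read_sensor(turn, noise_sd)   # measurement step
        bar_angle = (bar_angle - measured) % 360.0  # update the displayed scene
        trajectory.append(bar_angle)
    return trajectory

# With perfect sensing, a net 90-degree turn brings a bar that started
# 90 degrees to the side around to the front (angle 0).
path = run_closed_loop([30.0, 30.0, 30.0], bar_angle=90.0)
```

The key point of the sketch is the loop: what the insect sees next depends on what the sensor just measured, so any systematic sensor error feeds directly back into the visual world the insect experiences.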
When we compared the behaviour of bees across experiments that differed only in the feedback sensor used to measure the motion of the ball, we found that the bees also changed aspects of their behaviour. Specifically, a behavioural change occurred when sensors repurposed from consumer computer mice were used, as compared with a purpose-designed computer vision system or with naturally walking bees. We identified that the bees took advantage of a measurement glitch in the consumer sensors: by walking faster, bees reduced these sensors’ sensitivity to their turning motions, effectively making the bar they were interested in easier to control. This was due to an inherent flaw in these sensing devices, but not one that human users would usually notice. Indeed, we were surprised that bees were not only able to perceive a fault that humans generally cannot, but were able to take advantage of it as well.
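A toy model may help convey the glitch. The attenuation formula below is a hypothetical stand-in, not the measured response of any real mouse sensor: it simply assumes that reported rotation shrinks as walking speed grows, which is the qualitative effect described above.

```python
def reported_rotation(true_rotation, walking_speed, attenuation=0.1):
    """Hypothetical flawed sensor: the gain applied to rotation falls off
    as translation (walking) speed increases, so fast walkers' turns are
    under-reported."""
    gain = 1.0 / (1.0 + attenuation * walking_speed)
    return gain * true_rotation

# The same 10-degree turn registers as smaller when the bee walks faster,
# so the bar it controls drifts less and is easier to hold in front.
slow = reported_rotation(10.0, walking_speed=1.0)
fast = reported_rotation(10.0, walking_speed=10.0)
```

In a closed loop, the under-reported turns translate directly into a more stable bar, which is presumably what made faster walking rewarding for the bees.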
Essentially, we found that bees ‘gamed the system’: by changing their behaviour, the bees were able to enjoy more time with the green bar positioned to their front. Bees, and insects in general, are often thought to have fairly limited cognitive abilities that do not allow them to adapt the mechanisms they use to control their flight and walking. This is an intriguing finding because, in asking how small insects with tiny eyes and brains are able to process complex visual information, we found that their capabilities exceeded our expectations. Ultimately, such studies of how insects use their limited computational capacity to efficiently process sensory information will inform design strategies for artificial systems operating under similar constraints, such as robots, and may provide fundamental insight into the mechanisms underlying our own information processing abilities.
Adapted from a media release by Darius Koreis at the University of Queensland.
Insects modify their behaviour depending on the feedback sensor used when walking on a trackball in virtual reality.
Taylor GJ, Paulk AC, Pearson TW, Moore RJ, Stacey JA, Ball D, van Swinderen B, Srinivasan MV
J Exp Biol. 2015 Oct