
Fancy VR fact-finds fly flight

There it is again. Zzzzzzzz. That pesky fly. It’s the only one in the room and it will not settle long enough for successful deployment of the swatter. Zzzzzzz.

A tethered apple fly (Rhagoletis pomonella) responding to airflow and odour stimuli in a VR arena. Credit: Shoot for Science: Deepak Kakara, Dinesh Yadav, Sukanya Olkar, and Parijat Sil.

The only choice is to go for a mid-air hit – a wrist-powered surface-to-air swipe. Except… well, have you tried to swat a fly on the wing? Their navigational perception is extraordinary – in a room full of obstacles and with a determined human swishing at them madly they manage to flap, stall, spin or dive to safety again and again.

Now imagine that you’re a scientist trying to observe and understand the behaviour of flies on the wing in the natural world, where they’re flying for food, shelter, sex. What a nightmare.

So rather than attempt the (nearly) impossible task of following flies around, scientists at the National Centre for Biological Sciences in India brought them indoors, into their own special world.

The researchers used virtual reality (VR) tools to create a complex, naturalistic 3D environment for insects to navigate. They showed that insects can respond to three-dimensional objects in a virtual world and also use odour and wind cues to make choices.

Shannon Olsson, leader of the study published in the journal PNAS, had long wondered how tiny insects are so good at finding things that are several kilometres away – a mosquito seeking out a human, for instance.

“With the VR that we built we can begin to get at this question – of what it is that causes them to make certain choices,” says Olsson.

The use of VR to study insect behaviour isn’t new: for several decades, scientists have been presenting insects with simple sensory cues, such as moving stripes, that simulate a sense of motion.

“Stripes are a very neat, simple structure to understand the physiology or mechanism of vision. But they don’t help us understand behaviour,” explains Pavan Kumar Kaushik, the VR fly world’s chief architect.

In this world, a tethered fly is presented with a natural setting including sky, grass and trees. The insect responds by beating its wings to move as it would in the real world. The screen moves in response to the wingbeats, giving the fly an illusion of actually navigating the world. It’s the fly’s version of a video game. Mosquitoes, apple flies and other insects sought out objects of their interest in the VR world, just as they would in nature.
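The closed loop can be pictured with a short sketch. The following Python snippet is purely illustrative and not the authors’ code: the function names (read_wingbeat, render_scene), the gains, and the mapping from wingbeat asymmetry to turning are all assumptions standing in for whatever the real rig does.

# Minimal sketch of a closed-loop insect VR update (illustrative only; the
# sensor, renderer and gains are assumptions, not the study's implementation).
import math

def read_wingbeat():
    """Stand-in for a wingbeat sensor: returns (left, right) amplitudes."""
    return 1.0, 0.9  # placeholder values

def render_scene(x, y, heading):
    """Stand-in for the renderer that redraws sky, grass and trees."""
    print(f"pos=({x:.2f}, {y:.2f}) heading={math.degrees(heading):.1f} deg")

DT = 1 / 60          # update interval (s), assuming a 60 Hz display
TURN_GAIN = 2.0      # rad/s per unit left-right amplitude difference (assumed)
SPEED_GAIN = 0.5     # m/s per unit total amplitude (assumed)

x = y = heading = 0.0
for _ in range(3):   # a few frames for illustration
    left, right = read_wingbeat()
    heading += TURN_GAIN * (left - right) * DT   # wing asymmetry -> turning
    speed = SPEED_GAIN * (left + right)          # total beat -> forward thrust
    x += speed * math.cos(heading) * DT          # move through the virtual world
    y += speed * math.sin(heading) * DT
    render_scene(x, y, heading)                  # the screen follows the fly

In words: the fly steers, the world redraws, and the illusion of flight holds.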

The final touches of realism were airflow to mimic wind and enticing puffs of odour. These were by far the most challenging aspects to recreate, says Kaushik, who spent more than three years perfecting them: “When you have visual systems you can simply look to see if something is wrong.”

Flies can distinguish the size and distance of virtual objects amidst a complex background. Credit: Shoot for Science: Deepak Kakara, Dinesh Yadav, Sukanya Olkar, and Parijat Sil.

But for the other cues, he had to use the insects’ behaviour to troubleshoot and correct the system so that it mimicked reality as closely as possible.

“What is really interesting about [this] setup is that it attempts to bridge the gap between traditional VR – simple shapes, simple environments – and the natural world of insects,” says cognitive neuroscientist Vivek Jayaraman, from the Howard Hughes Medical Institute in Maryland, US.

The researchers chose the apple fly Rhagoletis pomonella for their behavioural studies.

“It’s a specialist – it only likes apple trees – so we don’t have to worry about whether we’re giving it the right stimuli,” explains Olsson. 

Multiple studies over decades in apple orchards and elsewhere have uncovered the shape of trees the apple fly favours, and the kinds of fruit and smells it likes, giving Kaushik and Olsson some ground truth on which to base the VR.

“[The VR arena] is pretty well-controlled, so they don’t lose the experimental advantages of VR,” says Jayaraman, who got a first-hand look at the set-up on a visit to the lab.

“What they gain is the ability to glean insights into some of the trickier aspects of insect navigation – long-range localisation of an odour source, visual algorithms that the insects use to make decisions about whether or not to approach something, and how olfactory, wind and visual cues are combined to get to a food source.”

First, the researchers measured the distance beyond which the apple fly no longer travels to an apple tree. In one of the most fascinating experiments, they show that flies can use a phenomenon called motion parallax to perceive the depth of an object against a complex background.

They present the fly with two trees that look equal-sized, but as the fly moves closer, one tree expands in size much faster, since it is actually twice as close. By choosing to manoeuvre toward this tree instead of the farther one, the flies show that they can discern depth from motion and use this information to locate food sources.
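To make the geometry concrete, here is a back-of-the-envelope sketch in Python. The tree sizes, distances and step size are invented for illustration and are not taken from the study; the point is only that two trees which start out looking the same size loom at different rates when one is twice as close.

# Illustrative check of the motion-parallax cue (assumed geometry, not the
# paper's numbers): equal apparent size at the start, but the nearer tree
# looms roughly twice as fast as the fly advances.
import math

def angular_size(radius, distance):
    return 2 * math.atan(radius / distance)

near = dict(radius=1.0, distance=10.0)   # half the distance, half the radius
far = dict(radius=2.0, distance=20.0)    # so both subtend the same angle

step = 1.0  # metres the fly advances per "frame" (assumed)
for name, tree in (("near", near), ("far", far)):
    before = angular_size(tree["radius"], tree["distance"])
    after = angular_size(tree["radius"], tree["distance"] - step)
    print(f"{name} tree grows by {math.degrees(after - before):.2f} deg")
# The near tree's angular size grows about twice as fast -- the depth cue
# the flies appear to exploit.

That difference in expansion rate only exists once the fly is moving, which is why the cue is called motion parallax.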

Airflow cues are also important to help the flies orient themselves, particularly in the absence of visual cues. This is similar to our reliance on sounds for orientation when we can’t see an object, such as when we’re trying to locate a buzzing mobile phone that’s hidden from sight.

But when they’re orienting to smell, the flies need to have a world that they can “see” – if they don’t also have visual and wind cues, they can’t locate the odours they detect.

“Earlier, there was no way to really isolate these cues. Our VR allows us to do that and show how they use these cues in combination,” says Olsson.

The findings demonstrate that flying insects integrate multiple types of sensory cues to locate and navigate toward virtual objects in a complex 3D landscape. 

According to the authors, the findings could inform ecological models, robotics and search algorithms, with applications that include optimal strategies for pest control, crop pollination and disease-vector management.
