CHRISTCHURCH, May 29: AI is going underwater, with Kiwi engineers developing a smart marine drone for the aquaculture industry.
“To enable the expansion of our aquaculture sector, we need this technology,” says project leader Professor Richard Green of the University of Canterbury.
“The natural progression is to figure out how we can improve world food security. By enabling more automation of the farming we are already doing, we could expand farming without it being prohibitively expensive.”
The ocean is all swells, currents and creatures in continual three-dimensional movement: a challenging place for humans to visit and to collect data from. Artificial Intelligence (AI) has made inroads into that space with innovations like ReefCloud, which collates and analyses reef survey imagery, simplifying and speeding up reef condition reporting.
Now an AI-driven Autonomous Underwater Vehicle (AUV), or drone, in development by engineers at the University of Canterbury, may be coming to fish farms and wharves across the Tasman, with potential global applications for the technology.
New Zealand aquaculture is worth $600 million (NZ$650 million), with seafood exported to 81 countries. The focus is on the native green-lipped mussel (Perna canaliculus) and two introduced species, the Pacific oyster (Crassostrea gigas) and the king, or Chinook, salmon (Oncorhynchus tshawytscha).
Marine aquaculture is expensive, facing challenges like biofouling (the over-growth of nets and the shells of molluscs with marine life), which can affect water quality and complicate growth monitoring, particularly of shellfish. Unwelcome species that can settle on shellfish nets and ropes include the native black mussel and the potentially harmful invasive mud-worm.
“We’ve been working with a good cross section of representatives from the aquaculture industry, which has been really helpful to understand their needs and collect a lot more data,” says Green.
“Skinny mussel lines have been the hardest problem. At the moment, mussel farmers are taking a slow multi-million-dollar boat with a big crane, and lifting the lines up to try and get an estimate and grab a handful.”
An AI-driven drone could take over the job.
The onboard AI would take the AUV close to the mussel ropes or fish nets, all of which are moving in three dimensions, often in areas with fast-flowing water currents.
Cameras on the drone take dozens of photos from many angles, which are later combined into 3D images. An extendable claw is also in development to bring back samples for analysis.
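The article doesn’t detail how those photos become a 3D model, but multi-view triangulation is the standard building block of this kind of reconstruction. Here is a minimal sketch, with an invented camera calibration and invented pixel coordinates, of recovering one 3D point from two overlapping photos using OpenCV:

```python
# Illustrative only: triangulating one 3D point from two photos.
# The calibration matrix and pixel coordinates are invented values,
# not the drone's actual camera setup.
import numpy as np
import cv2

# Shared camera intrinsics (focal length 800 px, image centre at 320 x 240)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Projection matrices for two views; the second camera is shifted 10 cm sideways
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

# Pixel coordinates of the same mussel seen in each photo (2 x N arrays)
pts1 = np.array([[320.0], [240.0]])
pts2 = np.array([[280.0], [240.0]])

# Triangulate in homogeneous coordinates, then normalise
X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
X = (X_h[:3] / X_h[3]).ravel()
print(X)  # roughly [0, 0, 2]: the point sits 2 m in front of the first camera
```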
Green told Cosmos, “You could just get a couple of people in a really fast boat, throw the AUV over the side. It just scans up the mussel ropes and down, and within minutes, comes back with all the sizes, the count, and even brings back samples.”
The project is the culmination of 10 years of research.
Navigating a drone in a constantly moving environment and taking useful photos to build up 3D images is no mean feat.
Technology like the ‘Doppler velocity log’ has made this possible, says Green. The log uses the ‘Doppler Effect’: sound wave frequencies change depending on the relative motion of the wave source and the observer.
Think of an ambulance coming up behind: the pitch of the siren changes as it goes past. That’s the Doppler Effect. UC’s AUV bounces sound waves off the sea bottom, and on-board sensors pick up the change in frequency, which is directly proportional to the craft’s speed.
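As a rough worked example (the numbers below are illustrative, not the drone’s actual sonar specification), the round-trip Doppler shift scales with speed, so the speed along a beam can be recovered directly from the frequency change:

```python
# Illustrative only: turning a Doppler frequency shift into a speed estimate.
# A real Doppler velocity log uses several angled beams and their geometry.
SPEED_OF_SOUND = 1500.0  # m/s, approximate value for seawater

def speed_from_doppler(f_emitted_hz, f_received_hz):
    """Speed along the beam, from the round-trip Doppler shift.

    The echo is shifted on the way out and again on the way back, so
    delta_f = 2 * v * f0 / c, which rearranges to v = c * delta_f / (2 * f0).
    """
    delta_f = f_received_hz - f_emitted_hz
    return SPEED_OF_SOUND * delta_f / (2.0 * f_emitted_hz)

# A 600 kHz ping that comes back 400 Hz higher implies about 0.5 m/s
print(speed_from_doppler(600_000.0, 600_400.0))  # 0.5
```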
Then there are the GPS, accelerometers and gyroscopes, which tell the AUV where it is in 3D. “How much you’re changing your angle or how much you’re moving, if you’re accelerating. It helps to stabilise,” says Green.
You don’t necessarily want to be sitting there like a rock relative to the seabed.
“The sea is pushing, you’re wasting so much power, and you’ve got to track the mussel farm. But you’ve also got to consider what the water is doing, because the mussel farm and the water don’t necessarily move in the same direction,” says postgraduate student Tim Rensen, who is developing the AUV and its software for his PhD in AI and computer vision.
“We have to fuse all these sensors together with different weightings on them, depending on what we’re doing,” says Green.
All of this is co-ordinated by the ‘robot brain’, an NVIDIA processing unit, adds Rensen.
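The article doesn’t say which fusion algorithm runs on that processor, but the idea of blending sensors with different weightings can be sketched very simply; the speeds and trust weights below are hypothetical:

```python
# A deliberately simple sketch of weighted sensor fusion; the team's actual
# pipeline on the NVIDIA unit is not described in the article.
def fuse(estimates):
    """Weighted average of several independent estimates of the same quantity."""
    total_weight = sum(weight for _, weight in estimates)
    return sum(value * weight for value, weight in estimates) / total_weight

# Hypothetical forward-speed estimates (m/s) and how much each is trusted
dvl = (0.52, 0.7)   # Doppler velocity log: trusted most under water
imu = (0.61, 0.2)   # integrated accelerometer: drifts over time
gps = (0.45, 0.1)   # GPS fix: only available near the surface

print(fuse([dvl, imu, gps]))  # about 0.53 m/s
```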
The AI also allows the AUV to predict the 3D location of the mussel ropes about a second into the future, getting around the issue of constant movement.
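The article doesn’t describe the prediction model itself, so the sketch below uses the simplest possible stand-in: extrapolating a tracked rope position forward under a constant-velocity assumption.

```python
# Illustrative stand-in only: predict a rope's 3D position one second ahead
# by assuming its most recent velocity stays constant.
import numpy as np

def predict_position(positions, timestamps, lookahead_s=1.0):
    """Constant-velocity extrapolation from the last two tracked positions."""
    p_prev, p_last = np.asarray(positions[-2]), np.asarray(positions[-1])
    dt = timestamps[-1] - timestamps[-2]
    velocity = (p_last - p_prev) / dt
    return p_last + velocity * lookahead_s

# Two recent observations of a swaying rope segment (metres), 0.1 s apart
track = [(1.00, 0.20, 5.0), (1.05, 0.22, 5.0)]
times = [0.0, 0.1]
print(predict_position(track, times))  # roughly [1.55, 0.42, 5.0]
```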
“It’s severely underestimated how difficult robots in the wild truly are.”
“That’s the difference between a factory robot and a robot in the wild. A factory robot is just going through the motions on a pre-programmed path, whereas a robot in the wild has to adapt. It has to have intelligence to respond, it has to be learning, because the environment changes,” he says.
“This work all came together at the right time because we needed high-quality AI algorithms; really fast processing to make it faster and cheaper; improved camera technology; and LED lighting. It couldn’t have been done 10 years ago,” says Green.
The AI-augmented drone is in the working-prototype stage. The latest iteration is called Poseidon, after the Greek god of the sea.