The diabolical problems of the Moon which Australian universities have to overcome by 2026

Vision. Spatial awareness. Hazard avoidance. Humans – and even ants – have it all. And Australia’s proposed lunar rovers need it to mine the Moon’s airless, radiation-blasted surface.

According to the Australian Institute for Machine Learning’s Professor Tat-Jun Chin, the challenges of giving Earth-bound autonomous vehicles even a basic degree of visual perception are immense.

He points to our world-class mining industry: “After the billions of dollars thrown at the industry, where are all our autonomous cars?” he asks.

And then there’s the Moon.

“If you train a machine learning model that works on our environment and you send it to the Moon – it’s not going to work,” he adds.

Impenetrable shadows, then blinding brightness. Thousands of shades of grey. Every sight strains the eye. And no GPS.

And machines aren’t at all effective at extrapolating what they know to understand unfamiliar situations.

First, they need to “see” the environment around them accurately and reliably.

Then they need to have the ability to extract meaning from it all: situational awareness.

And all this will have to be crammed into a tiny, low-powered rover operating in extremes of temperature and radiation, with a 2.5-second communications lag every time it “phones home” to Earth. (Radio signals need about 1.3 seconds each way to cover the roughly 384,000 km between Earth and the Moon.)

“It is not yet certain if self-driving vehicles and intelligent robots on Earth will ever become a reality. Yet, using autonomous robots in space exploration and exploitation is almost certainly a necessity,” he says.

It’s a pressing problem for Australia’s Space Agency (ASA). It has resolved to apply the nation’s industrial and academic creativity towards providing “foundation services” in space. That’s the essential infrastructure necessary for everything and anything to work. Resource extraction. Materials processing. Civil construction. Manufacturing. Assembly. Transport. Maintenance.

The ASA’s Trailblazer project wants to have rovers mining and transporting water and oxygen from lunar regolith by 2026. That’s its contribution to NASA’s Artemis mission to land a woman on the Moon and build a “space cabin” at its south pole.

Regolith is the loose deposit of broken rock and dust that covers solid bedrock. It’s a word we’ll hear frequently as plans advance for the mining of the Moon.

The face on the Moon confronts Darwin’s eye dilemma

Charles Darwin’s greatest hurdle in coming to grips with his theory of evolution was the eye. Flabbergasted, he declared it was “absurd” to think it was the result of random variation and natural selection.

Then he realised all it would take was one photo-sensitive cell backed by a pigment cell. Combined, they represent the starting point for directional vision. And the more of these pairs clustered together, the greater the resolution.

Professor Chin says that giving machines raw vision is the easiest challenge – once the technology’s limitations are understood.

Even low-cost cameras can produce a crisp, clear picture of their surroundings. And simple LIDAR (light detection and ranging) devices can see through the darkest shadow and the brightest glare to generate a perpetually updating 3D “mental map”.

“Combining sources from satellite imagery, LIDAR scans and a pair of cameras, we can get an up-to-date understanding of an environment,” he says.

All three sources can be individually processed to extract meaning. And a separate step can fuse them into a layered, “big picture” overview.
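To make that concrete, here is a minimal sketch in Python of the kind of layered, confidence-weighted fusion Chin describes, assuming each source has already been processed onto a common top-down elevation grid. The grid size, weights and function names are illustrative assumptions, not the project’s actual pipeline.

```python
import numpy as np

# Fuse per-sensor elevation grids (metres) using per-cell confidence
# weights in [0, 1]. All values below are toy stand-ins.
GRID = (200, 200)  # cells covering the local worksite

def fuse_layers(layers):
    """Confidence-weighted average of (elevation, confidence) grids."""
    weighted_sum = np.zeros(GRID)
    total_weight = np.zeros(GRID)
    for elevation, confidence in layers:
        weighted_sum += elevation * confidence
        total_weight += confidence
    # NaN marks cells that no sensor has observed at all.
    return np.where(total_weight > 0,
                    weighted_sum / np.maximum(total_weight, 1e-9),
                    np.nan)

# Toy inputs: the satellite prior is coarse but complete; LIDAR and
# stereo cameras are precise but patchy (zero weight where they see nothing).
rng = np.random.default_rng(0)
satellite = (rng.random(GRID), np.full(GRID, 0.2))
lidar = (rng.random(GRID), 0.9 * (rng.random(GRID) > 0.5))
stereo = (rng.random(GRID), 0.6 * (rng.random(GRID) > 0.7))

terrain = fuse_layers([satellite, lidar, stereo])
```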

Chin says the concept has been tested in a South Australian copper mine.

“The terrain down there, it’s just ridiculous,” he says. “My guts were spilling out of my mouth … it’s just crazy bumpy.”

The technology has the earthly ability to cut mine-site survey costs and provide almost real-time updates on the safety of the pit’s walls.

What’s that got to do with space rovers?

“That kind of environment is alien,” he says.

“And the first time you go to a mine, that’s a totally alien environment. It’s very sparse. It’s very harsh. It’s not like anything you’ll find on the Earth’s surface. And we already have vehicles that are at least partially automated in such conditions.”

But giving lunar rovers “eyes” is only the beginning. The real hurdle is making what they “see” meaningful.

“How do you get meaning from pixels?” he asks. “Humans can self-localise very easily – you see a picture, and you can recognise where it is and what it is.”

“We need to give machines the same ability.”

A sense of space

“Why do we need automation? There’s a significant communication lag,” Chin says. “And the environment is very alien to humans. So you may be able to remotely establish a strong communications link. But controlling something with a 2.5 second time lag isn’t a very intuitive process.”

He says that having a “human in the loop” to micromanage every action would slow lunar mining activity to a snail’s pace. And that’s not viable.

But how do you get an autonomous scout rover to recognise an unexpected obstacle? How do you get a digger to the right spot and identify resources? How do you get a hauler to park in the right place at precisely the proper distance for the digger’s extractor arm to deposit its load?

It comes down to spatial awareness. Giving the robots a sense of position in their landscape. Without GPS. With minimal power usage. And maximum reliability.

KISS (Keep It Simple, Stupid) must apply. Within limits.

“Wheel slippage will be a thing on the Moon,” Chin says. “That means you can’t simply count their rotations to extrapolate an exact path.”

Earth-based sailors and pilots call this “dead reckoning”. Experiments have shown a mining robot following an anticipated average round trip would be up to three metres off course by the end of just one circuit.

“If you ran it for two hours, you’re going to be so inaccurate as to be totally useless,” he says. “So we need something intelligent.”
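A toy simulation makes the drift tangible. The slip magnitudes below are invented for illustration rather than measured lunar values, but the mechanism is the one Chin describes: small per-step errors in distance and heading that wheel-counting odometry never sees, compounding over the traverse.

```python
import math
import random

random.seed(1)
true_x = true_y = est_x = est_y = 0.0
heading = 0.0
step = 0.1  # metres the rover is commanded to travel each step

for i in range(2000):  # a gently curving ~200 m traverse
    heading += math.sin(i / 100) * 0.01
    # Ground truth: slip perturbs both distance and heading.
    slipped_step = step * (1 + random.gauss(0, 0.05))
    slipped_heading = heading + random.gauss(0, 0.005)
    true_x += slipped_step * math.cos(slipped_heading)
    true_y += slipped_step * math.sin(slipped_heading)
    # Odometry estimate: trusts the commanded step and heading.
    est_x += step * math.cos(heading)
    est_y += step * math.sin(heading)

drift = math.hypot(true_x - est_x, true_y - est_y)
print(f"position error after ~{2000 * step:.0f} m: {drift:.2f} m")
```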

“One approach was to use landmarks and a sequence of images to localise ourselves,” Chin says. “So we’ve trained a neural network to recognise lunar landers, and then using fiducial (fixed reference) landmarks to localise the camera. And that’s served us very well.”
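The fiducial half of that approach can be sketched with OpenCV’s ArUco markers standing in for the fixed reference marks on a lander. This is an assumption about tooling (shown with the OpenCV 4.7+ API), not a description of Chin’s actual system; the camera matrix, marker size and image file are placeholders.

```python
import cv2
import numpy as np

MARKER_SIZE = 0.30  # marker edge length in metres (assumed)
half = MARKER_SIZE / 2
# Placeholder pinhole calibration; a real rover camera would be calibrated.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

# 3D corner positions of a marker in its own frame (z = 0 plane),
# ordered to match the detector's corner ordering.
object_pts = np.array([[-half, half, 0], [half, half, 0],
                       [half, -half, 0], [-half, -half, 0]], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("rover_view.png")  # hypothetical camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
corners, ids, _ = detector.detectMarkers(gray)

if ids is not None:
    for marker_corners, marker_id in zip(corners, ids.ravel()):
        ok, rvec, tvec = cv2.solvePnP(object_pts,
                                      marker_corners.reshape(4, 2),
                                      camera_matrix, dist_coeffs)
        if ok:
            # tvec places the marker in the camera frame; inverting the
            # pose localises the camera (rover) against the surveyed mark.
            print(f"marker {marker_id}: range {np.linalg.norm(tvec):.2f} m")
```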

Eye of the beholder

“Occasionally, the rovers have to come back – either to get repaired, get recharged, or dump the regolith into the hopper,” says Chin. “When they come back, they’re going to see the landers. Then they’re going to say, ‘that left one is the recharge station, the right one is the lander’ … and work out a location relative to that.”

Image: Professor Tat-Jun Chin discusses the challenges of giving mining rovers an effective sense of position along with obstacle avoidance and navigation on the Moon’s surface at Adelaide’s Lot Fourteen in December last year. Credit: Supplied

But conditions will have changed since the last visit.

At the very least, deep wheel ruts will have been cut into the regolith by heavy hauler rovers.

Obstacle detection and avoidance may be intuitive to animals and humans. But not to machines.

“Autonomous vehicle manufacturers have been attempting to address this issue for some time now,” he says.

“Our machines can accurately measure everything – distances down to centimetres, the curvature of the road, the incline … all that. But we humans do far more than that.”

Not only do we see the car coming towards us, but we can also interpret the expression on its driver’s face – and make inferences. We see a ball bounce onto the road – and can anticipate a child running after it.

But simply sensing the ball moving against a background would overwhelm a mobile processor with hundreds of gigabytes of data.

“It’s not just about what you’ve learned in your life, but also your ancestors through evolution,” he says.

“Autonomous systems currently are very far from that.”

A lunar mining rover won’t have to interpret a traffic cop’s hand signals. But it will have to dodge rocks and avoid colliding with other rovers and infrastructure.

Chin says it will need regular satellite mapping combined with real-time camera and LIDAR vision to give each rover a sense of where it is and what’s around it. Only then can it anticipate those wheel ruts that weren’t there the last time it passed that way …
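One simple way to flag such changes is to difference the stored elevation map against the latest scan on the same grid. The sketch below assumes those maps exist and invents a 5 cm threshold to illustrate the idea; it is not a mission parameter.

```python
import numpy as np

RUT_THRESHOLD = 0.05  # metres of elevation change worth flagging (assumed)

def changed_cells(prior_map, fresh_scan):
    """Boolean mask of cells whose height moved more than the threshold."""
    valid = ~np.isnan(prior_map) & ~np.isnan(fresh_scan)
    return valid & (np.abs(fresh_scan - prior_map) > RUT_THRESHOLD)

# Toy data: a flat prior, then a fresh scan with a 10 cm-deep rut
# cut by a heavy hauler since the last traverse.
prior = np.zeros((100, 100))
fresh = prior.copy()
fresh[40:60, 10:90] -= 0.10

mask = changed_cells(prior, fresh)
print(f"{mask.sum()} cells flagged as new obstacles")
```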

The reality of perception

Lunar rovers need to know what to expect. They need to know how to react.

“We don’t have enough data on the lunar environment,” Professor Chin says. “We need to learn as we get it.”

Even the best replica lunar regolith pits on Earth are only an approximation, a “best guess” interpretation.

But machines can’t separate this from reality, and they can’t extrapolate existing algorithms to new situations very well.

“You can’t just code down every single eventuality,” Chin says. “And you need to learn from data as it’s simply too rich to process in real time.”

And that has implications for the shape, weight and power consumption of payloads being sent to the Moon.

“We’ll be effectively updating the rover models once they arrive on the Moon,” he says. “Can you take the computing power to do that? No. So we’ll have to collect all the footage, all the data, and send it back to Earth. We’ll do the computing and then upload the new versions.”
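That division of labour – inference on the rover, training back on Earth – maps onto a very ordinary machine-learning workflow. Here is a hedged sketch using PyTorch’s standard saving and loading of model weights; the network, training data and file name are all placeholders rather than anything from the Trailblazer project.

```python
import torch
import torch.nn as nn

class TerrainNet(nn.Module):
    """Stand-in terrain classifier small enough for a rover to run."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 4))

    def forward(self, x):
        return self.net(x)

# --- Earth side: retrain on downlinked rover data, then "uplink". ---
model = TerrainNet()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
features = torch.randn(256, 64)        # stand-in for downlinked footage
labels = torch.randint(0, 4, (256,))   # stand-in terrain labels
for _ in range(10):
    optimiser.zero_grad()
    loss = nn.functional.cross_entropy(model(features), labels)
    loss.backward()
    optimiser.step()
torch.save(model.state_dict(), "uplink_weights.pt")

# --- Rover side: load the new weights; no training hardware needed. ---
rover_model = TerrainNet()
rover_model.load_state_dict(torch.load("uplink_weights.pt"))
rover_model.eval()
```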

Professor Chin says the challenge appears to boil down to one over-arching requirement.

The rovers are going to have to learn on the job. And fast.

“We’re still nowhere near getting to the Moon. But before we start spending $50 million to build the rovers, let’s find the most adaptable solutions.”
