Navigation on the Moon: Australia's big opportunity

Australia leading the way back to the Moon

Flying to the Moon isn’t a matter of plugging the coordinates into Google Maps and letting GPS satellites guide the way, and neither is moving across the lunar surface. The fleet of autonomous vehicles NASA hopes to send into orbit and set bouncing about the Moon’s surface needs an extraordinary amount of situational awareness to succeed.

A toppled lander isn’t an option, says artificial intelligence and robotics engineer Xavier Orr. A satellite-sized crater in the regolith is even further down the list of desired outcomes.

That’s why Orr’s Sydney-based Advanced Navigation has inked a deal with US space systems business Intuitive Machines to build position-finding equipment for three landers and two relay satellites by 2025.

One sensor uses a light detection altimetry and velocimetry (LiDAV) system developed by the ANU Gravitational Research Laboratory. Its lasers can pinpoint a vehicle’s velocity and position during autonomous lunar landing procedures. It can also operate in the dark and dusty conditions likely to be found in the Moon’s craters.

The second is an inertial navigation system, the Boreas X90. A partnership with RMIT University’s Integrated Photonics and Applications Centre resulted in digitising this well-established “blind flying” technique.

Now Advanced Navigation has tailored both systems to appeal to the space industry. They’re lighter, less bulky, energy efficient, and more precise. Put together, that’s expected to represent $85 million worth of extra capacity Intuitive Machines can dedicate to paying cargo.

First Australian on the Moon

Advanced Navigation is a leading contender for the status of Australia’s first company to put a product on the Moon. But it’s up against fierce competition in the remote robotics and sensor-processing fields.

Both the LiDAV and Boreas technologies are ready to go. A test rig will be carried into space later this year. The next big challenge will be participating in a demonstration mission to the Moon in 2024. It will plot the course for NASA’s Commercial Lunar Payload Services, part of the Artemis exploration program.

At about the same time, Intuitive Machines will use the technology to guide three landers and two communications relay satellites. By 2025, the Nova-D cargo system is expected to regularly deliver 500–1000 kilogram payloads to the Moon’s surface when and where needed.

“It’s imperative that our large lunar payload customers are confident that our systems will deliver the cargo safely and reliably,” says Intuitive Machines Chief Technology Officer Dr Tim Crain. “If we can demonstrate Advanced Navigation’s technology on our current Nova-C landers, we can significantly improve the robustness of landing with Nova-D.”

The technology is opening up new frontiers of exploration.

“They are well suited for our Micro-Nova, a mini-extreme mobility lunar vehicle also known as a ‘hopper’,” says Crain. “Mass on the hopper is at a premium, but we require sensors that can help us fly to permanently shadowed craters and through lava tubes. We look forward to discovering more of the lunar surface with Advanced Navigation.”

Smaller, lighter, better

Established in 2012, Advanced Navigation supplies its technology to the likes of Airbus, Boeing, Google, Apple and General Motors. Orr says that’s why his company can deliver the LiDAV and Boreas systems in such a short timeframe – despite pandemic-induced global supply chain disruptions.

“We’re quite fortunate that we already have a photonics division and manufacturing line,” he says. “And the Boreas X90 shares 98% of its parts with the non-space rated D90 version. That’s at a point where it’s in volume production now, so we have a backlog of parts.”

The LiDAV unit also shares about 70% of its components with the Boreas.

“They’re both laser interference products,” says Orr. “Both have a laser source, optical chips and measurement apparatus. Even on the processing side, it’s all a lot the same. We’re quite lucky with that crossover.”

Smaller, lighter equipment offering better resolution is in demand for aircraft automatic landing and take-off systems. It also has a place in autonomous vehicles – whether in the air, on the ground or under the sea – not to mention uses in weather detection and geophysical modelling. But space is the ultimate frontier.

“The easy part was making it suitable for vacuum,” says Orr. “That mainly meant making sure you don’t have metals with different expansion and contraction rates producing unwanted bends and cracks.”

The big challenge was radiation: “We had some experts give us some guidance on that one; how not to use too much heavy shielding.” 

It meant pulling everything apart and reconfiguring it in ways that turned non-sensitive passive pieces into radiation buffers around sensitive active components.

“We’ve got a bit of testing to go,” Orr says. “We’ve done a lot already.”

Launching LiDAV

It’s like LiDAR. Just better. And, Orr says, it represents a significant improvement on NASA’s own guidance technology.

“NASA’s system provides information including distance to the ground and velocity relative to that surface,” he says. “Our system gives you that and tonnes of extra information as well.”

The LiDAV unit provides angular velocities – your turn rate relative to the surface. It also senses the density and mineral properties of the ground.

“This is all really important stuff for landers when they’re coming down in an unknown location: will that surface support the leg or not? Where is a suitable alternate landing site? The LiDAV system will tell you all of that.”

A laser emits a beam flickering to a coded sequence. Variations in and between different parts of that sequence carry specific details that can be extracted and interpreted by onboard algorithms. 
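
The exact coding scheme is part of the ANU-developed technology, but the general principle of pulling range and velocity out of a modulated laser return can be sketched in a few lines of Python. This is an illustrative example only, assuming a pseudorandom transmit code and an arbitrary telecom-band laser wavelength: cross-correlating the echo against the transmitted code gives the round-trip time (and so altitude), while the Doppler shift of the reflected light gives line-of-sight velocity.

```python
import numpy as np

C = 299_792_458.0        # speed of light, m/s
WAVELENGTH = 1.55e-6     # assumed illustrative laser wavelength, m

def range_from_coded_return(tx_code: np.ndarray, rx_signal: np.ndarray,
                            sample_rate: float) -> float:
    """Estimate distance to the surface by cross-correlating the received
    signal with the transmitted pseudorandom code: the lag with the strongest
    correlation gives the round-trip travel time."""
    corr = np.correlate(rx_signal, tx_code, mode="full")
    lag = corr.argmax() - (len(tx_code) - 1)   # delay in samples
    round_trip_time = lag / sample_rate        # seconds
    return 0.5 * C * round_trip_time           # one-way distance, m

def velocity_from_doppler(freq_shift_hz: float) -> float:
    """Line-of-sight velocity from the Doppler shift of the reflected beam
    (factor of 2 because the light travels to the surface and back)."""
    return 0.5 * WAVELENGTH * freq_shift_hz

# Example: a 10 MHz Doppler shift corresponds to about 7.75 m/s along the beam.
print(velocity_from_doppler(10e6))
```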

“This is the technology developed at the ANU over the past 15 years,” Orr says. “We’ve acquired that patent.” 

NASA’s system works differently and requires more equipment. “So it’s going to be very hard for them to shrink it down much further,” he explains. “They’ll get some gains. But they won’t be anywhere near as much as the ones we can still achieve.”

At the moment, the LiDAV weighs about 3.5kg and is the size of your average refillable water bottle. It mainly involves sensors bundled via optical fibres to an optical chip. Orr says that will all be reduced to a chip the size of a matchbox by 2025.

“That’s doing it all on silicon,” he says. “It’s a bit of a process getting there, though.”

But the resulting reduction in size, weight – and cost – will be significant.

“We won’t have to join everything together with fibre optics. Having all those different parts is where the cost of manufacture is.”

Boreas rising

External forces are everywhere. Even in the vacuum of space.

“Your velocity is always changing,” says Orr. “There’s always going to be some large body producing a gravitational pull. And utilising that gravitational pull allows slingshot manoeuvres when sending a payload to the Moon.”

Inertial navigation isn’t new. Take an accelerometer and a gyroscope. The accelerometer detects the influence of gravity and the nudge of manoeuvring thrusters. The gyroscope senses angular velocities.

“From that, you can calculate how much you’ve turned, what direction you’re pointing in, and where you’re going.”

And it all operates without needing external input, such as visual star chart references. What makes Boreas different is that it is the first (and so far only) digital fibre optic gyroscope in the world.
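
The dead-reckoning arithmetic Orr describes is straightforward, even if building sensors accurate enough to make it useful is not. A minimal planar sketch, assuming ideal noise-free sensors, shows the idea: integrate the gyroscope’s turn rate to track heading, rotate the accelerometer readings into a fixed frame, then integrate twice more to get velocity and position.

```python
import numpy as np

def dead_reckon(gyro_z, accel_body, dt, heading0=0.0):
    """Minimal planar strapdown integration: gyro_z holds yaw rates (rad/s),
    accel_body is an (N, 2) array of body-frame accelerations (m/s^2).
    Returns the track of positions. Real systems work in 3D, compensate for
    gravity, and correct the drift that raw integration accumulates."""
    heading = heading0
    vel = np.zeros(2)
    pos = np.zeros(2)
    track = []
    for w, a_body in zip(gyro_z, accel_body):
        heading += w * dt                                    # integrate turn rate
        c, s = np.cos(heading), np.sin(heading)
        a_world = np.array([c * a_body[0] - s * a_body[1],   # rotate acceleration
                            s * a_body[0] + c * a_body[1]])  # into the fixed frame
        vel += a_world * dt                                  # integrate to velocity
        pos += vel * dt                                      # integrate to position
        track.append(pos.copy())
    return np.array(track)

# Example: constant forward thrust while turning gently traces an arc.
n = 200
path = dead_reckon(np.full(n, 0.05), np.tile([0.2, 0.0], (n, 1)), dt=0.1)
print(path[-1])
```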

This is where Orr’s own research kicked in.

“Normally, inertial navigation systems and a lot of sensor technologies will use something called the Kalman filter,” he says. “That’s an algorithm that can take sensor readings with errors and fuse them with other sensors to give you an error-free result.

“I developed a new digital version of that algorithm based on AI, using a convolutional neural network instead of the traditional old analogue technology from the 1960s. You get about ten times the performance.”
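
Advanced Navigation’s neural-network replacement is proprietary, but the classical Kalman filter it improves on can be illustrated with a toy one-dimensional example: predict the state forward, then correct it with each noisy measurement, weighted by how much the filter trusts that measurement.

```python
import numpy as np

def kalman_1d(measurements, process_var=1e-3, meas_var=0.5):
    """Classical 1D Kalman filter: each step predicts the estimate forward
    (here, a static model) and then corrects it with a noisy measurement,
    weighted by the Kalman gain. Illustrative only; a navigation-grade filter
    fuses many sensors over a multi-dimensional state."""
    x, p = 0.0, 1.0                      # state estimate and its variance
    estimates = []
    for z in measurements:
        p = p + process_var              # predict: uncertainty grows
        k = p / (p + meas_var)           # gain: trust in the new measurement
        x = x + k * (z - x)              # correct the estimate
        p = (1 - k) * p                  # uncertainty shrinks after the update
        estimates.append(x)
    return np.array(estimates)

# Example: noisy readings of a true value of 1.0 converge towards 1.0.
rng = np.random.default_rng(0)
noisy = 1.0 + rng.normal(0.0, 0.7, size=100)
print(kalman_1d(noisy)[-1])
```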

Going digital, as is often the case, also allows the system to be a lot smaller.

“It’s about 40% smaller and lighter than competing units.”
