Robots on a roll
Hard-working field robots are paying their way in remote mines and city ports, with Australian-based researchers leading the way. James Mitchell Crow reports.
In a corner of the Port of Brisbane, close to the CBD, no human is allowed to tread. There are high fences, and a web of laser beams scans the perimeter. This land is inhabited by giant robots. Ten metres tall and weighing in at 65 tonnes, they thunder along at 10 metres per second on wheels almost the height of a man.
Welcome to the home of the AutoStrads: a 27-strong family of autonomous dockside straddle trucks that has lived and worked at the Patrick’s Brisbane container terminal since early 2005. These eight-wheeled giants use radar, high precision GPS and a host of other sensors to pilot themselves around the site, working 24-7 to ferry shipping containers between quayside and roadside. In contrast to your typical human baggage handler, they are gentle with the cargo, placing each container with better than two-centimetre accuracy.
AutoStrads are the babies of Hugh Durrant-Whyte, who led much of their development at the Australian Centre for Field Robotics (ACFR) based at the University of Sydney. In late 2010, Durrant-Whyte left to become head of NICTA, Australia’s centre of excellence for information and communications technology research.
The English engineer is something of a paradox. Despite heading a communications institute he has never owned a mobile phone. But then, he has never held a driver’s licence either, and that didn’t stop him teaching robots how to drive.
His work now largely concerns what takes place behind a screen, but he clearly enjoys casting his mind back to Australia’s big outdoors and field robots. As we talk about his AutoStrads, I can’t help thinking he must miss them, just a little.
In the late 1980s, Durrant-Whyte was one of the first robotics researchers to re-set their sights away from the endearing robots Isaac Asimov imagined in his I, Robot series, to no-nonsense, hard-working, un-cuddly “field robots”. His first prototype was an autonomous dockside vehicle, built in the early 1990s while he was still at the University of Oxford. “At that point it was by far the largest robot ever built,” he says.
Two decades later, humanoid robots – the research area Durrant-Whyte left behind – are still stuck in research labs. “To be honest, I’m not sure that field has progressed much since, simply because it is so hard,” he says. “If you think about where robots are having an impact, none of them are humanoid.”
But many of them are field robots: commercially successful autonomous machines such as the AutoStrad. And the country that has led much of this progress? Australia.
There’s nothing small about the Pilbara in Western Australia. It is renowned for its vast, ancient landscapes, its parched red soil dotted with dusty-green gum trees, its oasis-like waterholes – and its mineral wealth. Today, the Pilbara is also dotted with gigantic, bright yellow robots – autonomous 500-tonne iron ore hauling tip trucks that make the Toyota LandCruisers they drive past look like Tonka toys, their rooflines barely half-way up the tip-trucks’ wheels.
Durrant-Whyte became interested in Australia soon after his first forays with dockside robots while at Oxford. “If you are going to do field robotics, Australia is arguably the best place to do it. It is big and empty, and its economy primarily runs on things that sit in the back of large vehicles,” he explains. “So I moved to Australia.”
Australia’s mining industry also had the financial muscle to lift the concept off the drawing board. It’s now more than five years since the radar eyes of the Pilbara’s first robot blinked open, took in its surroundings at Rio Tinto’s West Angelas iron ore mine and then got to work – hauling ore from where it is dug up to where it is crushed, before being loaded on to trains and taken to the coast for export. “We’ve moved 200 million tonnes on the back of these autonomous trucks,” says John McGagh, head of innovation at the multinational mining giant. “By weight, that’s equivalent to about 3500 Sydney Harbour Bridges.”
Mining robots were not the first robots that Durrant-Whyte deployed in Australia. “As luck would have it, one of the first people I bumped into was Chris Corrigan, who was already running a container terminal,” he recalls. At the time, Corrigan ran Sydney-based logistics company Patrick Corporation. Durrant-Whyte was drawn to revisiting a dockside robot. “It’s always good to do something a second time, because you’ve learnt all the errors from last time. So we kicked off a program with them.”
From that meeting the AutoStrads were born.
'The challenge with a vision system is to know when it has failed.'
It’s hard to think of a better proving ground for outdoor robots than a shipping container terminal. The environment is structured, contained, relatively small and the robot’s task is to move boxes from A to B. “In theory at least it seemed like a plausible problem to solve,” Durrant-Whyte says. Working out how a machine as large as a straddle truck should move autonomously was one challenge, but the far bigger issue was sensor technology, and the “perception problem” (see box). How does a field robot detect its surroundings and understand its environment well enough to go about a task safely and effectively? Inside, lighting is controlled. But outside the sun is constantly moving, clouds come and go, shadows shift. And then there’s wind, rain, dust, or even snow and fog – all affect a robot’s view of the world.
When it came to designing the AutoStrad’s eyes, the research team were up against some tight constraints. When a 65-tonne truck loaded with a 50-tonne container is navigating the narrow confines of a busy shipping terminal, a positioning error of a few centimetres could be catastrophic. The team quickly ruled out the technologies most often used for artificial eyes: GPS and lasers. The reflective metal surfaces that fill a container terminal bounce satellite signals around and can derail GPS tracking. And lasers and optical cameras are fair-weather friends – they can’t penetrate far enough through rain. The team opted for high-frequency radar. The longer wavelengths don’t get disrupted by raindrops – and to improve positioning accuracy, passive radar reflectors were stationed around the terminal.
The AutoStrads know where these reflectors are and triangulate their own position. But no individual sensor technology is foolproof, says Durrant-Whyte. “The challenge with a vision system is to know when it has failed. Think of your own vision system, sometimes it does fail, so you need an alternative way of detecting the object that will not fail in the same way,” he says. You might not spot the approaching motorcycle as you step out to cross the road, but your ears will pick up the buzz of its engine, and so you look again. And all the while, to keep us stable and upright, our brains are cross-checking information from our eyes with inertia sensors in our ears, and the sensation in our legs – which is why striding along a moving walkway can feel so peculiar.
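The maths behind that position fix is simple enough to sketch. Below is a minimal illustration, assuming an idealised flat terminal, surveyed reflector coordinates and noise-free radar ranges – all the numbers and names are invented, and the real system is far more sophisticated:

```python
import numpy as np

def trilaterate(reflectors, ranges):
    """Estimate a 2-D position from distances to known reflector positions.

    Subtracting the first range equation from the others turns the
    non-linear range equations into a linear system A p = b.
    """
    reflectors = np.asarray(reflectors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    x0, y0 = reflectors[0]
    A = 2 * (reflectors[1:] - reflectors[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(reflectors[1:] ** 2, axis=1) - (x0 ** 2 + y0 ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Three reflectors at surveyed spots; ranges as a radar might measure them.
refs = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
truth = np.array([30.0, 40.0])
dists = [np.hypot(*(truth - r)) for r in refs]
print(trilaterate(refs, dists))  # close to [30. 40.]
```

With more than three reflectors the same least-squares step averages out measurement noise, which is one reason extra reflectors improve accuracy.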
In a similar way, the AutoStrads have multiple back-up systems to let them know where they are and what’s around them. They check speed and direction not just with radar but also by using data from sensors on their wheels, an inertia sensor tucked inside the body of the vehicle that tracks its motion, and a backup GPS sensor that pokes out of its roof.
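How those overlapping readings get combined can also be sketched. The snippet below shows inverse-variance weighting – the basic principle behind the probabilistic filters robot localisation systems typically build on – applied to invented speed readings; the noisier a sensor, the less it sways the answer:

```python
# Illustrative only: fusing independent speed estimates from different
# sensors by inverse-variance weighting. All numbers are invented.
def fuse(estimates):
    """estimates: list of (value, variance) pairs, one per sensor.
    Returns the fused value and its (smaller) variance."""
    weights = [1.0 / var for _, var in estimates]
    fused_var = 1.0 / sum(weights)
    fused_val = fused_var * sum(w * v for w, (v, _) in zip(weights, estimates))
    return fused_val, fused_var

readings = [
    (5.2, 0.04),   # radar-derived speed, m/s (most trusted)
    (5.0, 0.25),   # wheel sensors (slip in the wet)
    (5.1, 0.09),   # inertia sensor
    (4.8, 1.00),   # backup GPS (noisy among stacked containers)
]
speed, var = fuse(readings)
```

A useful property of this rule is that the fused variance is always smaller than that of any single sensor – redundancy doesn’t just guard against failure, it sharpens the estimate.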
As an additional layer of safety, they have laser-based collision detection systems on each corner and touch-sensitive bumpers that will bring them to a halt if they detect a collision. Sneaking up on an AutoStrad is all but impossible – but to be utterly certain, that system of perimeter lasers will halt all the machines if any intruder should hop the fence. AutoStrads have been working at the Port of Brisbane since 2005, and next year the company (now called Asciano) plans to introduce 44 of the bright red robots to its Sydney operations at the newly redeveloped site of Port Botany.
Why go to all this trouble to use robots rather than people? For a start, robots are safer drivers. “In the first year of automation at our Brisbane AutoStrad Terminal, we achieved a 75% reduction in safety incidents, increasing to 90% in following years,” says Asciano’s container terminals director, Alistair Field.
Autonomous vehicles are also more efficient drivers. Robots might be made of metal, but human drivers are the leadfoots. AutoStrads use 40% less fuel, and require 70% less maintenance than manually driven straddle trucks, says Durrant-Whyte.
But the main advantage of using robots, whatever industry you are in, is always the same, says McGagh. He holds degrees in engineering and economics and is clearly adept at bringing both skill sets to bear, judging by the successes of the “Mine of the Future” program his innovation team manages at Rio Tinto. “Why do you use autonomous welding in the motor vehicle industry? For quality, precision and productivity,” he says – and it’s the same thing in mining.
Using an approach first trialled on automated London Underground trains in the 1960s, each autonomous machine is in constant communication with a central control computer. No robot can move until it has sought, and received, permission. Like an automated aircraft control tower, the central computer system is aware of the position of all autonomous and non-autonomous vehicles in the mine, and directs the robots to use the most efficient and safe path.
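The ask-permission protocol can be sketched in a few lines. This is an illustrative toy, not Rio Tinto’s software – the class and names are invented – but it captures the rule that no robot moves onto a stretch of haul road another vehicle already holds:

```python
# Toy sketch of a central controller granting exclusive claims on
# road segments. Segment and vehicle names are invented.
class TrafficController:
    def __init__(self):
        self.claims = {}  # segment id -> vehicle id currently holding it

    def request(self, vehicle, segment):
        holder = self.claims.get(segment)
        if holder is None or holder == vehicle:
            self.claims[segment] = vehicle
            return True   # permission granted: the robot may move
        return False      # segment occupied: the robot must wait

    def release(self, vehicle, segment):
        if self.claims.get(segment) == vehicle:
            del self.claims[segment]

control = TrafficController()
assert control.request("truck-07", "haul-road-3")
assert not control.request("truck-12", "haul-road-3")  # must wait
control.release("truck-07", "haul-road-3")
assert control.request("truck-12", "haul-road-3")      # now clear
```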
Autonomous trucks are not the only machines chatting with the central controller in the Pilbara. They have recently been joined by some new friends – the autonomous drill rigs. These slow-moving platforms carry out the first step in the iron ore operation – drilling holes into the ore body, up to 16 metres deep. Each hole is then filled with explosives and detonated to break up the rock, dirt and ore. In the Pilbara, a million blast holes are drilled each year. And it’s not just a matter of brute force.
“The drill rigs are real scalpels, it’s high precision stuff – down to sub-centimetre positioning of the machine,” says McGagh. Using their high precision GPS, laser and optical cameras, the drill rigs carefully shuffle into position, then use tilt sensors connected to jacks to set themselves perfectly level before they begin to drill. As it drills, the robot continually analyses the rock it is cutting through, sending real-time data about the ore body back to human mine operators, who then know much more about the next batch of ore before it has even been blasted. “We get better quality data from the autonomous machines, and that’s worth a lot of money to us,” says McGagh.

The promise of added-value benefits such as this convinced the mining industry to invest in autonomous systems, Durrant-Whyte recalls. “I remember taking people from Rio Tinto to the automated container terminal in Brisbane, watching the lights go on when I explained to them that what they were seeing was an automated terminal, not a bunch of robots. Immediately they got the vision that what they wanted was an automated mine, not an automated truck. That really kicked off the work we did with Rio.”
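That self-levelling step lends itself to a short sketch too. Below is a toy proportional control loop on an invented rig model – nothing like the real drill software, but it shows the idea of nudging the jacks until the tilt sensors read level:

```python
# Toy self-levelling loop. read_tilt and adjust_jacks stand in for the
# real sensor and actuator interfaces; gains and tolerances are invented.
def level(read_tilt, adjust_jacks, tolerance=0.001, gain=0.5, max_steps=100):
    """read_tilt() -> (roll, pitch) in radians; adjust_jacks(d_roll, d_pitch)
    extends or retracts the corner jacks to apply the requested correction."""
    for _ in range(max_steps):
        roll, pitch = read_tilt()
        if abs(roll) < tolerance and abs(pitch) < tolerance:
            return True          # level: safe to start drilling
        adjust_jacks(-gain * roll, -gain * pitch)
    return False                 # give up and report a fault instead

# Simulated rig that responds perfectly to jack commands.
state = {"roll": 0.05, "pitch": -0.03}
def read_tilt():
    return state["roll"], state["pitch"]
def adjust_jacks(d_roll, d_pitch):
    state["roll"] += d_roll
    state["pitch"] += d_pitch

assert level(read_tilt, adjust_jacks)
```

Real machines add safeguards a sketch leaves out – jack travel limits, sensor cross-checks and timeouts – but the correct-then-remeasure loop is the core of it.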
It’s hard to form an emotional connection with an automated mining truck. They are brutally strong, tirelessly efficient, but not terribly charismatic.
Perhaps that’s why humanoid robots still garner so much attention, despite not being especially useful. We can’t help but try to nurture them along, as if they were clumsy toddlers, and it’s diverted a lot of time and resources away from other areas of robotics research. “I used to tell my students, I reckon anthropomorphic approaches to robotics had put back the field by a decade. And I’d now modify that to say two decades,” says Durrant-Whyte.
So what the average field robot may lack in charm, it more than makes up for in other ways – something I come to appreciate when I visit Sydney’s centre for field robotics, and see many of them in the metal.
The centre is one of the world’s biggest, Stefan Williams tells me as he shows me around. Like several of his colleagues, Williams is a “lifer” at the centre. The personable Canadian came to the country in 1998 to begin a PhD under Durrant-Whyte’s supervision. He finished in 2001 but has remained ever since, and is now a professor.
At the centre, I quickly come to appreciate there are jobs robots can now handle with aplomb, and others where automation just doesn’t do a good enough job yet. It’s coffee time, and as we walk through the workshop, everyone is huddled around the big old manually operated espresso machine, grinding the beans, frothing milk – and ignoring the shiny automatic espresso machine in the corner. Barista jobs will be safe for a good while yet, it seems.
The centre works on three types of field robot: land robots; robots that fly; and underwater robots, the subject of Williams’ research (see box). In the workshop, all three forms are on show – most with their cases opened up and their electronic entrails spilled out across the workshop bench. But they evoke little pathos; there’s a dispassionate, geeky intrigue to seeing the insides of these extremely advanced gadgets.
An intact field robot stands in a corner. Painted bright red, and with a curved outer shell, he’s called Ladybird. He looks perky, with one side of his shell raised as if in salute.
Like his insect namesake, he’s a good thing to have in your veggie patch.
Agriculture is another Australian industry that is ripe for robots. It is notoriously hard to source farm labour, especially around harvest time – particularly as labourers can earn far more in mining. The National Farmers’ Federation estimates the farm labour shortage at up to 100,000 workers, and has lobbied hard for more seasonal workers to be allowed in from overseas. In response, the Federal Government trialled, and in 2012 made permanent, a seasonal worker program that grants temporary entry to workers from the Pacific to help gather the country’s harvest.

Agriculture is not as lucrative as mining, so research funding has been tighter; nevertheless, the first agricultural robots are starting to bear fruit. Broadacre crops such as wheat or rice are relatively simple for robots to tend and harvest; it’s a case of adding autonomous systems to existing tractors and harvesters (see The robot tractor and precision farming).
The much bigger challenge – and where Ladybird comes in – is with fruit and vegetable crops. “Harvesting is where most of the labour costs are,” says Salah Sukkarieh, who leads the team developing Ladybird.
Like Williams, Sukkarieh has been at the centre since joining to start a PhD with Durrant-Whyte in the late 1990s – his PhD research helped develop the AutoStrad’s navigation system. “My big interest is optimisation and efficiency,” he tells me – and in finding real-world situations where his work can be quickly applied. The expertise he has developed in intelligent software systems for autonomous decision-making turns out to have all sorts of applications outside robotics. He’s working with Qantas on an automated system for advanced flight planning to maximise fleet efficiency.
But that’s easy compared to his work with Ladybird. Harvesting fresh produce is enormously difficult for a robot. The Ladybird has to contend with changing light, wind, rain and dust. But these are the least of a farming robot’s worries. While every shipping container is the same, every plant is unique. The fruit is always in a different spot, often hidden by leaves, and can vary greatly in size, shape and colour. The robot has to decide whether the fruit is ripe for picking, then pluck it without squashing the produce. Even the geekiest shopper would soon tire of tomatoes dented by robot fingers.
Sukkarieh is tackling one challenge at a time: first master perception; then try simple tasks such as pruning and weeding; and finally work on harvesting.
When it comes to perception, Sukkarieh’s first machines didn’t even get their wheels dirty. For the past 10 years, he has been pioneering unmanned aerial vehicles – UAVs, or drones as they are now known.
'Over the last two years the focus has been on machine perception – what type of sensors do I use, what algorithms do I have to put together?'
“Drones have suddenly become of much wider interest, but we had a major UAV program started in the ’90s,” says Durrant-Whyte. “We did programs that were significantly beyond what you often see now.” The team was the first to fly multiple autonomous UAVs in formation. Each plane was equipped with a different type of sensor, allowing them to collectively build a picture of the environment below.
The technique is now being used to map weeds and invasive animals in the northern tablelands, says Sukkarieh. As a fringe benefit, the hovering drones exploit their rotors’ downdraft to spray weeds with herbicide.
But more recently he’s been working on farm robots that keep their wheels on the ground. “Over the last two years the focus has been on machine perception – what type of sensors do I use, what are the algorithms that I have to put together?” That’s useful technology in its own right, for example for autonomously monitoring crop yield, or assessing crop health, or keeping an eye out for weeds or insect pests.
“We’re in a period where that technology can be spun off and used,” says Sukkarieh. Applications for these kinds of technologies aren’t limited to agriculture. For the past six years, Marathon Targets has been manufacturing the world’s first “smart targets” for live fire training by specialist military marksmen. With humanoid torsos and heads, but wheels instead of legs, these machines pilot themselves around training grounds while snipers take careful aim. And they react to their environment – if a robot gets hit, neighbouring robots will scatter.
Back on the farm, the next step for Sukkarieh is to combine perception with simple plant manipulation, such as spraying or pruning. Once that’s mastered, it will be time to tackle the ultimate challenge: harvesting.
Ladybird will take the kind of sensor technology Sukkarieh has developed for tree crop robots, and apply it to vegetables. But that’s just the start. The team has plans to play with other sensors, such as instruments to sniff crops to help determine whether a veggie is ripe. And the team is also working to combine data from all of Ladybird’s sensors to try to identify the tell-tale signals that will show whether a plant is sick or merely thirsty. Brute force computer power could no doubt crack the problem, but Ladybird is a mobile platform running on batteries and can’t simply guzzle electricity to solve a task – the challenge is to be as computationally efficient as possible.
Tucked under its shiny red shell, Ladybird also has a robotic arm.
“It will be to spray or even mechanically weed,” Sukkarieh says. Perhaps one day a small fleet of Ladybirds will be guided to weed or disease hotspots by a UAV or two keeping an airborne eye on the whole farm. And ultimately, these UAVs will send in the Ladybirds when there is produce to be harvested. Farmers are clearly excited by the prospect. In June the Australian Vegetable Industry’s peak body, Ausveg, named Sukkarieh researcher of the year for his work on Ladybird.
“I think Australia will maintain its leadership in this field. We have diverse industries interested in field robotics, and investing in R&D for field robotics, and it will stay like that I think.”
Rio Tinto is still investing. In 2007 it funded the formation of the Rio Tinto Centre for Mine Automation at the ACFR at Sydney. The collaboration is still going strong, and there are many projects in the works. As in agriculture, robot technology is not ready to take on every task in the mine.
The loaders that scoop up the blasted ore and dump it into the autonomous trucks are still human-operated. However carefully the ore is blasted, the result is a complex environment that demands a skilled human.
“The loader operator is using that incredible computing power called the human brain to visually look at the blasted rock pile and interact with it,” says McGagh. It will be a long time before we have the computing power and sensor technology to replace the human operator for this job, he says.
Other targets are closer to hand. Next year the company plans to introduce autonomous trains – that will weigh 32,000 tonnes, fully loaded – to bring the ore from the mines to the ports where it is loaded on to ships and exported. The company has a roadmap projecting the next several generations of its autonomous systems. But the systems will never entirely run the show, says McGagh – they will always be tools under human direction. The maintenance staff, geologists and blast planners will always be on site. “There will never be a mine without people. Never,” he says.
Durrant-Whyte has led the creation of many field robots over the years, but he is proudest of the dockside AutoStrads he developed when he first came to Australia.
“The reason for that is, it works – we walked away,” he says. “What amazes me now is, everyone in the terminal, the truckies who drive up, they don’t pay any attention at all, they just expect these massive robots to do these things. And that, I think, is the tick of success.
“For mining, the same thing will happen in the next few years. When people like me are no longer involved, you will know it’s a success.”