
The Google car: Driving hands-free


Cars that drive themselves are safer than those driven by humans, researchers say, but are still a long way from becoming a practical reality. Tim Dean reports on progress.


Image: Google

1. Radar: Front- and rear-mounted radars detect other vehicles and measure their speeds. This helps determine whether the Google car should speed up or slow down. Radar has the added advantage that it works perfectly well in rain and fog.

2. Orientation sensor: The orientation sensor helps the computer keep track of the car’s motion, balance and orientation, such as whether the car is going uphill or is on a sloping road.

3. Optical camera: As lidar cannot see colour, an optical camera picks up visual details such as traffic lights and the writing on road signs.

4. Position sensor: A sensor in the wheel tracks the car’s motion and feeds that data to the main computer. This, along with GPS, is used to pinpoint its position.

5. Lidar: The Velodyne HDL-64E lidar sports 64 infrared lasers and rotates at a rate of between five and 15 times per second. It paints a picture of the world around the car, identifying objects and enabling the car to determine its own position in the environment.
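The geometry behind the lidar's picture-painting can be sketched in a few lines. Each laser return is just a distance plus two beam angles, which simple trigonometry turns into a 3-D point around the car; the sketch below is illustrative only, not Velodyne's actual processing.

```python
import math

def lidar_point(range_m, azimuth_deg, elevation_deg):
    """Convert one laser return (range plus beam angles) into an
    x, y, z point relative to the sensor, via spherical geometry."""
    az = math.radians(azimuth_deg)    # horizontal rotation of the spinning head
    el = math.radians(elevation_deg)  # fixed tilt of this particular laser
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# A return 10 m away, straight ahead, from a level laser:
print(lidar_point(10.0, 0.0, 0.0))  # → (10.0, 0.0, 0.0)
```

With 64 lasers firing as the head spins up to 15 times a second, millions of such points per second accumulate into the detailed 3-D picture the car navigates by.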

SUE settled in for the long drive home. It had been an exhausting week but now she was looking forward to the relaxing commute across town. The car pulled smoothly away from the kerb as she slid her tablet out of her bag. Soon the scenery was whizzing by. Her only regret was that she would arrive home before she finished reading the news.

In many ways driving is an ideal job for computers. Unlike us they never blink, tire or have their attention distracted. With a slew of sophisticated sensors and a powerful enough processor, they can navigate even the busiest city streets with digital precision.

In most autonomous cars, including Google’s self-driving cars (pictured), the sensor that does most of the heavy lifting is the roof-mounted lidar – the word’s a fusion of “light” and “radar” – which uses multiple infrared laser beams to paint a detailed picture of the world 70 metres around the car in every direction.

Lidar has the virtue of high resolution and works just as well at night as during the day. Its main shortcoming is that it struggles in rain and fog – but then again so do we humans. The other problem with lidar is the cost. “With a street price of $70,000, that’s more than the car’s worth in one sensor,” says Hugh Durrant-Whyte, CEO of the information and communications research body, NICTA.

However, companies such as Velodyne, which makes the lidar for Google’s prototype autonomous cars, are working on reducing the cost and improving performance, particularly in bad weather.

Just as our eyesight is complemented by other senses, so too is the lidar aided by a full suite of sensors: GPS for navigation, optical cameras for spotting traffic lights and street signs, radar for detecting cars ahead, and orientation sensors so the car knows precisely how it is moving. The information is synthesised by the main computer, which predicts the movements of any objects in range and calculates the safest path through them – all the while sticking to the speed limit and obeying the road rules.
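The predict-then-plan step can be illustrated with a toy sketch. This is a deliberately simplified constant-velocity model with made-up names and numbers, not Google's software: given an object's radar-measured position and speed, the computer extrapolates where it will be over the next few seconds and checks whether the planned path keeps a safe margin from it.

```python
def predict_position(pos, velocity, seconds):
    """Constant-velocity prediction: where will a tracked object be?"""
    return (pos[0] + velocity[0] * seconds,
            pos[1] + velocity[1] * seconds)

def path_is_clear(own_path, obstacle, velocity, horizon=3.0, margin=2.0, step=0.5):
    """At each future time step, check whether any point of our planned
    path comes within `margin` metres of the obstacle's predicted position."""
    t = 0.0
    while t <= horizon:
        ox, oy = predict_position(obstacle, velocity, t)
        for px, py in own_path:
            if ((px - ox) ** 2 + (py - oy) ** 2) ** 0.5 < margin:
                return False
        t += step
    return True

# Our planned path runs straight ahead; another car 20 m ahead
# is closing at 5 m/s, so within the 3-second horizon it intrudes:
path = [(0.0, float(d)) for d in range(0, 15)]
print(path_is_clear(path, (0.0, 20.0), (0.0, -5.0)))  # → False
```

A real planner fuses many objects, uncertainty estimates and road rules, but the core loop – predict every object forward, test the candidate path, pick the safest option – has this shape.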

THE MINUTE Sue got home, she was ambushed. “Hi mum! Can I drive to band practice? Please?” Sophia was putting on her most innocent pleading look.

“Of course you can, just as long as you’re back before five. Don’t forget the car has to pick up your grandfather before dinner.”

Sue tapped the authorisation for Sophia to “drive” as her daughter ran out the door with her violin case. “Sure mum! Bye!”

One of the predicted boons of automated cars is a dramatically lower road toll, especially among young and inexperienced drivers. According to the Eno Center for Transportation, replacing half the cars on US roads with autonomous versions would mean 1.8 million fewer crashes each year, saving nearly 10,000 lives and the American economy $49 billion.

Future developments will see automated cars networking in order to improve safety and efficiency even further. “There is a lot of work going into car infrastructure systems, like cars talking to traffic light systems and cars tracking people and passing that information to the other vehicles,” says Salah Sukkarieh, director of Research and Innovation at the Australian Centre for Field Robotics in Sydney. “It becomes like an autonomous train system.”

THE CAR pulled up to the kerb and the door popped open. An elderly gentleman in dark glasses eased himself out of the front seat and into a warm hug from Sue.

“Hi Dad. How was the drive?” “Smooth, as always,” he said. He reached into the back seat and felt around for his white cane. Sue took his elbow and walked him up the path to the front door.

Google’s car might be grabbing most of the headlines, but many automotive manufacturers are also developing self-driving cars, from Audi to Toyota.

Durrant-Whyte believes it’s likely to take several years before automated cars are ready for our roads. “It’s not as straightforward as people might imagine to go from a really nice YouTube video to something that will work in all weathers at all times with 100% reliability. I suspect we are much further away from having autonomous cars than people actually think we are.”

However, as Sukkarieh points out, we don’t have to wait for fully autonomous cars to start enjoying safer, easier driving. “Vehicles are becoming more and more intelligent,” he says. Smart headlights that anticipate corners, reactive cruise control systems, automatic braking and optimal swerving to avoid collisions are already appearing in new cars.

This article is part of our special edition, Rise of the Robots.

Tim Dean is a science writer and philosopher based in Sydney.