Wednesday, August 20, 2014

Driverless cars mastering cities

LOS ANGELES—Google says it has turned a corner in its pursuit of a car that can drive itself.
The tech giant’s self-driving cars already can navigate freeways comfortably, albeit with a driver ready to take control.

But city driving—with its obstacle course of jaywalkers, bicyclists, and blind corners—has been a far greater challenge for the cars’ computers.
In a blog entry posted today, the project’s leader said test cars now can handle thousands of urban situations that would have stumped them a year or two ago.
“We’re growing more optimistic that we’re heading toward an achievable goal—a vehicle that operates fully without human intervention,” project director Chris Urmson wrote.
Urmson’s post was the company’s first official update since 2012 on progress toward a driverless car—a project within the company’s secretive Google X lab.
The company has said its goal is to get the technology to the public by 2017.
In initial iterations, human drivers would be expected to take control if the computer fails. The promise is that, eventually, there would be no need for a driver.
Passengers could read, daydream, even sleep—or work—while the car drives.
Google maintains that computers one day will drive far more safely than humans, and part of the company’s pitch is that robot cars can substantially reduce traffic fatalities.
The basics already are in place. The task for Google—and traditional carmakers, which also are testing driverless cars—is perfecting technology strapped onto its fleet of about two dozen Lexus RX450h SUVs.
Sensors including radar and lasers create 3D maps of a self-driving car’s surroundings in real time while Google’s software sorts objects into four categories: moving vehicles, pedestrians, cyclists, and static things such as signs, curbs, and parked cars.
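The four-way sorting described above can be illustrated with a toy sketch. The category names come from the article; the classifier logic and the attribute names (`speed_mph`, `width_m`) are invented stand-ins for whatever features Google’s real perception software uses, which is not public.

```python
from enum import Enum, auto

class ObjectClass(Enum):
    """The four categories the article says detections are sorted into."""
    VEHICLE = auto()
    PEDESTRIAN = auto()
    CYCLIST = auto()
    STATIC = auto()  # signs, curbs, parked cars

def classify(detection: dict) -> ObjectClass:
    """Toy rule-of-thumb classifier over hypothetical sensor attributes.

    Illustrative only: a real system would fuse radar/laser returns and
    learned models, not two hand-tuned thresholds.
    """
    if detection["speed_mph"] < 0.5:
        return ObjectClass.STATIC        # not moving: sign, curb, parked car
    if detection["width_m"] > 1.5:
        return ObjectClass.VEHICLE       # wide and moving: another car
    # narrow and moving: fast means cyclist, slow means pedestrian
    return ObjectClass.CYCLIST if detection["speed_mph"] > 8 else ObjectClass.PEDESTRIAN
```

The point is only the shape of the problem: every tracked object gets exactly one of four labels, and the car’s planner treats each label differently.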
Initially, those maps were fairly crude. A gaggle of pedestrians on a street corner registered as a single person.
Now, the technology can distinguish individuals, according to Google spokeswoman Courtney Hohne, as well as solve other riddles such as construction zones and the likely movements of people riding bicycles.
To deal with cyclists, engineers initially programmed the software to look for hand gestures that indicate an upcoming turn.
Then they realized that most cyclists don’t use standard gestures, and some weave down the road the wrong way.
So engineers have taught the software to predict the behavior of cyclists based on thousands of encounters during the roughly 10,000 miles the cars have driven autonomously on city streets, Hohne said.
The software projects a cyclist’s likely movements and plots the car’s path accordingly—then reacts if something unexpected happens.
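The project-then-react loop can be sketched minimally. Both functions here are assumptions: a constant-velocity projection stands in for whatever learned model Google built from its real-world encounters, and the tolerance-based replan check is a simplification of "reacts if something unexpected happens."

```python
def predict_path(position, velocity, horizon=3.0, dt=0.5):
    """Project a cyclist's likely future positions by constant-velocity
    extrapolation (a simple stand-in for a model learned from thousands
    of real encounters)."""
    x, y = position
    vx, vy = velocity
    steps = int(horizon / dt)
    return [(x + vx * dt * i, y + vy * dt * i) for i in range(1, steps + 1)]

def needs_replan(predicted, observed, tolerance=0.5):
    """React when the cyclist strays more than `tolerance` metres from
    the point projected for the current timestep."""
    (px, py), (ox, oy) = predicted, observed
    return ((px - ox) ** 2 + (py - oy) ** 2) ** 0.5 > tolerance
```

In use, the car would plot its own path around `predict_path`'s output, then call something like `needs_replan` each sensor cycle and re-plan the moment the observation diverges from the projection.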