As I drive the Mercedes through Palo Alto, I am reminded of a horseback outing in South America a few weeks earlier. A novice, I was put on an experienced and quite tame horse. It knew the route we were on by heart, accelerating when it could smell the comfort of its own stable, and I had to make only occasional corrections. “Driving an automated car is very much like riding a horse,” says Donald Norman, author of The Design of Future Things and a consultant for BMW, among other automakers. “You can ride a horse with tight reins or loose reins. Loose reins means the horse is in control—but even when you’re in control, the horse is still doing the low-level guidance, stepping safely to avoid holes and obstacles.”
My driving, for example, is being constantly monitored by the car’s Attention Assist function, which tracks more than 70 elements—from minor steering wheel movements to my use of turn signals—for signs of operator fatigue. After 20 minutes, the baseline is set and the car will flag subsequent deviations. If, while parsing the data, it senses that I’ve grown weary, a coffee cup icon pops up in the instrument cluster. (It’s up to me to pull over for the coffee.)
There are people who don’t have their hands free and drive their cars with only one hand, people who text and take their eyes off the road entirely to type out conversations with each other....
The object, vaguely pink, sits on the shoulder of the freeway, slowly shimmering into view. Is it roadkill? A weird kind of sagebrush? No, wait, it’s … a puffy chunk of foam insulation! “The laser almost certainly got returns off of it,” says Chris Urmson, sitting behind the wheel of the Prius he is not driving. A note is made (FOD: foreign object or debris, lane 1) as we drive past, to help our computerized car understand the curious flotsam it has just seen.
When Michael Allen first learned to drive, he was thrilled. He was 16 years old, the age at which, in California, you can get a driver's permit; after six months with a permit, he could get a license. Michael had been pestering his father for months to teach him how to drive, and now the big day had come. His father, Edward, was a big, strong man who insisted on teaching Michael in a car with a standard transmission. That meant Michael had to learn to operate a clutch and gearshift, which is not easy.
I was thinking about all this as the Lexus headed south from Berkeley down Highway 24. What I wasn’t thinking about was my safety. At first, it was a little alarming to see the steering wheel turn by itself, but that soon passed. The car clearly knew what it was doing. When the driver beside us drifted into our lane, the Lexus drifted the other way, keeping its distance. When the driver ahead hit his brakes, the Lexus was already slowing down. Its sensors could see so far in every direction that it saw traffic patterns long before we did. The effect was almost courtly: drawing back to let others pass, gliding into gaps, keeping pace without strain, like a dancer in a quadrille.
Levandowski kept a laptop open beside him as we rode. Its screen showed a graphic view of the data flowing in from the sensors: a Tron-like world of neon objects drifting and darting on a wireframe nightscape. Each sensor offered a different perspective on the world. The laser provided three-dimensional depth: its sixty-four beams spun around ten times per second, scanning 1.3 million points in concentric waves that began eight feet from the car. It could spot a fourteen-inch object a hundred and sixty feet away. The radar had twice that range but nowhere near the precision. The camera was good at identifying road signs, turn signals, colors, and lights. All three views were combined and color-coded by a computer in the trunk, then overlaid by the digital maps and Street Views that Google had already collected. The result was a road atlas like no other: a simulacrum of the world.
These days, Levandowski and the other engineers divide their time between two models: the Prius, which is used to test new sensors and software; and the Lexus, which offers a more refined but limited ride. (The Prius can drive on surface streets; the Lexus only on highways.) As the cars have evolved, they’ve sprouted appendages and lost them again, like vat-grown creatures in a science-fiction movie. The cameras and radar are now tucked behind sheet metal and glass, the laser turret reduced from a highway cone to a sand pail. Everything is smaller, sleeker, and more powerful than before, but there’s still no mistaking the cars. When Levandowski picked me up or dropped me off near the Berkeley campus on his commute, students would look up from their laptops and squeal, then run over to take snapshots of the car with their phones. It was their version of the Oscar Mayer Wienermobile.
They started the project with Levandowski’s pizza car and Stanford’s open-source software. But they soon found that they had to rebuild from scratch: the car’s sensors were already outdated, the software just glitchy enough to be useless. The DARPA cars hadn’t concerned themselves with passenger comfort. They just went from point A to point B as efficiently as possible. To smooth out the ride, Thrun and Urmson had to make a deep study of the physics of driving. How does the plane of a road change as it goes around a curve? How do tire drag and deformation affect steering? Braking for a light seems simple enough, but good drivers don’t apply steady pressure, as a computer might. They build it gradually, hold it for a moment, then back off again.
Although there are still automobile advertisements that show happy families in nice cars, there are also advertisements for certain vehicles that are geared more toward single people.
There is a saying that a car is only as clean as its owner, so why not make a good first impression by pulling up to the scene and showing off the hard-earned money invested in a luxury ride....
Google Street View has since spread to more than a hundred countries. It’s both a practical tool and a kind of magic trick—a spyglass onto distant worlds. To Levandowski, though, it was just a start. The same data, he argued, could be used to make digital maps more accurate than those based on G.P.S. data, which Google had been leasing from companies like NAVTEQ. The street and exit names could be drawn straight from photographs, for instance, rather than faulty government records. This sounded simple enough but proved to be fiendishly complicated. Street View mostly covered urban areas, but Google Maps had to be comprehensive: every logging road logged on a computer, every gravel drive driven down. Over the next two years, Levandowski shuttled back and forth to Hyderabad, India, to train more than two thousand data processors to create new maps and fix old ones. When Apple’s new mapping software failed so spectacularly a year ago, he knew exactly why. By then, his team had spent five years entering several million corrections a day.