
Nearly two years ago, Uber rolled into Pittsburgh, raided a bunch of roboticists from Carnegie Mellon University’s famed robotics program, and set up a secretive facility across the road to build out a fleet of self-driving cabs. Now they’re finally pulling back that curtain of secrecy to show us what they made. For the past two days, the global ride-hailing company has hosted dozens of eager-to-try-it tech journalists, inviting them to town to take the wheel (or really just sit behind it as the car does its thing) in advance of unleashing them on the public before summer ends–nine days and counting. We’ll let you know what it’s like as soon as we get a ride in one. (Update: our impressions of Uber’s self-piloting car!)

We grabbed an early and exclusive sit down with the man behind Uber’s program, Engineering Director Raffi Krikorian, to discuss how the cars work, what Uber will do in case of an accident, how seasonal changes can confuse the car’s sensors, and why Uber wants to one day replace its current 5 million daily global fares with robot drivers.

The driverless car business is getting crowded. Google, Apple, Lyft, everyone is chasing this technology. You obviously don’t want to be left behind or left at the mercy of someone else’s tech. Is that the reason you’re doing this? What are the benefits to you?

Raffi Krikorian: Think of it this way. Driving is actually a pretty dangerous thing. I think something like over a million people die in car accidents each year. Ninety percent of those are from human error. So if you think of the number of rides Uber offers on a daily basis–five million on average–part of it ends up being a safety issue for us. We also think we can do better in cities with autonomous vehicles. We think we can do much better congestion planning. We can be smart about how to move people around.

What technological or physical challenges have you faced in your home turf of Pittsburgh?

We jokingly call Pittsburgh the double black diamond of driving. It’s completely organically grown, in the sense that it’s a really old city: many of the roads aren’t wide enough for two-way traffic, they don’t come together at right angles, and the signage is antiquated and constantly in need of repair. Additionally, you have weather conditions that you may not get in places like San Francisco or the South. We say if we can drive in Pittsburgh, we can drive anywhere.

Have you ever commuted in one of these self-driving cars?

Every day. I call the car in the morning to come get me and it takes me to the office. It gives me a chance to see the latest code the team has worked on and the latest mapping we’ve done. Riding in a self-piloting car gives you a different view of the road. You’re a lot more aware of what’s really going on. You start to think, “why do people jaywalk?” or, “why would you cut me off?” It’s very entertaining.

How do your driverless cars work?

There’s a laser scanner that sits on top of the car and spins really fast. Sixty-four laser beams constantly sweep the area to detect and measure the distance of objects around it. Using this data, we can build accurate three-dimensional maps of the streets we’re on. We also use sensors to ‘localize’ the car. The car’s tires are equipped with encoders, which allow it to sense how many times it has turned over or what fraction it has turned, so the car can calculate how far it’s moved. Machine learning, wireless networks, and improved computing power have allowed us to do this, at this moment, and at the scale Uber needs it to be done.
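The encoder arithmetic Krikorian describes is straightforward dead reckoning: counting wheel revolutions and multiplying by the wheel’s circumference. A minimal sketch of the idea (the tick resolution and wheel diameter here are illustrative values, not Uber’s actual specifications):

```python
import math

def distance_traveled(ticks, ticks_per_rev, wheel_diameter_m):
    """Convert wheel-encoder ticks to distance traveled.

    Each full revolution moves the car forward by one wheel
    circumference (pi * diameter), so fractional revolutions
    give fractional distance.
    """
    revolutions = ticks / ticks_per_rev
    return revolutions * math.pi * wheel_diameter_m

# Hypothetical example: a 4096-tick-per-revolution encoder on a
# 0.65 m wheel, reading 10,000 ticks since the last measurement.
d = distance_traveled(10_000, 4096, 0.65)  # roughly 5 meters
```

In practice this estimate drifts with wheel slip and tire wear, which is why it is fused with the lidar map and other sensors to “localize” the car rather than used on its own.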

Uber’s self-driving Ford Fusion scans the road with 64 lasers, creating a 3D map of the surroundings. Ray Lego

What are some of the toughest challenges for automated driving?

Perception–can you teach the car to see all the important things it needs to see on the road? Think vegetation. You drive down a tree-lined road and it looks different the next time because vegetation grows. The trees may be bigger or the leaves have fallen off. So we have to change the way the car perceives it. The next biggest problem is prediction. So if you identify there’s a car in front of you, what do you predict that car will do? Or if a car passes on the left, in the back of your mind you wonder about a number of scenarios that could happen next. Our cars do the same thing.

How do you solve those problems?

We have incredibly accurate maps of the areas we want to drive in because we’ve driven our mapping vehicles multiple times in those areas. By now we know what we consider to be “background.” After that it’s purely a machine-learning problem. So we build classifiers so that the car can say, “yes, that’s a bicycle.” Once we know it’s a bicycle, we can predict how cyclists normally move in the world.

What happens when the first driverless Uber accident occurs?

Immediately, we’d make sure everyone is safe. Then we’d start a deep dive to understand what happened so we can learn from it. We see crazy stuff on the road every single day and we understand how well we would’ve done in that situation, through simulations or log analysis. We’ll look at what went wrong and figure out if there’s more data to feed the system so we can learn how to handle the situation better.

So how long until driverless is the default Uber experience?

That’s a long road. There are three spurs to account for. First, technology: how can we make sure the car accounts for all situations? For everything that happens in real life–you could encounter a duck crossing the road, for example–it’s going to take a long time to figure out how we want the car to respond. Next is regulatory: locally, nationally, globally, is regulation ready for us? Finally there’s the societal spur. Will the average user get into a car with no driver in it? Will the average commuter drive next to a car with no driver in it?

Condensed and edited. A version of this article was published in the November/December 2016 issue of Popular Science, under the title “The Professional Passenger.”