Left turns are hard for self-driving cars and people alike

How Waymo’s autonomous cars train for the challenge of unprotected left turns.


Turning left is hard. In fact, one of the hardest maneuvers for any driver is called an “unprotected left.”

Imagine you’re pulling out of a driveway to go left on a two-lane road: there’s nothing protecting you—no traffic light or stop sign—from the traffic barreling toward you from the left. Not only do you have to sneak through a gap in the traffic speeding from that side, you also need to ensure that once you do, cars coming from the right in the lane you’re joining don’t hit you, either.

These tricky unprotected maneuvers come in multiple flavors. A slightly less complicated version is when you’re at a traffic light, and a circular green light—not a green arrow—gives you the go-ahead. You want to turn left, and must find a gap in the oncoming traffic.

You get the idea: left turns without protection are hard, and not just for cars driven by people.

“Unprotected lefts are one of the trickiest things you can do in driving,” says Nathaniel Fairfield, a software engineer who leads the behavior team at Waymo (formerly Google’s self-driving car project), a group that focuses on problems like how self-driving cars plan their routes, position themselves in lanes, make decisions, and predict the behavior of other vehicles.

They’re difficult because the driver has to pull off several things at once: physically execute the turn, find a hole in traffic, and maybe nonverbally negotiate with other drivers as they fit through a gap, especially if traffic is dense and slow-moving and they have to nudge the front end out. “Sometimes it’s hard just because everybody’s going fast and it’s scary,” Fairfield adds.

Self-driving cars don’t feel fear, but the problem is challenging for them, too. That’s because it involves the other people on the road, Fairfield says.

Autonomous vehicles must interact with cars driven by humans, and make calculations like figuring out whether others will slow down for them if they start turning, just like the mental calculus a person makes when doing the same maneuver. Or they might need to figure out how to “politely request,” he says, that other cars yield for them, but do it without totally committing to making the turn. Of course, others on the road don’t always cooperate.

“So that’s why it is hard,” Fairfield says. “Because understanding people is hard.”
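To make that mental calculus concrete, here’s a toy gap-acceptance check: compare each oncoming car’s time to reach the intersection against the time needed to clear it, plus a safety margin. This is a minimal sketch with invented names and numbers, not Waymo’s actual logic, which also has to predict whether each driver will yield.

```python
from dataclasses import dataclass

@dataclass
class OncomingCar:
    distance_m: float  # distance to the conflict point, in meters
    speed_mps: float   # current speed, in meters per second

def gap_is_acceptable(oncoming, clear_time_s=4.0, margin_s=2.0):
    """Accept the gap only if every oncoming car arrives later than
    the time we need to clear the intersection, plus a margin.

    Toy model: assumes constant speeds and never predicts whether a
    driver might slow down to let us in -- the hard part.
    """
    for car in oncoming:
        if car.speed_mps <= 0:
            continue  # a stopped car has no arrival time to beat
        time_to_arrival_s = car.distance_m / car.speed_mps
        if time_to_arrival_s < clear_time_s + margin_s:
            return False
    return True

# One car 150 m out and one 40 m out, both at ~47 km/h (13 m/s).
traffic = [OncomingCar(150.0, 13.0), OncomingCar(40.0, 13.0)]
print(gap_is_acceptable(traffic))  # False: the near car arrives in ~3 s
```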

Figuring it out in simulation

Humans get better at driving by practicing in the real world. But self-driving cars can practice in simulation, where time is sped up, thousands of cars can drive at the same time, and engineers can change the situation to test how the vehicle performs. Waymo racks up 10 million miles in simulation daily, according to the company.

Waymo started using simulation years ago to answer a question about disengagements—the industry term for when the human in a self-driving car takes over the controls from the autonomous software. In California, companies testing self-driving cars must file “disengagement reports” with the state. Simulation gave them a way to test an alternate reality. “The natural question arises: Well, what would have happened if we hadn’t disengaged?” recalls James Stout, the technical lead on the simulation team at Waymo. “How would that situation have played out?”
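In outline, that counterfactual replay works like this: restore the world as the logs recorded it at the moment of takeover, then let the software, not the human, drive forward in a simulated world. Here’s a runnable toy sketch, with a point-mass stand-in for both the car and the simulator; every name in it is illustrative, not Waymo’s system.

```python
from dataclasses import dataclass

@dataclass
class State:
    t_s: float       # seconds since the disengagement
    pos_m: float     # position along the car's path
    speed_mps: float

def toy_policy(state):
    """Stand-in for the self-driving stack: ease off above 10 m/s."""
    return -1.0 if state.speed_mps > 10.0 else 0.5  # acceleration, m/s^2

def toy_world_step(state, accel_mps2, dt_s):
    """Stand-in for the simulator: simple point-mass physics."""
    speed = max(0.0, state.speed_mps + accel_mps2 * dt_s)
    return State(state.t_s + dt_s, state.pos_m + speed * dt_s, speed)

def replay_counterfactual(snapshot, policy, horizon_s=10.0, dt_s=0.1):
    """From the logged snapshot at takeover, let the software keep
    driving and record what would have happened."""
    state, trajectory = snapshot, [snapshot]
    while state.t_s < horizon_s:
        state = toy_world_step(state, policy(state), dt_s)
        trajectory.append(state)
    return trajectory

# Snapshot at the moment the human grabbed the wheel: 15 m/s.
outcome = replay_counterfactual(State(0.0, 0.0, 15.0), toy_policy)
print(f"after {outcome[-1].t_s:.0f} s: {outcome[-1].pos_m:.0f} m traveled")
```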

Simulation now allows them to do more than that, of course. With unprotected left turns, computer time is helpful because Waymo can draw on experience from the real, asphalt-filled world, but modify it as they need to. The main way they use simulation is to “massively amplify the variety of cases we encounter in the real world,” Fairfield, the behavior group lead, says.

There, they can change the geometry of an intersection, like the number and positions of lanes, and whether there are wild cards like railroad tracks or crosswalks.

“On top of that, you sprinkle all of the dynamic elements,” Fairfield says. Dynamic elements, in this case, are other vehicles—from motorcycles to big trucks—which can vary in speed, arrival pattern (are they in a “flock”?), and number.

“You can even vary how accommodating they’re feeling, or how aggressive they are,” Fairfield says.
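In software terms, sprinkling those dynamic elements is randomized scenario generation. A hypothetical sketch, with parameter ranges invented for illustration:

```python
import random

VEHICLE_TYPES = ["car", "motorcycle", "big truck"]

def random_scenario(rng):
    """Generate one variation of an unprotected-left scenario.

    These fields are illustrative stand-ins for the knobs Fairfield
    describes: lane counts, crosswalks, vehicle mix, speeds, arrival
    patterns, and driver aggressiveness.
    """
    return {
        "oncoming_lanes": rng.randint(1, 3),
        "has_crosswalk": rng.random() < 0.5,
        "has_rail_tracks": rng.random() < 0.1,
        "arrival_pattern": rng.choice(["steady", "flock"]),
        "vehicles": [
            {
                "type": rng.choice(VEHICLE_TYPES),
                "speed_mps": rng.uniform(5.0, 25.0),
                "aggressiveness": rng.random(),  # 0 = yields easily, 1 = never yields
            }
            for _ in range(rng.randint(1, 12))
        ],
    }

rng = random.Random(42)  # seeded, so any scenario can be reproduced
batch = [random_scenario(rng) for _ in range(1000)]
print(batch[0]["arrival_pattern"], len(batch[0]["vehicles"]))
```

Seeding the generator means any variation that trips up the software can be regenerated exactly and kept as a permanent test case.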

Simulation also helps Waymo engineers make sure that updating the self-driving software doesn’t make the computer a worse driver in some situations—a regression in its ability.

“We use simulation both to explore the space very thoroughly, as we’re designing and building the system in the first place,” he says. “But then we also use it to evaluate any potential regressions as we’re adjusting or improving or refining the system.”
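In spirit, that’s the same discipline as regression testing in any software project: run the old and new versions over the same scenario suite and flag anything that got worse. A hypothetical sketch, assuming a numeric score where higher is better:

```python
def regression_report(scenarios, old_planner, new_planner, score):
    """Compare two planner versions over a fixed scenario suite.

    `score` maps a (planner, scenario) pair to a number where higher
    is better -- say, completed the turn without a hard brake. Every
    scenario the new version scores worse on is a potential regression.
    All names here are illustrative.
    """
    regressions = []
    for scenario in scenarios:
        before = score(old_planner, scenario)
        after = score(new_planner, scenario)
        if after < before:
            regressions.append((scenario, before, after))
    return regressions

# Toy usage: a "planner" is just a function from scenario to score.
old = lambda s: 1.0
new = lambda s: 0.0 if s.get("arrival_pattern") == "flock" else 1.0
suite = [{"arrival_pattern": "steady"}, {"arrival_pattern": "flock"}]
for scenario, before, after in regression_report(suite, old, new, lambda p, s: p(s)):
    print(f"regression on {scenario}: {before} -> {after}")
```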

Maybe we should just take people out of the loop

All of this humans-are-tricky stuff raises the question: wouldn’t it be easier for autonomous cars to do things like unprotected lefts, and other hard maneuvers, if all the other cars on the road were also self-driving?

“I think to some extent it would be,” Fairfield reflects. “But I don’t really believe too much in that world.”

That’s because the roads aren’t just filled with cars. They’re also occupied by humans doing things like walking, jogging, and riding bikes. And even if all cars were Waymo vehicles that could communicate with each other to signal their intentions, orchestrating all of their routes with some insanely powerful central computer would be too challenging and risky.

Another problem? Some people understandably just like driving. “Hey man,” Fairfield says, imagining a human who never wants to quit actually steering their car. “You can pry my ‘76 Buick out of my cold dead hands.”

 
