How Waymo is teaching self-driving cars to deal with the chaos of parking lots

We spoke with Waymo about training for these complex environments.

If you drive a car, you know that some maneuvers are harder than others. For example, an unprotected left turn—one with no green arrow to tell you when to go—will always be trickier than a standard right-hand turn. Humans aren’t alone in those struggles. Unprotected lefts are also a challenge for cars driven by computers, and so they train for them in simulation. Autonomous car company Cruise even published a video highlighting the 1,400 left turns its vehicles tackle in San Francisco in a day.

A left turn is a tricky maneuver, but the driving environment itself is also a factor in what kind of obstacles the human—or the self-driving car—might encounter. A two-lane road on a sunny day with clearly painted lines and scant traffic offers an easy landscape. But an IKEA parking lot on a Saturday afternoon? Ouch.

In fact, parking lots are a distinctive enough environment that Waymo, the self-driving car company that’s a sibling to Google, specifically trains its vehicles to deal with them by setting up real-world scenarios in a controlled environment. We spoke with Waymo engineers to learn more about how.

Things can lurk near a dumpster

Parking lots may technically have stop signs, speed limits, and crosswalks, but drivers and pedestrians tend to do whatever they want in these environments. Shoppers carrying boxes might dart across an active driving route instead of using a crosswalk, carts can be left in the wrong place, and vehicles might drive the wrong way or over empty parking spots. “Parking lots are uniquely challenging from surface streets,” says Stephanie Villegas, who leads what Waymo calls its “structured testing” efforts. “They don’t really have standardized rules for how people should and can move about within them.”

“They’re just kind of lawless,” she adds.

Adding to the Wild-West vibe, social cues play an important role in ambiguous driving situations. If you’re waiting for a parking spot to open up and see someone getting into a parked car, that person might wave to you to signal they’re not going anywhere—or that the space is indeed about to open up. Humans understand those signals intuitively, but self-driving cars need to be taught in different ways. “A self-driving car is not social,” she says.

Waymo cars can practice on 91 acres at the former Castle Air Force base near Merced, California. There, the team can mock up different parking lot types, from the easy to the crowded and annoying. The company’s engineers can dial up how complex the environment is, controlling dynamic factors like cars reversing out of spaces, pedestrians walking where they should be (or stepping into the car’s path), and people carrying large objects that can change how the car’s perception system sees them. “Each time you turn the dial to increase complexity, the vehicle is beginning to have to assess and predict the behaviors of multiple things at the same time,” Villegas says. “That is challenging for sensors.”
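To make the “complexity dial” concrete, here’s a minimal sketch of how a test scenario might be parameterized. The class, field names, and complexity measure are all hypothetical—invented for illustration, not drawn from Waymo’s actual test infrastructure.

```python
from dataclasses import dataclass

# Hypothetical scenario description; none of these names come from Waymo.
@dataclass
class ParkingLotScenario:
    reversing_cars: int = 0          # cars backing out of spaces
    jaywalking_pedestrians: int = 0  # shoppers cutting across driving lanes
    carried_objects: int = 0         # boxes that change how people look to sensors

    @property
    def complexity(self) -> int:
        # A crude "dial": the number of dynamic agents the car must
        # assess and predict at the same time.
        return (self.reversing_cars
                + self.jaywalking_pedestrians
                + self.carried_objects)

# Turning the dial up: each scenario adds more simultaneous behaviors.
easy = ParkingLotScenario(reversing_cars=1)
hard = ParkingLotScenario(reversing_cars=3, jaywalking_pedestrians=4,
                          carried_objects=2)
print(easy.complexity, hard.complexity)  # 1 9
```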

Of course, the vehicles can’t magically get better on their own after being exposed to these situations. The engineers see how the cars perform, then change the software as needed. “It’s not just simple exposure,” she says; engineers update the software iteratively.

For example, the cars had to be taught to be careful around dumpsters—specifically, the structures that surround dumpsters, which can have swinging doors and cinder-block walls. Waymo discovered that these spots can present opportunities for a shopping cart to roll out from behind one of the walls or for a pedestrian to emerge. Just as human eyes can’t see through concrete, a self-driving car’s spinning lasers or cameras can’t either, so Waymo had to ensure its cars moved with caution near these occlusion zones. The car will “slowly creep ahead,” she says.
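The creep-ahead behavior can be sketched as a simple speed cap near blind corners. The thresholds and function below are invented for illustration and aren’t Waymo’s real planner logic.

```python
# Hedged sketch: slow down near mapped occlusion zones (dumpster
# enclosures, cinder-block walls). All constants are hypothetical.

CREEP_SPEED_MPS = 1.0    # "slowly creep ahead"
NORMAL_SPEED_MPS = 4.0   # typical parking-lot cruising speed
CAUTION_RADIUS_M = 5.0   # how close to a blind corner triggers caution

def target_speed(distance_to_occlusion_m: float) -> float:
    """Pick a speed based on distance to the nearest blind corner."""
    if distance_to_occlusion_m < CAUTION_RADIUS_M:
        # A cart or pedestrian could emerge from behind the wall.
        return CREEP_SPEED_MPS
    return NORMAL_SPEED_MPS

print(target_speed(2.0))   # 1.0 -- creeping past the dumpster enclosure
print(target_speed(20.0))  # 4.0 -- open lane, normal lot speed
```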

A dumpster area in Chandler, Arizona. Google Street View

Build the map, label the map, use the map

It makes sense to wonder: how would a self-driving car know it’s in a parking lot? And does it hate going to IKEA as much as humans do?

The answer to the first question is that it knows where it is because it uses GPS and follows a map. But it can’t just use Google Maps, because autonomous vehicles need cartographic resources that are much more granular than the ones you rely on while navigating.

To build the maps, Waymo sends its own autonomous cars down the routes it wants to chart, using the cars’ sensors—lasers, cameras, and radar—to gather information about the physical space, like the width of roads, the height of curbs, or where a stop sign sits in three-dimensional space. After gathering the data, the engineers remove the elements, like pedestrians, that are dynamic and won’t be in the same place next time the car drives there. They also have to label those maps for the cars, telling them that the edge of the road is, in fact, a road edge. This information “bootstraps the car’s understanding of the world,” says David Margines, the product manager for the mapping efforts at Waymo.
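The two steps described above—drop the dynamic elements, then label the static ones—can be sketched roughly like this. The detection types and data layout are hypothetical, chosen only to mirror the examples in the text.

```python
# Illustrative sketch of the mapping pipeline: keep static features,
# discard dynamic ones, label what remains. All names are invented.

DYNAMIC_TYPES = {"pedestrian", "vehicle", "shopping_cart"}

raw_detections = [
    {"type": "curb", "height_m": 0.15},
    {"type": "pedestrian"},  # won't be in the same place next drive
    {"type": "stop_sign", "position": (12.3, 4.1, 2.2)},
    {"type": "road_edge"},
]

# Step 1: remove elements that are dynamic.
static_features = [d for d in raw_detections
                   if d["type"] not in DYNAMIC_TYPES]

# Step 2: label the survivors, so the car knows a road edge
# is, in fact, a road edge.
labeled_map = [{"label": d["type"], **d} for d in static_features]
print([f["label"] for f in labeled_map])
# ['curb', 'stop_sign', 'road_edge']
```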

What that means in practice is that when a car enters a parking lot, it already knows the layout in its silicon brain. And that’s important, because parking lots may not be planned out with the same engineering rigor that goes into constructing, say, an interstate highway.

Margines says they’ll map out the whole lot for the car, including details like the orientation of the parking stalls if they’re angled, giving it an important cue as to what direction traffic should flow in that lane. They’ll tell the car what the “drivable areas” of the lot are, and even let it know that a place where service vehicles might drive—like the back of a strip mall—is off-limits. In a sense, the car’s knowledge going into a lot is what humans already have when entering a deeply familiar parking lot. “What we’re providing to our vehicles is a little bit like the maps we have in our heads,” Margines says.

The maps that Waymo gives the cars also provide data on the path a car should take to get to its specific destination, as well as a fallback route if the primary route is blocked by something like a truck with its hazards on. And because parking lots don’t always have stop signs where they need to be, Waymo maps can include “implied stops” where the car should halt anyway.
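The routing data described above—a primary path, a fallback, and implied stops—might be organized something like the sketch below. The structure and every name in it are invented for illustration, not Waymo’s actual map format.

```python
# Hedged sketch of parking-lot routing data: primary and fallback
# routes plus "implied stops" where no physical sign exists.

lot_map = {
    "primary_route": ["entrance", "lane_a", "dropoff"],
    "fallback_route": ["entrance", "lane_b", "dropoff"],
    # Distances (meters along each lane) where the car should halt
    # even though there is no stop sign.
    "implied_stops": {"lane_a": [3.0], "lane_b": [7.5]},
}

def choose_route(blocked_segments: set) -> list:
    """Use the fallback if anything on the primary path is blocked."""
    primary = lot_map["primary_route"]
    if any(seg in blocked_segments for seg in primary):
        return lot_map["fallback_route"]
    return primary

print(choose_route(set()))       # ['entrance', 'lane_a', 'dropoff']
print(choose_route({"lane_a"}))  # truck with hazards on: use lane_b
```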

All of this is an important reminder that autonomous cars typically can’t just cruise anywhere magically—they only venture into places where they know the street layout already.