Teslas keep hitting emergency vehicles, and now the government is investigating

The cars in 11 crashes were all in Autopilot or adaptive cruise control mode.
Teslas on Autopilot have crashed into emergency vehicles at least 11 times. Photo by Connor Betts on Unsplash


In the early morning hours of March 17, 2021, in Lansing, Michigan, a Tesla traveling on Interstate 96 smashed into a state trooper’s vehicle. Although neither the police officer nor the Tesla driver was injured, the accident is notable because the Tesla was cruising in Autopilot mode. That collision is just one of 11 accidents involving Teslas in Autopilot that are the basis for a new inquiry from the National Highway Traffic Safety Administration, or NHTSA. 

The accidents, which occurred in states such as California, Florida, and Texas, collectively resulted in 17 injuries and one death, according to NHTSA. The earliest incident dates to January 2018, and the most recent occurred this past July, in San Diego. They all involve a Tesla in Autopilot mode hitting an emergency vehicle, such as a police car. The Tesla vehicles, according to a NHTSA document, “encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.” 

The inquiry from NHTSA’s Office of Defects Investigation, opened on August 13, is formally known as a “preliminary evaluation.”

“A preliminary evaluation starts the agency’s fact-finding mission and allows additional information and data to be collected—in this case about Tesla Autopilot,” a NHTSA spokesperson said in an emailed statement. In these nearly one dozen collisions between Teslas and emergency vehicles, “NHTSA has confirmed that in all of these cases the Tesla vehicles had either Autopilot or Traffic Aware Cruise Control engaged just prior to the crashes.”


Teslas use a series of cameras on the outside of the vehicle to gather information about the scene around it and to help Autopilot work; the company even employs a supercomputer to analyze, and learn from, what all those cameras detect. 

[Related: How Tesla is using a supercomputer to train its self-driving tech]

Autopilot is distinct from the kind of experimental technology that can allow a car to completely drive itself, like a Waymo vehicle. An undertaking like that usually involves sensors such as LiDAR, radar, and cameras; Teslas do not use LiDAR. 

The company offers a confusing array of driver-assistance features, from Autopilot, which consists of functions called Traffic Aware Cruise Control (TACC) and Autosteer, to another feature set it calls Full Self-Driving. The TACC feature, as Tesla describes it on its website, “Matches the speed of your car to that of the surrounding traffic,” while Autosteer helps keep the vehicle “within a clearly marked lane” when TACC is engaged. “Full Self-Driving” adds other features, such as “Autopark.” 

[Related: Electric vehicle fires are rare, but challenging to extinguish]

The other accidents within the scope of this inquiry follow patterns similar to the incident in Michigan. In a May 2018 accident, a Tesla in Laguna Beach, California, hit a stationary police vehicle. In December 2019, in Connecticut, another Tesla smacked into a state police cruiser. Also in December of that year, a Tesla Model 3 hit a firetruck on an Indiana highway, killing a 23-year-old passenger in the electric vehicle. 

The NHTSA document notes that the streetscapes where the accidents happened “included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones.”

“NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” a NHTSA spokesperson added via email. “Every available vehicle requires a human driver to be in control at all times, and all State laws hold human drivers responsible for operation of their vehicles.”

Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University, told The New York Times: “I think this investigation should have been initiated some time ago, but it’s better late than never.”
