Tesla driver blames self-driving mode for eight-car pileup

The Thanksgiving Day crash in San Francisco sent two children to the hospital.
[Image: The Tesla insignia on the steering wheel of a Model 3. Caption: Add it to the ongoing list of crashes potentially caused by Tesla's autopilot software. Credit: Deposit Photos]

The driver of a 2021 Tesla Model S says a Full Self-Driving (FSD) Mode malfunction is behind a Thanksgiving Day eight-vehicle crash on San Francisco’s Bay Bridge. The accident left two children with minor injuries. The incident, made public on Wednesday via a local police report and subsequently covered by Reuters, is only the latest in a string of wrecks, some fatal, to draw scrutiny from the National Highway Traffic Safety Administration (NHTSA).

As The Guardian also notes, the multi-car wreck came just hours after Tesla CEO Elon Musk announced that the $15,000 Full Self-Driving upgrade would become available to all eligible vehicle owners in North America. Prior to the expansion, FSD was only open to Tesla drivers with “high safety scores.”

[Related: Tesla is under federal investigation over autopilot claims.]

Although Full Self-Driving Mode has drawn consistent criticism and scrutiny since its debut, Musk has repeatedly attested to the software’s capabilities, going so far as to take interview questions from the driver’s seat of a Tesla with the feature engaged. FSD Mode uses a network of cameras, sensors, and machine learning software that supposedly controls the basics of driving in real time: steering, speed, braking, and lane changes. According to the Thanksgiving Day crash’s police report, the driver claims his car suddenly and inexplicably slowed from 55 mph to around 20 mph while attempting to change lanes, resulting in a rear-end collision that set off a chain of related wrecks.

[Related: YouTube pulls video of Tesla superfan testing autopilot safety on a child.]

The police report makes clear that the cause of the crash is still unconfirmed, and that the driver should have been paying sufficient attention to take control of the vehicle in the event of an FSD malfunction. Tesla’s own website cautions that its Autopilot and FSD Modes “do not make the vehicle[s] autonomous.”

Since 2016, the NHTSA has opened 41 investigations involving Teslas potentially engaged in Full Self-Driving Mode; 19 people have died in the crashes under investigation. Yesterday, the carmaker began notifying Tesla owners around the world of a complimentary 30-day trial of its Enhanced Autopilot software, which supposedly offers automated lane changes, parking, and navigation.
