
Tesla’s ‘Full Self-Driving’ Under Investigation After Fatality in Fog

Tesla’s “Full Self-Driving” vehicles are under investigation following reports of crashes in low-visibility conditions, including an incident where a pedestrian was killed.
The probe by the U.S. National Highway Traffic Safety Administration (NHTSA) was opened on Thursday after Tesla reported four crashes attributed to factors including sun glare, fog, and airborne dust.
The investigation will examine whether the “Full Self-Driving” technology is capable of detecting and responding effectively to reduced visibility on roadways.
It will also analyze the circumstances contributing to these incidents. The inquiry encompasses around 2.4 million Tesla vehicles manufactured between 2016 and 2024.
Newsweek has contacted Tesla via email for comments regarding the investigation. The company has consistently maintained that its “Full Self-Driving” system does not operate autonomously and that human drivers must remain vigilant and ready to take control at all times.
Last week, the company unveiled plans for a fully autonomous robotaxi during an event in Hollywood.
CEO Elon Musk has previously promised the launch of fully self-driving vehicles, with aspirations for operational robotaxis by 2026.
In addition to assessing the recent crashes, the NHTSA will investigate whether similar incidents involving the “Full Self-Driving” system have occurred in low-visibility conditions.
The agency is seeking detailed information from Tesla about any software updates that may have influenced the system’s performance in these scenarios.
The review will focus on the timing and intent of such updates, along with Tesla’s evaluation of their safety implications.
Tesla has faced scrutiny from the NHTSA before, having issued two recalls related to the “Full Self-Driving” system. These followed a July incident, reported by multiple media outlets, in which a Tesla vehicle using the system struck and killed a motorcyclist near Seattle.
The recalls addressed issues where the system was programmed to ignore stop signs at low speeds and other traffic regulations, with fixes implemented through online software updates.
Critics have pointed out that Tesla’s system relies solely on cameras for hazard detection, lacking the advanced sensors—like radar and laser systems—that many competitors use to enhance visibility in adverse conditions.
This latest investigation diverges from NHTSA’s previous approach, which primarily viewed Tesla’s systems as driver assistance tools rather than fully autonomous driving capabilities.
The current probe shifts the focus to the performance of the “Full Self-Driving” technology itself, examining whether it can appropriately identify safety hazards regardless of driver attentiveness.
Michael Brooks, executive director of the nonprofit Center for Auto Safety, pointed out that earlier investigations into Tesla’s Autopilot system did not address the reasons behind Tesla vehicles failing to recognize and stop for emergency vehicles.
He said: “Before, they were putting the onus on the driver rather than the car. Now, they’re saying these systems are not capable of appropriately detecting safety hazards, whether the drivers are paying attention or not.”
This article contains additional reporting from The Associated Press.
