The U.S. National Highway Traffic Safety Administration (NHTSA) recently issued a statement citing suspicions that Tesla's Autopilot and FSD accident reporting data were inaccurate, and has officially opened a preliminary investigation into Tesla electric vehicles equipped with the "FSD" (Full Self-Driving) system, covering approximately 2.88 million vehicles.
In its statement, NHTSA said that some Tesla vehicles using the FSD feature had engaged in "driving behaviors that violate traffic safety regulations," including running red lights and driving against the flow of traffic. Six of the accident reports involved vehicles that continued into intersections on a red light and ultimately collided with other vehicles.
The investigation is currently in the preliminary assessment stage. If the findings point to serious risks, NHTSA may require Tesla to recall vehicles or correct the system.
This isn't the first time Tesla has faced NHTSA scrutiny; the agency has opened several investigations into the company's driver-assistance features this year. Earlier in the year, NHTSA conducted safety reviews of the "Smart Summon" and "Actually Smart Summon" features after some users reported unusual behavior while their vehicles were automatically exiting parking spaces.
In addition, NHTSA launched an investigation in recent weeks into the electronic door mechanism of the 2021 Model Y to determine whether there is a risk that passengers could become trapped inside the vehicle.
In response to these concerns, Tesla has long maintained that its driver-assistance systems improve overall driving safety. According to the company's own safety reports, vehicles driven with FSD enabled have a lower accident rate per million miles than the average for vehicles without the feature enabled.
However, Tesla has repeatedly declined to disclose more detailed accident data or its sources, citing trade secrets and user privacy, which has led NHTSA to question the safety of its technology.
NHTSA has not yet announced a timeline for the investigation or subsequent proceedings, but its scale already makes it the agency's most extensive action targeting self-driving systems. As U.S. regulatory standards for self-driving cars grow increasingly stringent, Tesla's FSD is widely expected to become a key test case for autonomous driving safety and regulatory compliance.
While Tesla frequently improves its driver-assistance capabilities through software updates, this investigation highlights the challenges of deploying autonomous driving technology under real-world road conditions. Tesla's true test will likely be balancing the push for rapid innovation against public safety regulation.



