Washington, D.C. — The U.S. National Highway Traffic Safety Administration (NHTSA) said Thursday it has opened an investigation into roughly 2.88 million Tesla vehicles equipped with the company’s Full Self-Driving (FSD) system, following more than 50 reports of traffic violations and a series of crashes that have raised new questions about the safety of automated driving technology.
According to the agency, FSD — an advanced driver-assistance system that requires driver supervision — has, in some cases, “induced vehicle behavior that violated traffic safety laws,” including running red lights and driving against the proper flow of traffic.
Reports of Crashes and Injuries
NHTSA said it has received 58 reports of possible safety violations; the incidents include 14 crashes and 23 injuries. In six incidents, the agency said, vehicles operating with FSD engaged “approached an intersection with a red traffic signal, continued into the intersection, and subsequently collided with other vehicles.” Four of those crashes resulted in injuries.
The probe, described as a preliminary evaluation, is the first step before the agency could demand a recall if it determines the system poses an unreasonable safety risk. News of the investigation sent Tesla shares down 2.1 percent in early trading.
Driver Complaints and Regulatory Pressure
One Houston driver told NHTSA last year that FSD “is not recognizing traffic signals,” adding that the vehicle “proceeds through red lights and stops at green ones.” The driver said Tesla “refused to fix or even acknowledge the problem,” even after the behavior was observed firsthand during a test drive.
Tesla did not immediately respond to a request for comment, though the company issued a new software update for FSD earlier this week.
The investigation arrives amid heightened congressional scrutiny of Tesla’s advanced driver-assistance systems, and it was opened under a newly confirmed NHTSA administrator. Lawmakers, including Senators Ed Markey and Richard Blumenthal, have urged the agency to examine the system’s behavior near railroad crossings after a series of reported near-collisions.
A Pattern of Safety Probes
This is not the first time Tesla’s self-driving technology has drawn regulatory attention. In October 2024, NHTSA opened a separate investigation into 2.4 million Tesla vehicles equipped with FSD after four crashes occurred in low-visibility conditions — including fog, sun glare, and airborne dust — one of which was fatal.
In January 2025, the agency launched another probe into 2.6 million Teslas over accidents involving a remote movement feature that allows drivers to reposition their cars without being inside. NHTSA is also reviewing the company’s robotaxi program in Austin, Texas, which began operations in June.
Broader Implications for Automation
Tesla markets FSD as a system that can “drive you almost anywhere with your active supervision,” but explicitly states it does not make the vehicle autonomous. The gap between those assurances and the system’s real-world performance has become a focal point for regulators and safety experts.
Oliver Carsten, professor of transport safety at the University of Leeds, said the NHTSA’s move “should serve as a wake-up call for Europe.” “We are seeing an increasing number of systems on the market that blur the line between driver assistance and full automation — and that’s where safety risks multiply,” he added.
What Comes Next
If NHTSA determines that FSD poses systemic risks, it could order a recall — one that analysts say would be among the largest ever involving an automated driving feature.
For Tesla, the outcome of this probe may define not just the future of FSD, but the broader trajectory of how human oversight and machine autonomy can coexist on public roads.