Tesla’s Full Self-Driving technology has landed the company in hot water once again. The National Highway Traffic Safety Administration has escalated its investigation into 3.2 million Tesla vehicles after nine crashes in which drivers were using the system, with regulators examining whether the software failed to alert drivers quickly enough in poor visibility conditions.
The timing couldn’t be worse for CEO Elon Musk, who’s preparing to roll out a new robotaxi model without a steering wheel or pedals. The regulatory probe has moved to an “engineering analysis”—a more serious level of scrutiny that could lead to enforcement action or recalls. Meanwhile, drivers and experts continue to question whether Tesla’s camera-only approach to autonomous driving can deliver on promises Musk has been making for nearly a decade.
The controversy centers on a fundamental disconnect between what the technology is called and what it actually does. Despite the name change from “Full Self-Driving” to “Full Self-Driving (Supervised),” drivers must still keep their eyes on the road and be ready to take control at any moment. As Tesla faces growing scrutiny from regulators and customers, the gap between Musk’s ambitious promises and the technology’s actual capabilities has never been more apparent.

The Reality Behind Tesla’s Full Self-Driving Promises
The National Highway Traffic Safety Administration has escalated an investigation covering 3.2 million vehicles after multiple crashes involving the FSD system, while the company has quietly shifted away from its original autonomy claims.
Ongoing Gaps Between Promises and Performance
Elon Musk has been promising fully autonomous Teslas since 2015, but the gap between his predictions and what drivers actually get keeps widening. Despite the name, Full Self-Driving remains a driver assistance system that requires constant human supervision. Tesla owners who paid thousands of dollars for FSD expected their cars to eventually drive themselves without intervention.
Tesla has redefined what Full Self-Driving means, stepping back from its original promise of unsupervised autonomy. The software still requires drivers to keep their hands on the wheel and stay alert at all times. What was marketed as a path to autonomous driving has become what the industry classifies as Level 2 driver assistance, far from the self-driving capability many customers anticipated when they purchased the feature.
Tesla’s Full Self-Driving in the Real World
Drivers report that FSD struggles with basic scenarios that human drivers handle easily. The system has particular trouble in low-visibility conditions including fog, sun glare, and airborne dust. Tesla’s cameras can’t always see what’s ahead when weather or lighting conditions degrade, yet the software doesn’t always alert drivers until it’s too late.
The NHTSA found that in crashes it reviewed, Tesla’s system “did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred.” Tesla owners in cities like San Francisco have documented instances where FSD makes sudden, unexpected maneuvers or fails to recognize pedestrians and cyclists. Software updates continue to roll out, but each new version brings its own quirks and limitations that drivers discover on public roads.
Safety and Regulatory Investigations
The NHTSA probe has been elevated to an engineering analysis, the final phase before potentially ordering a recall. The investigation covers Tesla Model 3, Model S, Model X, Model Y, and Cybertruck vehicles equipped with FSD systems. Federal regulators are examining whether the technology creates safety defects that put drivers and pedestrians at risk.
Key concerns identified by the NHTSA:
- FSD fails to detect degraded visibility conditions
- System doesn’t warn drivers appropriately when cameras can’t see clearly
- Multiple crashes occurred while FSD was engaged or within 30 seconds of its disengagement
- One fatal crash involved a Tesla driver using FSD who struck and killed a pedestrian
The investigation represents one of the most serious challenges to Tesla’s autonomous driving ambitions. A potential Tesla recall could affect millions of vehicles and force the company to fundamentally redesign how its driver monitoring system handles situations where visibility is compromised.
The Robotaxi Rollout and Its Shortcomings
Elon Musk has repeatedly promised that Tesla robotaxis would be operating in U.S. cities by now, generating income for owners while the cars drive themselves. The company unveiled the Cybercab concept as its vision for autonomous ride-hailing, but the reality hasn’t caught up. Tesla’s approach to autonomous vehicles relies solely on cameras and software, unlike competitors that use additional sensors such as lidar.
The robotaxi promise assumed Tesla would achieve full autonomy, but federal auto regulators are questioning how the cars perform in self-driving mode just as Musk prepares to roll out models without steering wheels or pedals. Without regulatory approval for truly autonomous operation, Tesla can’t legally deploy robotaxis that operate without human drivers. The gap between the robotaxi vision shown in Tesla showrooms and what the technology can safely do continues to widen.
Response, Backlash, and the Growing Divide
Tesla’s claims about Full Self-Driving have triggered multiple class-action lawsuits, regulatory investigations, and sharp criticism from competitors who say they’ve taken a fundamentally different approach to autonomous driving.
Lawsuits and Legal Scrutiny
U.S. District Judge Rita Lin ruled that Tesla must face a certified class action from California drivers who said Elon Musk misled them about self-driving capabilities for eight years. The judge found that Tesla’s claim, made from October 2016 to August 2024, that its vehicles contained the hardware needed for full self-driving could be litigated on a classwide basis.
The classes include drivers who bought the Full Self-Driving package between May 2017 and July 2024. One customer purchased a Model 3 in December 2018 and paid $6,400 for the package in fall 2019, expecting autonomous capabilities soon after.
Tesla tried to dismiss the lawsuit by arguing it was unreasonable to assume all class members saw the challenged statements. Judge Lin disagreed, noting that Tesla’s distinctive advertising strategy—which doesn’t use mass advertising or independent dealers—meant interested buyers likely went to Tesla’s website for information.
Regulator and Safety Expert Criticism
The National Highway Traffic Safety Administration has examined whether Tesla’s Full Self-Driving software is safe. Several lawsuits raise concerns over the technology as Musk bets the company’s future on autonomous robotaxis.
Safety experts have questioned Tesla’s sensor suite and overall approach. In the class-action ruling, the judge noted Tesla’s inability to “demonstrate a long-distance autonomous drive with any of its vehicles” and its lack of the sensors needed for high-level autonomy as key issues.
Competitors and Industry Comparisons
Tesla’s competitors are quietly winning the autonomy war while the company fights in court. Waymo and Cruise have deployed vehicles with more advanced sensor arrays, including lidar systems that Tesla has avoided.
The legal troubles come as Tesla faces additional lawsuits and falling sales. The mounting pressure highlights a growing divide between Tesla’s camera-based approach and competitors using multiple sensor types for autonomous driving.