A Tesla Cybertruck crashed into a concrete overpass barrier on a Houston highway in August 2025, and now dashcam footage released in a lawsuit is sparking intense debate about what really happened. Elon Musk quickly pointed to Tesla’s logs showing the driver disengaged Full Self-Driving four seconds before impact, but the video tells a more complicated story about those final moments.
The dashcam video shows the Cybertruck barreling straight toward a concrete barrier at highway speed as the road curved right, with the driver apparently trying to take control only after the system failed to navigate the turn. Justine Saint Amour, who had her one-year-old child in the backseat, is now suing Tesla for over $1 million after suffering spinal injuries and wrist damage in the collision.
The case highlights a growing controversy around Tesla’s driver-assistance technology and whether a system that works most of the time creates a dangerous false sense of security. With the actual footage now public and NHTSA investigating FSD after documenting crashes and violations, the question isn’t just about who was in control during those final four seconds—it’s about what led up to them.

Inside the Cybertruck Crash: Footage, Facts, and Controversy
A Houston highway crash involving a Tesla Cybertruck has sparked intense debate after dashcam footage contradicted claims that the vehicle’s Full Self-Driving system bore no responsibility. The incident left a mother injured and raised questions about what actually happened in those critical seconds before impact.
Dashcam Video Breakdown
The dashcam footage released by Hilliard Law shows the Cybertruck approaching a Y-shaped overpass split at full highway speed. The road curved to the right, but the vehicle continued straight ahead without any visible attempt to slow down or turn.
Traffic cones separating the lanes were knocked aside as the truck barreled forward. The Cybertruck then slammed directly into a concrete barrier at the edge of the overpass, with parts of the vehicle flying off on impact.
The footage shows no braking or steering correction until the final moments. What makes the video particularly concerning is the apparent absence of any system response to the approaching curve.
Timeline of the Houston 69 Eastex Freeway Incident
The crash occurred on August 18, 2025, on Houston’s 69 Eastex Freeway. Justine Saint Amour was driving with her one-year-old child in the backseat when the incident unfolded.
According to Tesla’s logs, the driver disengaged the Full Self-Driving system four seconds before the crash. Saint Amour’s legal team acknowledged this disengagement but explained she grabbed control after realizing the system wasn’t going to navigate the turn correctly.
Those four seconds proved insufficient to correct the vehicle’s trajectory at highway speed. The Cybertruck struck the concrete barrier with enough force to cause significant damage to both the vehicle and its occupants.
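A rough back-of-envelope calculation shows why four seconds leaves so little margin. The speed and braking figures below are assumptions for illustration only; the lawsuit does not state an exact speed, so 65 mph is used as a typical Houston freeway figure, with a commonly cited 1.5-second driver reaction time and hard braking of roughly 0.7 g:

```python
# Rough margin check: ground covered in a 4-second window at assumed
# highway speed, versus the distance needed to stop.
MPH_TO_MPS = 0.44704

speed_mph = 65                       # assumed highway speed (not from the filing)
speed_mps = speed_mph * MPH_TO_MPS   # ~29 m/s

reaction_s = 1.5                     # commonly cited driver reaction time
decel_mps2 = 7.0                     # hard braking on dry pavement, ~0.7 g

distance_in_4s = speed_mps * 4                  # covered if nothing changes
reaction_dist = speed_mps * reaction_s          # covered before braking begins
braking_dist = speed_mps**2 / (2 * decel_mps2)  # v^2 / 2a to a full stop

print(f"Distance covered in 4 s: {distance_in_4s:.0f} m")
print(f"Reaction distance:       {reaction_dist:.0f} m")
print(f"Full stopping distance:  {reaction_dist + braking_dist:.0f} m")
```

Under these assumptions the truck covers roughly 116 meters in four seconds, while a full emergency stop (reaction plus braking) needs on the order of 100 meters, leaving almost no room for a late swerve or stop before a fixed barrier.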
Claims by Justine Saint Amour and Legal Response
Saint Amour suffered two herniated discs in her lower back, one in her neck, sprained tendons in her wrist, and numbness and weakness in her right hand. Her infant remained unharmed in the crash.
She’s now suing Tesla for over $1 million through attorney Bob Hilliard. The lawsuit includes 16 allegations of negligent conduct and makes the unusual claim that Tesla was negligent in retaining Elon Musk as CEO.
Hilliard emphasized that his client attempted to take control only after the system was already failing. He argued that by the time Saint Amour recognized the danger and disengaged FSD, it was too late to avoid the collision.
Immediate Reactions from Tesla and Elon Musk
Elon Musk responded to the viral video by posting on X that logs showed the driver disengaged Full Self-Driving four seconds before the crash. Tesla supporters seized on this detail as evidence that the media unfairly blamed FSD for what was technically a manual driving incident.
The company provided no additional context about what the system was doing before disengagement. Tesla hasn’t commented on whether FSD was slowing down or preparing to navigate the curve when the driver took over.
Critics pointed out that the four-second disengagement doesn’t address why the driver felt compelled to intervene in the first place. The timing suggests the system had already failed to recognize or respond appropriately to the road conditions ahead.
The Bigger Picture: FSD, Technology Reliability, and Legal Fallout
Tesla’s advanced driver-assistance systems have become central to mounting legal challenges and federal scrutiny. The technology’s limitations and how drivers interact with it continue to generate controversy across courtrooms and regulatory agencies.
How Full Self-Driving and Autopilot Work
Tesla offers two main driver-assistance systems: Autopilot and Full Self-Driving. Autopilot comes standard on all new Teslas and handles basic tasks like adaptive cruise control and lane keeping on highways.
Full Self-Driving is a paid upgrade that adds more advanced features. The system can navigate city streets, recognize traffic lights and stop signs, and attempt to handle complex intersections. Despite its name, FSD still requires constant driver supervision.
Both systems use cameras and neural networks to interpret road conditions, processing visual data in real time to make driving decisions. Tesla’s current FSD implementation relies solely on camera-based vision, without lidar or radar.
FSD: Supervised Autonomy and Driver Trust
Tesla markets FSD as a supervised system that demands driver attention at all times. The company requires drivers to keep their hands on the wheel and remain ready to take control instantly.
This creates a trust problem. Drivers using FSD often report situations where the vehicle fails to merge or change lanes appropriately, forcing last-second interventions. One driver acknowledged his “big fail” after his Cybertruck crashed into a light post while FSD was active.
The gap between the system’s name and its actual capabilities breeds confusion. Some drivers treat FSD as more autonomous than Tesla’s own legal disclaimers allow, while the company’s marketing language continues to suggest greater capability.
Lawsuits, NHTSA Investigations, and Industry Scrutiny
Tesla faces multiple legal challenges over FSD performance. Saint Amour’s Texas lawsuit, seeking over $1 million in damages, alleges her Cybertruck steered toward the edge of an overpass while in FSD mode. Another case ended in a $243 million verdict against Tesla involving its driver-assistance technology.
The National Highway Traffic Safety Administration has opened investigations into Tesla’s driver-assistance technology following crashes. These regulatory reviews examine whether the systems adequately ensure driver engagement.
Tesla has also faced long-running criticism for publishing misleading crash-safety data. The company recently began releasing what appears to be more candid comparative data, though questions about transparency remain.

