
A Tesla vehicle on the latest Full Self-Driving (FSD) Supervised update suddenly veered off the road and flipped the car upside down – a frightening crash that the driver says he couldn't prevent.
We have seen many crashes involving Tesla's Supervised FSD over the years, but the overwhelming majority of them have a major contributing factor in common: the driver is not paying attention or is not ready to take control.
A typical crash scenario with Tesla FSD is that the car doesn't see an obstacle on the road, like a vehicle, and crashes into it, even though the driver would have had time to react if they had been paying enough attention.
Despite its name, Full Self-Driving (FSD) is still considered a level 2 driver assistance system and is not fully self-driving. It requires drivers to stay attentive at all times and to be ready to take control – hence why Tesla more recently added 'Supervised' to the name.
According to Tesla, the driver is always responsible in a crash, even when FSD is activated.
The automaker has implemented driver monitoring systems to ensure drivers' attention, but it has been gradually relaxing them.
Just today, Tesla released a post on X in which it said drivers just need to "lean back and watch the road" when using FSD:

Sitting back and watching the road was exactly what Wally, a Tesla driver in Alabama, was doing when his car suddenly veered off the road in Toney, Alabama, earlier this year.
Wally leased a brand-new 2025 Tesla Model 3 with FSD and understood that he needed to pay attention. When talking with Electrek yesterday, he said that he would regularly use the feature:
I used FSD every chance I could get. I actually watched YouTube videos to tailor my FSD settings and experience. I was happy it could drive me to Waffle House and I could just sit back and relax while it would drive me on my morning commute to work.
Two months ago, he was driving to work on Tesla Full Self-Driving when his car suddenly swerved off the road. He shared the Tesla camera video of the crash:
Wally told Electrek that he didn't have time to react even though he was paying attention:
I was driving to work and had Full Self-Driving on. The oncoming car passed, and the wheel started turning rapidly, driving into the ditch, side-swiping the tree, and the car flipped over. I didn't have any time to react.
The car ended up flipping upside down from the crash:



Fortunately, Wally only suffered a relatively minor chin injury in the accident, but it was a frightening experience:
My chin split open, and I had to get 7 stitches. After the impact, I was hanging upside down watching blood drip down to the glass sunroof, not knowing where I was bleeding from. I unbuckled my seatbelt and sat on the fabric interior in the middle of the two front seats, and saw that my phone's crash detection went off and told me that first responders were on their way. My whole body was in shock from the incident.
The Tesla driver said that one of the neighbors came out of their house to check that he was okay, and local firefighters arrived to get him out of the upside-down Model 3.
Wally said he was on Tesla FSD v13.2.8 on Hardware 4, Tesla's latest FSD technology. He asked Tesla to send him the data from his car to better understand what happened.
Electrek’s Take
This is where Tesla FSD gets really scary. I get that Tesla admits that FSD can make mistakes at the worst possible moment and that the driver needs to pay attention at all times.
The idea is that if you pay attention, you can correct these mistakes, which is true most of the time, but not always.
In this case, the driver had less than a second to react, and even if he had reacted, it might have made things worse – for example, correcting, but not enough to get back on the road, and hitting the tree head-on instead.
In cases like this one, it's hard to place the blame on the driver. He was doing exactly what Tesla says you should do: "lean back and watch the road."
A very similar thing happened to me last year when my Model 3 on FSD veered to the left, trying to take an emergency exit on the highway for no reason. I was able to take control in time, but it created a dangerous situation as I almost overcorrected into a car in the right lane.
In Wally's case, it's unclear what happened. It's possible that FSD believed it was about to hit something because of the shadows on the ground. Here's the view from the front-facing camera, a fraction of a second before FSD veered to the left:

But that's just speculation at this time.
Either way, I think Tesla has a complacency problem with FSD, where drivers are starting to pay less attention – leading to some crashes. But then there are also crashes like this one, which appear to be 100% caused by FSD, with little to no opportunity for the driver to prevent them.
That's even scarier.
FTC: We use income earning auto affiliate links. More.