
Tesla Full Self-Driving veers off the road and flips the car in a scary crash the driver couldn't prevent

A Tesla vehicle on the latest Full Self-Driving (FSD) update suddenly veered off the road and flipped the car upside down in a frightening crash that the driver couldn't prevent.

We have seen many accidents involving Tesla's Supervised FSD over the years, but the vast majority of them have one important factor in common: the driver wasn't paying attention or wasn't ready to take control.

A common crash scenario with Tesla FSD is that the vehicle fails to see an obstacle on the road, such as a stopped vehicle, and crashes into it, even though the driver would have had time to react had they been paying enough attention.

Despite its name, Full Self-Driving (FSD) is still considered a level 2 driver-assist system and is not fully self-driving. Drivers must remain attentive at all times and be ready to take control – which is why Tesla recently added "Supervised" to the name.


According to Tesla, the driver is always responsible for a crash, even if FSD is activated.

The automaker has implemented driver monitoring systems to ensure drivers pay attention, but it has been gradually loosening them.

Just today, Tesla published a post on X claiming that drivers only need to "lean back and watch the road" when using FSD:

Leaning back and watching the road is exactly what Wally, a Tesla driver in Alabama, was doing when his car suddenly veered off the road in Toney, Alabama, earlier this year.

Wally leased a brand-new 2025 Tesla Model 3 with FSD and understood that he had to stay attentive. Speaking with Electrek yesterday, he said he used the feature regularly:

I used FSD every chance I could get. I actually watched YouTube videos to adjust my FSD settings and experience. I was glad that it could drive me to the Waffle House, and I could just sit back and relax while it drove me to work in the morning.

Two months ago, he was driving to work on Tesla's Full Self-Driving when his car suddenly veered off the road. He shared Tesla camera footage of the crash:

Wally told Electrek that he didn't have time to react even though he was paying attention:

I was driving to work. The oncoming car passed, and the wheel quickly turned, driving into the ditch and clipping the tree on the side, and the car flipped. I didn't have time to react.

The car ended up upside down at the end of the crash:

Fortunately, Wally suffered only a relatively minor chin injury in the accident, but it was a scary experience:

My chin was split open and I had to get 7 stitches. After the impact, I was hanging upside down and watched blood drip onto the glass roof without knowing where I was bleeding from. I unbuckled my seat belt and landed on the interior roof between the two front seats, and saw that my phone's crash detection had gone off and said that first responders were on the way. My whole body was in shock from the incident.

The Tesla driver said that one of his neighbors came out of their house to make sure he was OK, and that local firefighters arrived to get him out of the overturned Model 3.

Wally said he was on Tesla FSD v13.2.8 on Hardware 4, Tesla's latest FSD technology. He has asked Tesla to send him the data from his car to better understand what happened.

Electrek's Take

This is where Tesla's FSD gets really scary. I understand Tesla's position that FSD can make mistakes at the worst moment and that the driver needs to pay attention at all times.

The idea is that you can correct these mistakes, which is the case most of the time, but not always.

In this case, the driver had less than a second to react, and even if he had reacted, he might have made things worse – correcting, for example, but not enough to get back on the road, and instead hitting the tree head-on.

In cases like this, it's hard to blame the driver. He did exactly what Tesla says drivers should do: "lean back and watch the road."

Last year, a very similar thing happened to me when my Model 3 on FSD veered left and tried to take an emergency exit off the highway for no reason. I was able to take control in time, but it created a dangerous situation, as I almost over-corrected into a vehicle in the right lane.

In Wally's case, it's unclear what happened. It's possible that FSD thought it was about to hit something because of the shadows on the road. Here is the view from the front-facing camera a fraction of a second before FSD veered to the left:

But that's only speculation at this point.

I think Tesla has a complacency problem with FSD, where its drivers pay less attention while using it – which leads to some crashes. But there are also scarier accidents caused 100% by FSD that leave drivers little or no chance to react.

Those are even scarier.

FTC: We use income earning auto affiliate links. More.
