Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Model 3 Owner Wrecks in Europe, then blames everyone but himself.


buttershrimp

Click my signature to Go Mad Max Mode
Supporting Member
Jun 17, 2017
3,328
8,924
ATX
I can't take this any longer. A guy posts various videos on Reddit and YouTube testing Autopilot's limits, showing himself not monitoring the situation and doing dangerous things, then goes all over the internet acting sanctimonious and complaining about Autopilot and Tesla. And then he lets himself wreck. I don't understand how anyone at Tesla or on the Autopilot team works on these vehicles day in and day out, just to have some butt muppet blame them for wrecking.
 
Studying the second photo at the bottom (looking back at the accident location), I would say that approaching that bend at 75 mph on Autopilot, rather than "veering to the left", the car struggled to navigate the corner due to speed and continued straight into the barrier. If the driver was expecting the car to make the turn, the car continuing straight on would feel to him like it veered left. I might be wrong, but that's my take on it.
 
From the owner's Facebook page describing the events just prior to the accident...

"The highway in my direction of travel divided at a fork, with the #2 right lane changing into the exit lane, and the #1 left lane remaining the lane for thru traffic. I was travelling in the #1 lane.

My left hand was grasping the bottom of the steering wheel during the drive, my right hand was resting on my lap. The vehicle showed no signs of difficulty following the road up until this fork. As the gore point began, approximately 8m before the crash barrier and end of the fork, my Model 3 veered suddenly and with great force to the right. I was taking a glance at the navigation on my phone, and was not paying full attention to the road. I was startled by the sudden change in direction of the car, and I attempted to apply additional grip onto the steering wheel in an attempt to correct the steering. This input was too late and although I was only a few inches from clearing the crash barrier, the front left of the vehicle near the wheel well crashed into the right edge of the barrier, resulting in severe damage...."

"By looking at my navigation and by not having both hands on the wheel, I was not paying full attention to the road while the vehicle was in Autopilot and was not following Tesla’s directions in regards to the correct use of the software. I want to make it clear that I take responsibility in regards to my actions."
 
The design of EAP at its present stage inherently makes drivers complacent. This is human nature we're fighting here. And in fact this problem will get even worse as EAP improves, because as human intervention becomes rarer, the driver will be even more likely to lose focus.

Tesla seems to be willfully ignoring this (at least publicly), and most fanboys on this forum are only too quick to point the finger at the drivers when things go wrong. But things shouldn't be that simple. Tesla bears responsibility for the "usability" of the feature, and can do more to make it more intuitive, less error-prone, more foolproof.

To put it another way: if "user error" crashes are frequent or likely, then maybe the problem is with Tesla not understanding how real users behave, rather than the fault of all the users lured into making mistakes. That is, the feature is poorly designed and difficult to use correctly. This is not a right/wrong issue, but a question of what could be done better to improve safety. (A similar example: requiring users to press the brake before shifting out of P has prevented a lot of sudden-acceleration mistakes.) I don't think there is an easy solution, however, and I don't have any quick ideas off the top of my head.
 
Tesla has responsibility for the "usability" of the feature, and can do more to make it more intuitive, less error-prone, more foolproof.

That is not a solution to the problem; that is like putting a plaster on a burst jugular. We humans are not made for driving cars to begin with; it is not something that comes naturally to us, so it requires a lot of mental capacity to do without compromising safety too much. The underlying problem is that we let people of poor mental capacity drive heavy machines at high speeds on our roads. This problem is bigger or smaller in different countries depending on how easily a license can be attained, but it exists everywhere.

You can add all the safety features you like, there will always be people dumb enough to crash either way as long as we do not address the actual issues.

Until we get fully self driving cars we will have this problem in some sort of form.
 
From the owner's Facebook page: "I was travelling in the #1 lane. ... As the gore point began, approximately 8m before the crash barrier and end of the fork, my Model 3 veered suddenly and with great force to the right."
Wow. He was in the LEFT lane and the car decided to try to take the exit 8m before the lanes split? Something's not right here.
 
Wow. He was in the LEFT lane and the car decided to try to take the exit 8m before the lanes split? Something's not right here.

Remember he was looking at his phone at the time, so his account of the distance will not be exact at all. If his speed really was 120 km/h, he was traveling at about 33 m/s. So even if he had felt the car steer sharply, stopped looking at his phone, looked out the window, and analysed what he saw there, all in less than half a second, he would have hit the barrier before even finishing that, if it really started turning 8 meters from the lane split. So his account of what happened is clearly not what did happen.

If we give him the benefit of the doubt and say he has exceptional reaction times, I would say we can assume 2 seconds from the veer to him looking and being in control; that is still almost 70 meters traveled at that speed. If he was tired, his reaction time would be even longer.
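For anyone who wants to check the numbers, here is a quick back-of-the-envelope in Python. The 120 km/h speed, the 8 m gore point, and the 2-second reaction time are the figures assumed in the posts above, not measured data:

```python
def distance_traveled(speed_kmh: float, seconds: float) -> float:
    """Distance in meters covered at speed_kmh over the given time."""
    return speed_kmh / 3.6 * seconds

speed = 120.0  # km/h, the speed the owner reported

# Speed in m/s: 120 km/h is about 33.3 m/s
print(round(speed / 3.6, 1))

# Time to cover the 8 m gore point: about 0.24 s
print(round(8 / (speed / 3.6), 2))

# Distance covered during a 2 s reaction time: about 66.7 m
print(round(distance_traveled(speed, 2.0), 1))
```

So even a very generous half-second reaction still overshoots the 8 m gore point by a wide margin, which is the poster's point.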
 
because as human intervention becomes rarer, the driver will be even more likely to lose focus.
So, the total number of accidents and dead humans will:
a) increase
b) stay the same
c) decrease

It can only be one of the above; you have to decide which.
Yes, some people will ignore warnings and stop paying attention. What is important though is how many.

a) If the number of accidents increases then yes, AP made the situation worse. It will be stopped, if not by Tesla then by the government.

b) If it stays the same, then what exactly is your point? The same number of people died, more of them by their own choosing.

c) If the total decreases and you are still beating this horse you actually are wishing for more people to die. Is that really your position? More people need to die therefore the AP must end?

You're missing the forest because you can only see one tree.
 
So, the total number of accidents and dead humans will:
a) increase
b) stay the same
c) decrease

It can only be one of the above; you have to decide which.

I'm not missing the forest. I'm actually firmly in the (c) camp, and a big believer in technology and automation. I use EAP practically every time I drive nowadays.

However, I do think Tesla could be doing more. And more importantly, I think people on this forum should be more sympathetic to drivers who succumb to such mistakes, if it is the case that such technology makes this type of mistake more likely. The title of this thread isn't helping.
 
"A guy posts various videos on Reddit and YouTube testing Autopilot limits ... and then allows himself to wreck. I don't understand how anyone at Tesla or on the Autopilot team works on these vehicles day in and day out, just to have some butt muppet blame them for wrecking."
OMG.
This post speaks volumes. No words. Someday you'll look back on this and be ashamed.
 
He had also been complaining about a lot of noise coming from his suspension every time he turned the wheel more than 10 degrees for the past several days. He didn't get it looked at because he couldn't go to a service center. It's quite possible he suffered a suspension failure rather than AP randomly lurching 30 ft in one direction. I've had AP move around on me, but never by that much. Of course, I stick to areas where they have ADAS maps...
 