I can't take this any longer. A guy posts videos on Reddit and YouTube testing Autopilot's limits, showing himself not monitoring the situation and doing dangerous things... is all over the internet acting sanctimonious, complaining about Autopilot and Tesla, and then allows himself to wreck. I don't understand how anyone at Tesla or on the Autopilot team works on these vehicles day in and day out, just to have some butt muppet blame them for wrecking.
Tesla Model 3 unofficial road trip ends in crash, driver blames Autopilot
And prior to this... Tesla fan: Autopilot glitches brought peril to road trip
Studying the second photo at the bottom (looking back at the accident location), I would say that, approaching that bend at 75 mph on Autopilot, rather than "veering to the left" the car struggled to navigate the corner due to its speed and continued straight into the barrier. If the driver was expecting the car to make the turn, the car continuing straight on would feel to him like it veered left. Might be wrong, but that's my take on it.
From the owner's Facebook page, describing the events just prior to the accident:

"The highway in my direction of travel divided at a fork, with the #2 right lane changing into the exit lane, and the #1 left lane remaining the lane for thru traffic. I was travelling in the #1 lane. My left hand was grasping the bottom of the steering wheel during the drive, my right hand was resting on my lap. The vehicle showed no signs of difficulty following the road up until this fork. As the gore point began, approximately 8m before the crash barrier and end of the fork, my Model 3 veered suddenly and with great force to the right. I was taking a glance at the navigation on my phone, and was not paying full attention to the road. I was startled by the sudden change in direction of the car, and I attempted to apply additional grip onto the steering wheel in an attempt to correct the steering. This input was too late and although I was only a few inches from clearing the crash barrier, the front left of the vehicle near the wheel well crashed into the right edge of the barrier, resulting in severe damage...."

"By looking at my navigation and by not having both hands on the wheel, I was not paying full attention to the road while the vehicle was in Autopilot and was not following Tesla's directions in regards to the correct use of the software. I want to make it clear that I take responsibility in regards to my actions."
The whole idea of Autopilot is to let the driver just do the risk-analysis part. Looking at your phone at a sharp turn at a median while loosely holding the steering wheel with one hand shows this individual is terrible at risk analysis.
Soooo he took a car that is made to drive on North American roads to Europe (where many cars are right-hand drive), against the manufacturer's advice, then crashed it and blamed Autopilot. Really.
Going viral: When YouTube stunts turn deadly
The viral Internet stunts parents should know - CNN
Nuff said.
Didn't know that (haven't visited Europe, but plan to). Thanks for the info. I don't like the idea of taking a car over to Europe when the manufacturer says don't do it.
The design of EAP at its present stage inherently makes drivers complacent. This is human nature we're fighting here. And in fact this problem will get even worse as EAP improves, because as human intervention becomes rarer, the driver will be even more likely to lose focus. Tesla seems to be willfully ignoring this (at least publicly), and most fanboys on this forum are only too quick to point the finger at the drivers when things go wrong. But things shouldn't be that simple. Tesla has responsibility for the "usability" of the feature, and can do more to make it more intuitive, less error-prone, more foolproof. Put another way, if "user error" crashes are frequent or likely, then maybe the problem is with Tesla not understanding how real users behave, rather than the fault of all the users lured into making mistakes. That is, the feature is poorly designed and difficult to use correctly. This is not a right/wrong issue, but a question of what could be done better to improve safety. (A similar example: I know that requiring users to press the brake before shifting out of P has prevented a lot of sudden-acceleration mistakes.) I don't think there is an easy solution, however, and I don't have any quick ideas off the top of my head.
That is not a solution to the problem; that is like putting a plaster on a burst jugular. We humans are not made for driving cars to begin with; it does not come naturally to us, so it requires a lot of mental capacity to do without compromising safety too much. The underlying problem is that we let people of poor mental capacity drive heavy machines at high speeds on our roads. This problem is bigger or smaller in different countries depending on how easily a license can be attained, but it exists everywhere. You can add all the safety features you like; there will always be people dumb enough to crash either way, as long as we do not address the actual issues. Until we get fully self-driving cars, we will have this problem in some form.
Wow. He was in the LEFT lane and the car decided to try to take the exit 8m before the lanes split? Something's not right here.
Remember he was looking at his phone at the time, so his account of the distance will not be exact at all. If his speed is correct at 120 km/h, he was traveling at about 33 m/s. So even if he had felt the car steer sharply, stopped looking at his phone, looked out the window, and analysed what he saw, all in under a quarter of a second, he would have hit the barrier before even finishing that (if it started turning 8 meters from the lane split, that is), so his account of what happened is clearly not what did happen. If we give him the benefit of the doubt and say he has exceptional reaction times, I would still assume a 2-second reaction time from the veer to him looking and being in control; that is almost 70 meters traveled at that speed. If he was tired, his reaction time would be even longer.
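For anyone who wants to sanity-check those numbers, here's a quick back-of-the-envelope sketch in Python. The 120 km/h speed and 8 m distance come from his own account; the 2-second reaction time is just my assumption, not a measured figure:

# Back-of-the-envelope check of the timing in the owner's account.
speed_kmh = 120.0                       # claimed Autopilot speed
speed_ms = speed_kmh * 1000 / 3600      # ~33.3 m/s

distance_to_barrier_m = 8.0             # claimed distance when the car veered
time_to_barrier_s = distance_to_barrier_m / speed_ms

assumed_reaction_time_s = 2.0           # generous assumption, not from his account
distance_during_reaction_m = assumed_reaction_time_s * speed_ms

print(f"Speed: {speed_ms:.1f} m/s")                                  # 33.3 m/s
print(f"Time to cover 8 m: {time_to_barrier_s:.2f} s")               # ~0.24 s
print(f"Distance covered in 2 s: {distance_during_reaction_m:.0f} m") # ~67 m

So at that speed the car covers the claimed 8 meters in roughly a quarter of a second, and any plausible reaction time eats up tens of meters of road.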
So, the total number of accidents and dead humans will:

a) increase
b) stay the same
c) decrease

It can only be one of the above, and you have to decide which. Yes, some people will ignore warnings and stop paying attention. What is important, though, is how many.

a) If the number of accidents increases, then yes, AP made the situation worse. It will be stopped, if not by Tesla then by the government.
b) If it stays the same, then what exactly is your point? The same number of people died, more of them by their own choosing.
c) If the total decreases and you are still beating this horse, you are actually wishing for more people to die. Is that really your position? More people need to die, therefore AP must end?

You are missing the forest because you can only see one tree.
I'm not missing the forest. I'm actually firmly in the (c) camp, and a big believer in technology and automation. I use EAP practically every time I drive nowadays. However, I do think Tesla could be doing more. And more importantly, I think people on this forum should be more sympathetic to drivers who succumb to such mistakes, if it is the case that such technology makes this type of mistake more likely. The title of this thread isn't helping.
My favorite quote from Xue, from the link above: “I’ll confess that I have fallen asleep on the road at least 15 to 25 times on the journey,” Xue said. SMH SMH SMH
He had also been complaining for the past several days of a lot of noise coming from his suspension every time he turned the wheel more than 10 degrees. He didn't get it looked at because he couldn't go to a service center. It's quite possible he suffered a suspension failure rather than AP randomly lurching 30 ft in one direction. I've had AP move around on me, but never by that much. Of course, I stick to areas where they have ADAS maps...