Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Model 3 Owner Wrecks in Europe, then blames everyone but himself.

Except... he was there and he's stating it as fact... But you're not believing him?! Somehow you think your doubts are more reliable than his first-hand experience?

He may be mistaken, but you're far, far more likely to be mistaken, by your own standard. And by my standard, which is based on what we've seen from similar past accidents.

I based my post on what he has publicly written, and there's nothing wrong with that. You seem very willing to cherry-pick what you like to hear from him. Things like suspension problems, or looking at his phone, are believable. But when he criticizes AP, he's suddenly no longer credible? I'm sorry, but I'd rather give him more benefit of the doubt.

He is either lying or has an incorrect account of what happened. He claims that he was able to react to the veering and even tried to save it, all in 0.24 seconds. That is faster than most people can react to a flashing light they are watching and waiting for, yet he claims that in that same time he not only looked back at the road but also tried to save the car. More likely the car didn't start veering 8 meters from the barrier but 70-100 meters before it, something he could have corrected if he had simply kept his eyes on the road.
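The 0.24-second claim can be sanity-checked with simple arithmetic. A minimal sketch in Python, assuming a highway speed of roughly 110 km/h (the post does not state the actual speed, so that figure is an assumption for illustration):

```python
# Back-of-envelope check of the 0.24 s claim. The speed is an assumed
# value, not a figure from the accident report.
speed_kmh = 110                 # hypothetical highway speed
speed_ms = speed_kmh / 3.6      # ~30.6 m/s

def time_to_cover(distance_m: float, speed_ms: float) -> float:
    """Seconds available before reaching a point distance_m ahead."""
    return distance_m / speed_ms

t_8m = time_to_cover(8, speed_ms)      # veer noticed 8 m from the barrier
t_70m = time_to_cover(70, speed_ms)    # veer starting 70 m before it
t_100m = time_to_cover(100, speed_ms)  # veer starting 100 m before it

print(f"8 m   -> {t_8m:.2f} s")   # ~0.26 s: at the limit of human reaction
print(f"70 m  -> {t_70m:.2f} s")  # over 2 s: recoverable if watching the road
print(f"100 m -> {t_100m:.2f} s")
```

At anything like highway speed, 8 meters passes in about a quarter of a second - no time to perceive, decide, and act - while 70-100 meters gives two to three seconds, which an attentive driver could use to correct.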

Nobody is claiming Autopilot is perfect, because it isn't, and keeping your eyes on the road as it splits is a given for anyone who knows anything about it. Then again, this guy falls asleep behind the wheel and drives in the middle of the night with no lights on, so reckless driving is not unknown to him. I hope the police take his license so he can't keep making the roads unsafe.

It doesn't matter if he did this in a Tesla, Volvo, Mercedes, or Audi; I would call him on his BS either way.
 
Wow. Horrible experiences. Guess I'm still 'on the fence' about full autonomous driving.

Who's at fault? The responsibility I see Tesla owning is in using the term "AUTOPILOT". It's NOT like we're piloting a Boeing 767 at 35,000 feet with no traffic or obstructions within five miles. It's a driver-assist feature - ONLY. Drivers must never lose sight of the fact that they're operating a 5,000 lb machine at a high rate of speed, nor the fact that they may be carrying passengers or pets.

So what should Tesla engineers do, wire 480 VDC to the driver's seat and apply it momentarily to the driver's butt when they take their hands off the wheel and ignore the flashing instrument cluster warning?!? Yes, I've experienced the 'hunting' the car does when lane markers disappear, but I kept my hands on the wheel and took over control immediately. It's ultimately the driver's DUTY to DRIVE the car - responsibly. The responsibility is mine, not the Tesla engineers'.

Not every state has adopted national road-marking conventions (e.g., dashed lines at highway exits/entrances). There are places where damaged crash barriers haven't been replaced, or road repairs have been made but lane lines haven't been repainted yet. Some DOTs have even used two different surface materials side by side (like the X crash site in Mountain View, CA) - concrete right next to asphalt - which can confuse AP software. Is the local DOT at fault? Maybe, but putting your life in the hands of your car's computer in places like these is simply asking for trouble.

Do I use it? I use 'auto-steer' and 'speed control' frequently on road trips, but only on divided, limited-access highways with clear lane markings. I also remain focused on the road to anticipate any required maneuvers well in advance. The phone stays in the console cradle and I never let it distract me (luckily, I usually have my wife co-piloting for us). The time I spend waiting at SCs is the time I use Waze, G-maps, or DOT web sites to review the next leg of the trip, looking for construction ahead, confusing 'spaghetti-bowl' interchanges, good food stops, etc.

The best thing I can say is - no one ELSE has yet died because of experimentation by these 'crash-test-dummies'.

K. I'm done ranting. Fire away...
 
There was nothing wrong with his initial statement. From everything we've seen so far, it is quite likely that AP did not handle the fork in the road correctly. This is consistent with what happened in the fatal accident in California, and with videos posted by others after that accident.

If you read his full statement, he was very clear that he admits #2, and that he takes responsibility. I find that a lot more responsible and nuanced than much of the armchair commentary on this forum, because he gives credit where it's due but is not shy about the problems either.

I’m certainly not a fanboy/“Tesla is always right” person. My points all along are that people are complacent and then blame AP *FIRST*. No personal responsibility. AP did not correctly handle the situation, but it’s not advertised to do so. That’s why it’s learning.
It is indeed very similar to the Cali accident because guess what? AP is not FSD/L5. Again, You You admits things after the initial post and Tweet to Elon just stating that AP malfunctioned.
I’m not armchair QBing anything. Just stating the same thing I would in the Fire Truck accident as well as the Cali accident. Using your cell? Hands off the wheel for XX seconds in rush hour traffic? What about that guy in the UK that swapped seats while AP was engaged?
In fact, I’d respond in kind in those Summon accidental-dent threads. Highly unfortunate incidents, but people knowingly take the risk, so wanting Tesla to pay for the repairs is ludicrous.
Like I said prior, I really think we are heading down a path where AP will be disabled. Either that or owners will have to take an online class to certify they understand the limitations and sign a new waiver to protect them from themselves.
 
Except... he was there and he's stating it as fact... But you're not believing him?! Somehow you think your doubts are more reliable than his first-hand experience?

He may be mistaken, but you're far, far more likely to be mistaken, by your own standard. And by my standard, which is based on what we've seen from similar past accidents.

I based my post on what he has publicly written, and there's nothing wrong with that. You seem very willing to cherry-pick what you like to hear from him. Things like suspension problems, or looking at his phone, are believable. But when he criticizes AP, he's suddenly no longer credible? I'm sorry, but I'd rather give him more benefit of the doubt.

No, I don’t believe him. I believe he thinks he knows what happened, but I don’t take much of what he said (as reported in the links) as credible, Tesla enthusiast or not. There were too many unbelievably bad choices made on his part. This calls his judgement into question, and his decision to consult his phone on an unknown road, approaching a lane split into a turn, without properly paying attention to the road, is inexcusable.

The important fact that was mentioned was the noise heard earlier from the suspension. That, coupled with how sharply he said the vehicle veered to the right, is not consistent with AP as I know it, at least on a Model S. While I don’t own a 3, I’m sure there’s little difference in that respect. This seems more likely to have been a suspension failure, at least to me - based on the facts he presented.

The driver would have had little time to react at the speed he was traveling and the distance that was stated. A quarter of a second, while not looking at the road and unaware the car was starting to veer right, is not enough time to correct.

Properly holding the wheel - which is not the same as resting a hand at the bottom of it - would have made this recoverable: with his eyes on the road, feeling the wheel start to turn and simply gripping it tightly would cause AS to disengage. Had that been done, the car should not have moved out of his lane if it were under the control of AS rather than suffering a suspension failure.

Not cherrypicking at all. Just evaluating based on the more important facts he mentioned.
 
From everything we've seen so far, it is quite likely that AP did not handle the fork in the road correctly. This is consistent with what happened in the fatal accident in California, and with videos posted by others after that accident.

Yes, he didn't take corrective action in time to prevent the accident, and ultimately it is his responsibility and his fault. But these are two separate issues that people need to keep their heads clear about:
1) AP incorrectly swerved at the fork
2) You You did not save the car in time

AP, not knowing where the driver intended to go, could have "correctly" taken the exit while the driver grabbed the wheel trying to stay to the left and swerved into the barrier. It's possible AP on its own would have taken the exit without any accident.
 
Read his latest post and poll. A senior Tesla manager and multiple national media outlets have been in contact. He just scored, IMO. He will lawyer up (or at least he will next be contacted by some).
I’ll bet he becomes the first PM3 owner free of charge... to continue spreading the word...
 
Read his latest post and poll. A senior Tesla manager and multiple national media outlets have been in contact. He just scored, IMO. He will lawyer up (or at least he will next be contacted by some).
I’ll bet he becomes the first PM3 owner free of charge... to continue spreading the word...

Great. He can continue what he's been doing even faster.
 
The driver is a risk taker. He has posted videos of using AP at night with the headlights off, commented about falling asleep with AP running, and ignored the sounds of something wrong with the suspension on a car that has had moderate suspension damage in the past. On top of that, his statements of what he claims happened don't jibe with any failure mode in the AP system that anyone else has ever noted. Everyone here knows that if anyone ever reported their car veering so hard to one side that they could not overcome the steering change, every naysayer in the world would be all over it. And anyone who has ever used AP knows how little force it takes to kick the car out of autosteer mode.
 
The fanbois are thick in this thread, not surprisingly. It's a race to the bottom as to which is worse - people with their heads in the sand when not blinded by optimism, or people pushing the envelope of safety and common sense with their cars. Neither is particularly helpful to Tesla in the end.

Bad drivers are not unique to Tesla. What's unique to Tesla is (their flavor of) AP. And by the way, there is no EAP in public release at this time, unless of course you consider the E to be silent.

If you haven't had the car with AP2 either jerk without warning to the left toward a median, or slow down for no visibly-discernible reason (phantom braking), or get confused in some other manner whilst upon a perfectly good highway, then you've been fortunate.

Doesn't mean it hasn't and doesn't happen to others. *All* of those scenarios happened to me during a recent 3,000-mile round trip from SoCal to South Dakota and back. They happened before that, and they'll happen tomorrow and next week and next month. Yes, I have been current with the latest firmware, most recently *.18*eee and now *.20*ff9.

It also doesn't mean that AP is not still capable of providing a safer experience than good old-fashioned manual driving (aka what the GUM* sadly experience on a daily basis - oh, the horror).

Yes, Virginia, both scenarios can co-exist - even in today's divided society.

Yes, fanbois, Tesla still leads the pack and yes, their employees work so hard every day.

Yes, moderate folks, there are both reasons to be concerned, and reasons to rejoice.

And yes, skeptics/shorts, occasionally drivers who push the envelope are going to get stung by the laws of probability. Put another way, if you push your daily schedule to the point that you're nodding off 10x/day, it's entirely possible that one of those inattentive moments one of these months or years will indeed intersect with a sub-optimal episode of AP behavior and then you'll be well and truly fooked.
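The "laws of probability" point can be made concrete with a toy calculation. Every number below is a made-up assumption for illustration (the anomaly rate in particular is hypothetical, not a measured Tesla figure):

```python
# Toy model: how often do inattentive moments coincide with AP misbehavior?
# All rates are invented assumptions, not real-world statistics.
inattentive_moments_per_day = 10      # e.g. nodding off 10x/day, per the post
p_anomaly_per_moment = 1 / 2000       # assumed chance AP misbehaves during
                                      # any given inattentive moment

# Probability that at least one inattentive moment lines up with an AP
# anomaly on a given day, assuming the events are independent:
p_bad_day = 1 - (1 - p_anomaly_per_moment) ** inattentive_moments_per_day

expected_days = 1 / p_bad_day
print(f"~1 coincidence expected every {expected_days:.0f} days "
      f"(~{expected_days / 365:.1f} years)")
```

Under these invented rates a coincidence is expected within months, not decades - which is the point: rare-times-rare still happens to someone who rolls the dice every day.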

That's the beauty and the curse of today's AP, which I try to refer to as DriverAssist. For 99.x% of the time, it'll reduce stress and react as well or better than you can during an extended drive - including upon surface streets (gasp). However, your task, role, and, dare I say, responsibility, whether you iz a fanboi, a short, or somewhere in between, is to remain (at least) minimally vigilant ONE HUNDRED PERCENT OF THE TIME.

Those two words, "minimally vigilant" I suspect will be the subject of discussion well past AP and EAP into the realm of FSD for years to come. After all - it was none other than Elon himself who suggested having a bit of a nap during a transcontinental trip would be possible someday (yes, in the driver's seat). Personally, I believe we'll have people on Mars first. But I also look forward to being pleasantly surprised at the accelerated timeline for both scenarios, and I also believe Elon will be first with both (Sorry, Richard. Sorry, Jeff. Not so sorry, GM, Ford, VW Group, Mercedes, BMW, Toyota, Nissan, Subaru, and Volvo.)


* GUM - Great unwashed masses
 
Well put, Tao.
I have to add that Tesla cannot solve the AP problem on its own. Infrastructure has to be agreed on and implemented - probably not all of us driving on the same side, but at least the elimination of conflicting traffic controls from place to place.
 
I can't take this any longer. A guy posts various videos on Reddit and YouTube testing Autopilot's limits, showing himself not monitoring the situation and doing dangerous things... is all over the internet acting sanctimonious, complaining about Autopilot and Tesla. And then he allows himself to wreck. I don't understand how anyone at Tesla or on the Autopilot team works on these vehicles day in and day out, just to have some butt muppet blame them for wrecking.

I think this person has also admitted to falling asleep while on Autopilot. I am sure it has been mentioned already but I only read the first post.
 
Looking at your phone at a sharp turn at a median while loosely holding the steering wheel with one hand shows this individual is terrible at risk analysis.

I don't see how you can define how humans are supposed to figure that out (in the context of what AP may or may not decide to do, unexpectedly).

Do you ever look at the dash? Are all those instances safe? Do you talk to a fellow passenger? Listen to the radio? Aren't those things distracting?

I've looked at the dash on a straight road with a generous distance to the cars in front, which then slowed down unexpectedly, and I only looked up because of sudden braking by AP (which, on that occasion, did not need any input from me). Without AP I probably would have rear-ended the car in front.

What about all those times when AP jumps on the brakes for no reason whatsoever (that I can figure out)? If I had not been vigilant and immediately pressed the accelerator, the traffic behind me - not seeing anything untoward ahead, and definitely not expecting me to jump on the brakes - might well have run into the back of me ... so I could conclude that there is no safe time to look at the dashboard ...

I have never had a "veer off the road" style intervention from AP, and (following on from that) I think the biggest problem with AP is the over-confidence that it builds. These weird, potentially fatal, one-off events are probably statistically one-in-a-100-car-ownerships, or maybe even once-in-1000-ownerships. But they do happen to someone; it might be either of us today ...

He threw himself into the barrier.

... or maybe not. I'm inclined to think that in the fatal accidents we have read about there appears to have been plenty of time to react, but the driver didn't, so I draw the conclusion that the driver was not paying attention. But if You You is correct, and it was a "sudden and violent veering", then maybe that's what happened in the fatal accidents too ...

Like the stick jam that occurred on an aircraft at low altitude, such that recovery was not possible; the accident investigation only figured it out when it happened to a crew at higher altitude who were able to recover, land, and present the (damaged) aircraft for inspection.

I increasingly think that Tesla needs to do something to marry this brilliant, accident-reducing, stress-reducing technology with the increasing over-confidence (or poor understanding of correct usage) of drivers.

I always, absolutely 100% of the time, drive with one hand on the wheel when using AP. I similarly 100% don't understand people who are content to drive hands-on-lap. If this once-in-Nnnn-ownerships issue happens, even with eyes on the road, reacting will take longer, possibly with fatal consequences.

So why does Tesla actually allow hands-off-wheel, when it says "Keep your hands on the wheel" every time I engage AP? I do understand that people WANT it to be hands-off, and even that they might treat it as such (even though I don't, and won't ...), but if the technology isn't quite there yet, why allow this state to continue? As soon as hands-on-lap, or even text-while-driving, is safe, then great - just fire the update down the OTA.
 
I don't see how you can define how humans are supposed to figure that out (in the context of what AP may or may not decide to do, unexpectedly).

Do you ever look at the dash? Are all those instances safe? Do you talk to a fellow passenger? Listen to the radio? Aren't those things distracting?

I've looked at the dash on a straight road with a generous distance to the cars in front, which then slowed down unexpectedly, and I only looked up because of sudden braking by AP (which, on that occasion, did not need any input from me). Without AP I probably would have rear-ended the car in front.

What about all those times when AP jumps on the brakes for no reason whatsoever (that I can figure out)? If I had not been vigilant and immediately pressed the accelerator, the traffic behind me - not seeing anything untoward ahead, and definitely not expecting me to jump on the brakes - might well have run into the back of me ... so I could conclude that there is no safe time to look at the dashboard ...

I have never had a "veer off the road" style intervention from AP, and (following on from that) I think the biggest problem with AP is the over-confidence that it builds. These weird, potentially fatal, one-off events are probably statistically one-in-a-100-car-ownerships, or maybe even once-in-1000-ownerships. But they do happen to someone; it might be either of us today ...

Risk analysis does not have to be about predicting exactly what might happen; in the case of AP, it can be about identifying scenarios where AP might have trouble.

I am not looking at the dash without knowing what is ahead of me, the same way I wouldn't in a non-AP car. If I am on a decently straight highway with good visibility and nothing in front of me, sure, I might fix something in the navigation for a few seconds before looking at the road again. I always keep enough grip on the wheel to be able to counteract AP at any time. Part of the time I look away is spent glancing at the navigation to see what is ahead if I am traveling on unfamiliar roads.

Risk analysis is not about making things safe; it is about identifying risks so you are aware of them. If he had done his risk analysis, he would not have been using his navigation while getting close to a road split - he would have seen it coming and stayed prepared.
 
One thing that concerns me is that I tried to find examples of other manufacturers' driving aids crashing cars into things, and so far I have not been successful. Is it really only Tesla's system? If so, why? Waymo had a few well-documented crashes, Uber the fatal one. But what about Cadillac, MB, Audi and all the others? I doubt their capabilities in this area are better, so why don't they crash? Am I missing something?
 
One thing that concerns me is that I tried to find examples of other manufacturers' driving aids crashing cars into things, and so far I have not been successful. Is it really only Tesla's system? If so, why? Waymo had a few well-documented crashes, Uber the fatal one. But what about Cadillac, MB, Audi and all the others? I doubt their capabilities in this area are better, so why don't they crash? Am I missing something?
I test drove the Honda system. It would not keep between the lanes well enough to give me any faith in it, so the driver must comply with the hands-on-wheel requirement.
 
If I am on a decently straight highway with good visibility and nothing in front of me, sure I might fix something with the navigation or so for a few seconds before looking at the road again

Sorry, I probably wasn't clear: that's exactly the sort of situation where I get sudden, unexplained braking - so I'm not sure it's any less of a "risk" of something unexpected happening than at the gore point.

However, I made note of your point and will take into account that the consequence of AP doing something unexpected on a clear straight road is much less likely to be dire than on a curve / gore point etc., and thus will adjust my driving style accordingly.

I've made several such AP-driving-style adjustments over the years from reading this forum. What of drivers who don't read the forum and gain none of this education, I wonder ...