Phone-Distracted Driver on Autopilot Slams into Firetruck

The text is full of ambiguity. They call it "beta", but then say it's better than beta?
"To be clear, when we say "beta", we do so to encourage a higher level of vigilance. If this were PC desktop or mobile software, we would not refer to it as such."
And all the crap about fleet learning.

They should really have only one line in that disclaimer:
Autopilot and TACC are experimental at this point; pay closer attention than when driving yourself, as the system may take actions an attentive driver usually would not.

Note that, as indicated, that was from early 2017; in fact, it was from the first time Tesla turned the AS function on.

AP2 has progressed exponentially IMHO since the early days. Your personal opinions are, of course, yours.
 
No, it's not enough. If people still don't get it, it's not enough.
You can't take seemingly intelligent people by the hand and educate them about the current state of the technology unless you want to sit beside them as they drive and teach common sense mile after mile. Guidelines and experience should be good teachers for anyone interested in driving safely.

People have close calls all the time, yet still do stupid things behind the wheel. We all see it every time we drive.
 
Actually, if the AP camera detects the vehicle in the lane, then it will absolutely stop even when the radar never saw it. The trouble is that the camera is not so hot at detecting vehicles in the lane, especially at sufficient stopping distances at high speed. It seems likely that it is especially bad at recognizing fire trucks. In my experience it's not too bad at detecting cars with lit brake lights; I'd guesstimate that it gets them 80% of the time in my AP1 car.

But it is bad enough at recognizing stationary cars that nobody should ever expect to rely on it.

Couldn't agree more.

I also have a Volvo whose adaptive cruise control works fine except for stationary cars. In those situations my wife just can't trust the Volvo and always brakes on her own.

While I don't brake with the Tesla, I keep my foot hovering over the pedal in case it doesn't brake. I had to brake myself a couple of times a few months ago, but in the last couple of months I've never needed to.
 
The word "Autopilot" by definition is the issue. I have AP1 and use it all the time, but as an enhanced cruise control. The Autopilot name implies the car will drive itself. While certainly it can "drive itself" it cannot be blindly trusted. I was in 1 month old, latest software AP2x loaner car and the car randomly slammed on the brakes going under an overpass. Totally clear day, but I was on the highway, going 75 mph. If someone was behind me, I would have caused an accident. Tesla needed and still needs MobileEye or a similar partner to get the system working better...Until then, Tesla owners need to be aware of limitations and realize we are part of a Beta test for this new technology.
 
You can't take seemingly intelligent people by the hand and educate them about the current state of the technology unless you want to sit beside them as they drive and teach common sense mile after mile. Guidelines and experience should be good teachers for anyone interested in driving safely.

People have close calls all the time, yet still do stupid things behind the wheel. We all see it every time we drive.

My point has nothing to do with common sense. Your own guesstimation put the number of times this limitation has had to be explained on this board at "957,000". If, after all those explanations in addition to the clear language in the on-screen manual in the car, people still come into these threads after crashes like this proclaiming that their car can do something it cannot, then a logical conclusion is that the message isn't being communicated effectively enough.

My guess is that most Tesla drivers don't use message boards that frequently, and most drivers don't read manuals. But most drivers did have a conversation with a Tesla salesperson and/or delivery specialist before getting behind the wheel of their car. Having a spiel that all salespeople and delivery specialists could give emphasizing some of the limitations of AP would be far more effective. The salespeople could be more reactive to questions about AP, and the delivery specialists could make it part of the standard things they go over with the customer when turning over the keys.
 
You’re assuming that the problem is that these people don’t get it. Do you also believe that everyone who, say, gets a speeding ticket just didn’t realize they had to stay below a set speed?

Agreed - there is a subset of folks who already text and drive in regular cars, so "getting" how AP works isn't going to help them out. I rode for 5 hours on a company road trip with 6 of us in a Suburban once. The fellow who insisted on driving was also the one who insisted on trying to use his email on his phone WHILE he drove. Wouldn't give the phone to someone else to email for him, and wouldn't give up the driver's seat either (as we FULLY tested the Lane Keep Assist on the Suburban while we wove around as he e-mailed and drove). Good times.

That is the sort of person who would nod and agree during an "AP training session", then proceed to misuse the system anyway. And I guess there is an argument that he would actually be safer 90% of the time even misusing AP by letting it drive while he texts, since he is going to do it anyway, AP or not.
 
My point has nothing to do with common sense. Your own guesstimation put the number of times this limitation has had to be explained on this board at "957,000". If, after all those explanations in addition to the clear language in the on-screen manual in the car, people still come into these threads after crashes like this proclaiming that their car can do something it cannot, then a logical conclusion is that the message isn't being communicated effectively enough.

Huh? Not sure where you cite guesstimates attributable to me, as I made no quantitative comments. Perhaps you are confusing me with someone else...

And I don't agree with you. It is absolutely about common sense. "Don't do this, it may fail and you or others could be seriously injured or die" (paraphrased and implied from the disclaimer and manual), and you do it regardless? Clearly, common sense took a back seat to natural selection.

There are many examples on TMC of people lacking common sense in the things they do to game the system, or posting videos of themselves passing long lines of slower traffic at 155 MPH just to see if they could hit the advertised limit. I doubt a delivery specialist spiel will help.

I agree with Az_Rael's post: they'll just nod their heads, then go do stupid things.
 
You’re assuming that the problem is that these people don’t get it. Do you also believe that everyone who, say, gets a speeding ticket just didn’t realize they had to stay below a set speed?

No. But I believe the person that comes in here and states that their Model S's Autopilot/TACC will stop for a car that was stationary in the road from the moment it came into view genuinely believes it. Why would they lie about that?
 
Huh? Not sure where you cite guesstimates attributable to me, as I made no quantitative comments. Perhaps you are confusing me with someone else...

And I don't agree with you. It is absolutely about common sense. "Don't do this, it may fail and you or others could be seriously injured or die" (paraphrased and implied from the disclaimer and manual), and you do it regardless? Clearly, common sense took a back seat to natural selection.

There are many examples on TMC of people lacking common sense in the things they do to game the system, or posting videos of themselves passing long lines of slower traffic at 155 MPH just to see if they could hit the advertised limit. I doubt a delivery specialist spiel will help.

I agree with Az_Rael's post: they'll just nod their heads, then go do stupid things.


I confused "SomeJoe" who I originally quoted with the "Joe" who started responding to me. Mea culpa.

As "SomeJoe" noted, it's common for people to come in here and show that they legitimately believe their car is capable of doing something it is in fact not capable of. Clearly, they haven't received the message.
 
No. But I believe the person that comes in here and states that their Model S's Autopilot/TACC will stop for a car that was stationary in the road from the moment it came into view genuinely believes it. Why would they lie about that?

For one, they may not be able to see better than the radar. A car that far ahead may look stopped to the driver yet still be moving enough that the radar can discern a speed difference as it approaches the "stopped" car. Just because its brake lights are on doesn't mean it's at a full stop. Perhaps.
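As a rough illustration of that point, here is a toy Python sketch. The speed cutoff and the "ignore near-stationary returns" behavior are my assumptions about how radar-based cruise systems are commonly described, not anything specific to Tesla's implementation; the idea is simply that a car that looks stopped to the driver may still have enough ground speed for the radar to keep treating it as a moving target.

```python
# Toy sketch, not Tesla's implementation: why a lead car that "looks stopped"
# to a driver may still be visible to a radar-based cruise system.
# The cutoff value and the "ignore near-stationary returns" behavior are
# assumptions, used here only for illustration.

OWN_SPEED_MPH = 65.0           # our own speed
GROUND_SPEED_CUTOFF_MPH = 2.0  # hypothetical "treat as stationary clutter" cutoff

def radar_tracks_lead(closing_speed_mph: float) -> bool:
    """Return True if the lead return would be treated as a moving target."""
    lead_ground_speed = OWN_SPEED_MPH - closing_speed_mph
    return lead_ground_speed > GROUND_SPEED_CUTOFF_MPH

# Brake lights on but still creeping at 5 mph: looks "stopped" to the eye,
# yet its ground speed is nonzero, so the radar can keep tracking it.
print(radar_tracks_lead(closing_speed_mph=60.0))   # True

# Truly parked (0 mph): by speed alone it is indistinguishable from
# stationary roadside clutter such as signs and overpasses.
print(radar_tracks_lead(closing_speed_mph=65.0))   # False
```

Once the lead car really does come to a full stop, it falls into the same bucket as roadside clutter, which is consistent with these systems struggling with already-stopped vehicles.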

As "SomeJoe" noted, it's common for people to come in here and show that they legitimately believe their car is capable of doing something it is in fact not capable of. Clearly, they haven't received the message.

Well, my MS can cure depression. Fact. :)
 
1) AEB and AP use the same hardware. If AP doesn't "see" a stationary object, AEB won't either. However, it is certainly possible that somewhere between a gentle stop and a collision the car decides there is a stationary object and activates AEB, or AP brakes hard. Do we even know if AEB is active when TACC is active? Theoretically there's no need for it. I'm not surprised there's no evidence of AEB acting in these cases.

2) It's not true that AEB/AP can't detect stationary objects; there's just a significant probability that it won't. And any car depending on radar in particular will have the same problem: the road itself is a stationary object that the radar will see, and the radar is pretty low resolution. Vision just doesn't seem to be there yet, and maybe hasn't been trained on fire trucks?

3) The flip side of improving braking for stationary objects is the need to reduce instances of phantom braking. AP currently has a problem recognizing 100% of stationary objects, but it also has a problem braking for non-existent objects. Lowering the threshold so it recognizes more stationary objects would result in more phantom braking events, and vice versa (a toy sketch of that tradeoff follows below). Tesla needs to work on better recognition algorithms/vision to improve both problems at the same time. Humans can do it, so there's no reason AP can't, but it will obviously take more time.
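Purely to illustrate the threshold tradeoff in 3), here is a minimal Python sketch. The confidence scores, thresholds, and distributions are invented numbers, not anything from Tesla's actual stack:

```python
# Toy sketch of the detection-threshold tradeoff described above.
# None of this is Tesla's actual code; the scores and numbers are invented.

import random

random.seed(0)

# Hypothetical confidence scores a perception stack might assign to
# "stationary obstacle in my lane". Real obstacles tend to score higher,
# clutter (overpasses, signs, stopped cars in adjacent lanes) lower,
# but the two distributions overlap.
real_obstacles = [random.gauss(0.70, 0.15) for _ in range(1000)]
clutter = [random.gauss(0.40, 0.15) for _ in range(1000)]

def should_brake(score: float, threshold: float) -> bool:
    """Brake whenever the perceived obstacle confidence exceeds the threshold."""
    return score >= threshold

for threshold in (0.3, 0.5, 0.7):
    missed = sum(not should_brake(s, threshold) for s in real_obstacles)
    phantom = sum(should_brake(s, threshold) for s in clutter)
    print(f"threshold={threshold:.1f}  "
          f"missed stationary obstacles: {missed / len(real_obstacles):5.1%}  "
          f"phantom braking events: {phantom / len(clutter):5.1%}")

# Lowering the threshold reduces misses but raises phantom braking, and
# vice versa. Only a classifier that separates the two score distributions
# more cleanly (better recognition / vision) improves both at once.
```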
 
The Utah case is a perfect example of the problem: 60 MPH, attention focused on a cellphone, hands off the wheel. Slams into stopped traffic.
I hope they press charges in this case. There is no grey area to me.
Some penalty, loss of license.

FSD does not exist. The driver is responsible. This is a serious thing: there are now a lot of cars with various gadgets in them, and people have to be held responsible.

They will piss and moan, whine and cry "I want my nanny", and arrive with lawyers and demands for cash. SSDD.

Does not change the fact that the driver is responsible for what happens.
Has to be held responsible.

Imagine for a moment it is you, or a loved one hurt or worse.
There has to be a line drawn, and I think it needs to be drawn now.
Anyone able to shake some trees in Utah?
They generally don't play around with stupid; there is a good chance of some common-sense intervention there.
Tell them you want something done.

The line has to be drawn and a serious message sent out: you are in that driver's seat, you are responsible for what happens, period.
/R
 
I’m in total agreement with you that features should be easy to use and understand.

But that might be more challenging than either of us realizes. Even when they know better, people do stupid things.

I have no doubt that it's challenging. But it's also part of designing a product that is ready for public use. If a convenience feature (like AS) isn't easy to use and understand, and through misuse can put the driver (and others on the road) at risk, that convenience feature needs more work. AS shouldn't have been released until that work was completed.

Releasing a product with hard-for-the-user-to-predict behavior and basically just a disclaimer to "always be careful" should not have been an option. A legal disclaimer is very different from a set of instructions on proper use. And even Elon has been demonstrating the 3 on national television with his hands off the wheel. If even he ignores the disclaimer, there's something really wrong with the way this has been released.