Reported Autopilot accident

i simply don't understand: what is so hard/user-unfriendly about checking the blue steering wheel icon to see whether autopilot is active or not?

It's second nature. You assume that AP is available, you double pull the cruise stalk, you hear bing bong (which means it failed, instead of bong bong, which means it succeeded, but you're thinking about something else so they sound the same to you), you let go of the wheel only to realize that AP didn't engage.


Ever do something so many times, and have it work so many times, that the one time you didn't check if it worked is when it failed? Yeah, this is like that.

Think about how many times you'll get into the Model S and your reflexes will kick in: your hand will reach for the pushbutton/key-starter thing, or reach for the gear selector in the wrong place. What's so hard about reaching for the right place? It's not hard at all, but when it's second nature, you don't think about it and bam, no shifter there! (Or in this case, bam, AP didn't engage!)
 
i simply don't understand: what is so hard/user-unfriendly about checking the blue steering wheel icon to see whether autopilot is active or not?

Let me reply with a brain teaser - why does everyone turn the radio down when they are trying to concentrate - i.e. find a new place, look for parking, etc?

The answer leads to a lot of user interface design, and how Tesla has done it wrong. The bong and a corresponding 15 pixels of blue somewhere on the dash aren't good enough. If you're making a correction to avoid crashing into something, no human brain is going to pay attention to the bong. Again, user interface design is about how humans interact with the machine.

There are many other bad design flaws here too, for example the park distance control chimes are completely useless. Let's take a manufacturer that had a good design, like BMW: you get shorter-interval beeps until you're at terminal distance, where there is a solid tone. The tone communicates both distance and rate of approach without having to LOOK at the dash, as the user is already looking in 3 different mirrors, none of which are on the dash.


Basically, if I had access to the source code, there are about 100 different things I'd change to make the car not just easier, but better to use.
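For the chime logic, something like this is what I mean, as a rough Python sketch (all thresholds invented, just to illustrate the idea):

    # BMW-style park chime: beep interval shrinks as you close in,
    # going to a solid tone at terminal distance.
    def beep_interval(distance_cm, closing_speed_cm_s):
        """Seconds between beeps; 0.0 means a continuous tone."""
        TERMINAL_CM = 30.0                        # invented solid-tone threshold
        if distance_cm <= TERMINAL_CM:
            return 0.0                            # solid tone: stop now
        if closing_speed_cm_s <= 0:
            return 1.0                            # not closing: slow, relaxed beeps
        ttc = distance_cm / closing_speed_cm_s    # time-to-contact, in seconds
        return max(0.1, min(1.0, ttc / 5.0))      # faster beeps as contact nears

Driving the interval off time-to-contact rather than raw distance is what makes the tone communicate rate of approach as well as distance.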
 
The bong and a corresponding 15 pixels of blue somewhere on the dash aren't good enough.

I disagree completely. Everyone is always looking to blame someone or something else other than themselves. At some point, somewhere, personal responsibility has to matter. There is a clear icon that changes color, it's easily visible with an audible chime to go with engaging and disengaging. No matter what Tesla does here, there is always going to be someone who refuses to take responsibility for their actions.

AWDtsla said:
There are many other bad design flaws here too, for example the park distance control chimes are completely useless.

Tesla's sensors are seeing in multiple depths to identify "3D objects"; I don't want or need different chimes to tell me what I can clearly see from the sensor readout. If you went with the faster-beep option, it'd have to latch on to the closest point to base the beeps off of, which would be counterproductive, from my perspective, just based on how the sensors read things. I don't know how you're parking, but I rely on the backup camera and sensor display; I'm not constantly looking at my mirrors.

Jeff
 
Warning: I got into a collision this morning while using Autopilot!

I took delivery of my new P90D last weekend (trading in my 2013 Model S 85) and downloaded Firmware 7.1 two days ago. I had used Autopilot for a few days with Firmware 7.0 and found it wonderful (astounding). Today was the first day I tried it using 7.1.

At about 8:30 this morning on I-90 (road conditions perfect, visibility good), I was doing about 60 MPH and switched on Autopilot. I initiated a lane change with my turn signal and the car switched lanes seamlessly. My car automatically modulated my speed (with a two-car following distance) to the car in front of me, and I was cruising along happily when the car in front of me changed lanes and my car caught up to the car in front of him. After following this new car for a few minutes, the traffic began to slow.

My car slowed as well. But when the car in front of me came to a complete stop (not a sudden emergency stop, but rather a gradual stop), I expected my car to do the same (as it had been doing previously). It didn't. I slammed on the brakes in that dreadful instant when I realized my car wouldn't stop in time, but I still hit the car in front of me (while going maybe 5-10 MPH). I'd like to mention that I consider myself a very safe driver and have never been involved in any accident before (I'm 52). I damaged that car's rear bumper and cracked the plastic cover on my new Tesla (see attached photo).

After the police came (the other driver insisted we file an accident report) and I received a $120 ticket, I called Tesla's technical assistance. The gentleman I spoke with told me that the Autopilot function is "flawless" and that it was my responsibility, as the driver, to avoid any collision. I asked to speak with his supervisor who told me that this was the first time anything like this has ever happened, and found it "very strange." They were clearly intimating that I did something wrong.

The guy at Tesla said they will remotely pull the logs from my car to see what happened. I told him that I understand it's my financial responsibility to pay for the repair to both cars, but that my main issue is to find out what happened, and if there is a flaw in the system, to fix it. I hope they will be transparent and not try to cover anything up.
 

Attachments

  • REAR END.jpg (65.7 KB)
I disagree completely. Everyone is always looking to blame someone or something else other than themselves. At some point, somewhere, personal responsibility has to matter. There is a clear icon that changes color, it's easily visible with an audible chime to go with engaging and disengaging. No matter what Tesla does here, there is always going to be someone who refuses to take responsibility for their actions.
You're not getting it. THIS IS NOT ABOUT BLAME. It is about proper user interface design.

Tesla's sensors are seeing in multiple depths to identify "3D objects"; I don't want or need different chimes to tell me what I can clearly see from the sensor readout. If you went with the faster-beep option, it'd have to latch on to the closest point to base the beeps off of, which would be counterproductive, from my perspective, just based on how the sensors read things. I don't know how you're parking, but I rely on the backup camera and sensor display; I'm not constantly looking at my mirrors.

Jeff

They are NOT 3D. They simply detect distance from objects. The audio should indicate distance without looking at anything, and a more advanced version would map the location of each sensor to a speaker in the audio system so that people who still have two functional ears could fully utilize this information.
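As a rough sketch of the speaker-mapping part (Python, sensor names and pan values invented):

    # Map each ultrasonic sensor's position on the car to a stereo pan,
    # so the beep comes from the side that's actually close to something.
    SENSOR_PAN = {
        "front_left": -1.0, "front_center": 0.0, "front_right": 1.0,
        "rear_left":  -1.0, "rear_center":  0.0, "rear_right":  1.0,
    }

    def beep_pan(sensor_name):
        """-1.0 = left speakers only, +1.0 = right speakers only."""
        return SENSOR_PAN[sensor_name]

Front vs. rear could likewise be split across the car's front and rear speakers.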
 
I'm sorry to hear that you were unable to stop your car in time and were involved in a collision.

After the police came (the other driver insisted we file an accident report) and I received a $120 ticket, I called Tesla's technical assistance. The gentleman I spoke with told me that the Autopilot function is "flawless" and that it was my responsibility, as the driver, to avoid any collision. I asked to speak with his supervisor who told me that this was the first time anything like this has ever happened, and found it "very strange." They were clearly intimating that I did something wrong.
His answer was half right. Autopilot is not "flawless", however it IS your responsibility. All actions of the car at this stage are your responsibility, and it is your responsibility to take over if the car is not doing what you need it to do. The only possible way for the car to be at fault in any collision in its current form is if it fails to allow the driver to take over, and I have never heard of such a case with a Model S.

As many people have said before, and Tesla has made abundantly clear, Autopilot != Autonomous.
 
I took delivery of my new P90D last weekend (trading in my 2013 Model S 85) and downloaded Firmware 7.1 two days ago. I had used Autopilot for a few days with Firmware 7.0 and found it wonderful (astounding). Today was the first day I tried it using 7.1.

At about 8:30 this morning on I-90 (road conditions perfect, visibility good), I was doing about 60 MPH and switched on Autopilot. I initiated a lane change with my turn signal and the car switched lanes seamlessly. My car automatically modulated my speed (with a two-car following distance) to the car in front of me, and I was cruising along happily when the car in front of me changed lanes and my car caught up to the car in front of him. After following this new car for a few minutes, the traffic began to slow.

My car slowed as well. But when the car in front of me came to a complete stop (not a sudden emergency stop, but rather a gradual stop), I expected my car to do the same (as it had been doing previously). It didn't. I slammed on the brakes in that dreadful instant when I realized my car wouldn't stop in time, but I still hit the car in front of me (while going maybe 5-10 MPH). I'd like to mention that I consider myself a very safe driver and have never been involved in any accident before (I'm 52). I damaged that car's rear bumper and cracked the plastic cover on my new Tesla (see attached photo).

After the police came (the other driver insisted we file an accident report) and I received a $120 ticket, I called Tesla's technical assistance. The gentleman I spoke with told me that the Autopilot function is "flawless" and that it was my responsibility, as the driver, to avoid any collision. I asked to speak with his supervisor who told me that this was the first time anything like this has ever happened, and found it "very strange." They were clearly intimating that I did something wrong.

That sucks, sorry to hear.

I'm not trying to victim blame, but I have a question: look at my bolded text above, how far behind the car were you when the car in front of you came to a complete stop? (i.e. was he stopped while you were a mile away doing 65 mph, and you then realized TACC couldn't see him, or were you following him and your car just didn't slow down?)

- - - Updated - - -

I'm also surprised AEB didn't kick in.
 
He didn't say accident.

As for sources: count me as one.

Yes, he did say "accident".

Listen, I agree that the UI should be the best possible, and if they can make improvements, they should.
But the driver has responsibility over the car. They have to pay attention to the big blue lines on the screen.
As instructed in the manual, they should be ready to take control of the car at any moment.

People are using AP on roads it is not yet designed for, or in ways it isn't designed for (such as filming themselves while not sitting in the driver's seat).

I also believe Tesla should not be calling it Autopilot, as it isn't one yet.
It is driver assist. The best driver assist I have seen, but it isn't a fully autonomous pilot yet.
 
My car slowed as well. But when the car in front of me came to a complete stop (not a sudden emergency stop, but rather a gradual stop), I expected my car to do the same (as it had been doing previously). It didn't. I slammed on the brakes in that dreadful instant when I realized my car wouldn't stop in time, but I still hit the car in front of me (while going maybe 5-10 MPH). I'd like to mention that I consider myself a very safe driver and have never been involved in any accident before (I'm 52). I damaged that car's rear bumper and cracked the plastic cover on my new Tesla (see attached photo).

After the police came (the other driver insisted we file an accident report) and I received a $120 ticket, I called Tesla's technical assistance. The gentleman I spoke with told me that the Autopilot function is "flawless" and that it was my responsibility, as the driver, to avoid any collision. I asked to speak with his supervisor who told me that this was the first time anything like this has ever happened, and found it "very strange." They were clearly intimating that I did something wrong.

While I couldn't possibly speak to your specific circumstances as I wasn't there, I can tell you that your scenario happens to me all the time on CA freeways, and I have yet to have the car not stop or give me the impression it wasn't going to stop. I have had a time or two where the car in front of me changed lanes to avoid rapidly slowing traffic, which caused my forward collision warnings to go off the moment the car picked up the "new" car and its much reduced speed. The few times this has happened I've instinctively braked very hard and all was well; it's not that I didn't trust TACC, but old habits die hard. I have no idea whether or not the car would have stopped in time if I hadn't hammered the brakes...

The logs would tell Tesla exactly what did or didn't happen there. The first thing that popped into my head was that you instinctively hit the brake, not realizing that it disengaged TACC/AP, then removed your foot assuming the car would stop, not realizing that TACC/AP had been disengaged. You should have at least had the car beep something furious at you in the moments leading up to the collision.

Jeff

- - - Updated - - -

You're not getting it. THIS IS NOT ABOUT BLAME. It is about proper user interface design.



They are NOT 3D. They simply detect distance from objects. The audio should indicate distance without looking at anything, and a more advanced version would map the location of each sensor to a speaker in the audio system so that people who still have two functional ears could fully utilize this information.

Well I don't have two fully functional ears so that would be a problem for me... :) I didn't mean 3D as in "3D", I was struggling with how to describe how it sees things in a manner that's different from other cars that I've owned with ultrasonics. My bad...

Regarding the interface design part, I think it's perfectly fine and was pointing out that no matter what you do to it, personal responsibility still matters. That was all.

Jeff
 
Well I don't have two fully functional ears so that would be a problem for me... :) I didn't mean 3D as in "3D", I was struggling with how to describe how it sees things in a manner that's different from other cars that I've owned with ultrasonics. My bad...
It's not more complicated than my last car, except you have both front and rear sensors. Audio should indicate both rate of closing and the minimum distance reported at _any_ sensor, period. The more advanced stuff is for another day.
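A minimal sketch of that rule (rough Python, readings invented):

    # Take the minimum distance across ALL sensors, front or rear, and
    # derive the closing rate from two consecutive samples of that sensor.
    def chime_input(readings_now, readings_prev, dt_s):
        """readings_* map sensor name -> distance in cm; dt_s is the sample period."""
        name, dist = min(readings_now.items(), key=lambda kv: kv[1])
        closing = (readings_prev[name] - dist) / dt_s   # cm/s, positive = approaching
        return dist, closing    # these two numbers drive the beep rate and tone

    # e.g. chime_input({"front_left": 80, "rear_center": 45},
    #                  {"front_left": 80, "rear_center": 60}, dt_s=0.5)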

Regarding the interface design part, I think it's perfectly fine and was pointing out that no matter what you do to it, personal responsibility still matters. That was all.

Jeff

Yes, yes, yes, yes, it's been covered many times. We know that's the legal strategy, that's fine, I have no problem with that or interest in arguing that at all.

Now, fix the stupid UI problem, Tesla...

If I were designing it, autopilot becomes modal: when you request it on, it tries to steer the vehicle. If you must make a correction it gives up and lets you, but if it detects you removing your hand from the wheel (yeah, don't bring up the jargon in the manual here, not a productive argument) then it automatically re-engages with the appropriate bong sound.

edit - this would actually work even if you left your hand on the wheel: autopilot would try to steer to where it wants the car in the lane, and slight torque applied by the driver would disable it again.
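As a toy state machine, roughly what I'm describing (Python; this is my proposal, not how the firmware actually behaves):

    ENGAGED, OVERRIDDEN = "engaged", "overridden"

    def next_state(state, driver_torque, hands_on_wheel):
        # A steering correction by the driver always wins immediately.
        if state == ENGAGED and driver_torque:
            return OVERRIDDEN
        # Once the driver lets go of the wheel, re-engage (with the bong).
        if state == OVERRIDDEN and not hands_on_wheel:
            return ENGAGED
        return state

Per the edit above, swap the second condition to "no torque applied" and it re-engages even with a hand resting on the wheel.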
 
I want to mention a few things about my collision this morning:

I love AP so much, I'm almost hoping that I did something wrong! As I mentioned in my post, when my car picked up the scent of the second car, it caught up to it, and followed it normally for a while. There was nothing out of the ordinary with the way the car in front of me stopped. And my car had been performing the same stops flawlessly. Why my car didn't stop is a complete mystery to me. While I take responsibility for what happened, I can't think of anything I could have done (besides not having used AP) to avoid the accident. My slamming on the brakes avoided a worse collision. The entire episode was HIGHLY unpleasant and I can't imagine how I will trust AP again. It's like being jilted by a lover!
 
It's second nature. You assume that AP is available, you double pull the cruise stalk, you hear bing bong (which means it failed, instead of bong bong, which means it succeeded, but you're thinking about something else so they sound the same to you), you let go of the wheel only to realize that AP didn't engage.


Ever do something so many times, and have it work so many times, that the one time you didn't check if it worked is when it failed? Yeah, this is like that.

Think about how many times you'll get into the Model S and your reflexes will kick in: your hand will reach for the pushbutton/key-starter thing, or reach for the gear selector in the wrong place. What's so hard about reaching for the right place? It's not hard at all, but when it's second nature, you don't think about it and bam, no shifter there! (Or in this case, bam, AP didn't engage!)

but these are all well-known human error modes (lapses -> memory/omission/repetition, and attention slips combined with confirmation bias), which of course can be alleviated with a good user interface, alerts about the state of automation AND human training, systems knowledge (e.g. under which conditions the AP disengages) and acceptance of responsibility.

for example simply trusting that the pull of the lever and a bing bong results in AP activation is not enough, no matter how good "the UI". as long as the human is boss, verification is required, and perfectly easy by checking the icon, which is always at the same location, central on the IC and enhanced with color.

i am curious about what changes to the UI you would propose that would make the verification easier?

overriding the AP temporarily with the AP remaining engaged sounds interesting. even then, verification is required.

for me a good strategy is always having the same scanning pattern in a directional flow that scans the outside (80%), the IC (speed, TACC/AP state, energy, ...), in repeating sweeps, thus always aware of the state of the vehicle.
 
but these are all well-known human error modes (lapses -> memory/omission/repetition, and attention slips combined with confirmation bias), which of course can be alleviated with a good user interface, alerts about the state of automation AND human training, systems knowledge (e.g. under which conditions the AP disengages) and acceptance of responsibility.

for example simply trusting that the pull of the lever and a bing bong results in AP activation is not enough, no matter how good "the UI". as long as the human is boss, verification is required, and perfectly easy by checking the icon, which is always at the same location, central on the IC and enhanced with color.

i am curious about what changes to the UI you would propose that would make the verification easier?

overriding the AP temporarily with the AP remaining engaged sounds interesting. even then, verification is required.

for me a good strategy is always having the same scanning pattern in a directional flow that scans the outside (80%), the IC (speed, TACC/AP state, energy, ...), in repeating sweeps, thus always aware of the state of the vehicle.

I'm not arguing any of that, nor do I have a solution that would make this work better (I think it's good as it is; it's very rare for someone to think AP turned on when in fact it didn't). You just said you don't understand what's so hard about telling whether the blue steering wheel icon is on or not. I explained why some people are having problems with it.
 
From January 11th,
Elon Musk says Tesla's autopilot is already better than human drivers - The Washington Post

"Musk said, he was not aware of any accidents caused by autopilot. He said the closest scenario were accidents where drivers mistakenly believed they were in autopilot mode."

I wonder if this was one of those times.

I started a thread on this (there may be others, I didn't see them): http://www.teslamotorsclub.com/show...Autopilot-does-not-enable-indication-adequate

I wish it would give a much more clear indication if you double-pull the stalk but it doesn't go into AP mode. Wouldn't be surprised if there were accidents because of this.

- - - Updated - - -

i am curious about what changes to the UI you would propose that would make the verification easier?

I can think of many. How about a huge red "X" and a screaming alarm? How about the whole display blinks in and out of visibility with a message "AP DID NOT ENGAGE", anything BUT what we have now. What we have now is hard to discern as different from when AP is successfully engaged. Many times I've thought AP was on and drifted out of my lane.
 
When I've had a failure to go into AP, a pull-up window has appeared that says something like "Autopilot Not Available at This Time", or words to that effect, in addition to the bong.

However, I do agree there should be a more visible indication when the car is in Autopilot. The wheel icon turning blue is a bit subtle for something that important.
 
Sandstruck, I appreciate you sharing this incident! I hope you will share more as you get information from Tesla; it is concerning that an experienced user had this happen! I understand the feeling of betrayal toward Autopilot!