Welcome to Tesla Motors Club

Autopilot: Crashed at 40mph

Tesla AP works quite well on the freeway. Have you used it? If not, please stop telling me I'm wrong without any experience to back up your statements.
No, I haven't used it, but what part of my statement is not true? It is very far from driving the car completely and has a huge number of limitations (not least of which is the subject of this thread: it still needs your undivided attention). This is unlikely to change until 2.0 at least.
 
Humans are never going to treat a system that behaves differently the same way. Autopilot takes over more control and requires less attention; this will affect humans and change how they interact with the system.

It's like telling parents to pay as much attention to their 10 year old kid as to their 1 year old. It's just not going to happen, because the parents' brains are already set on: 1 year old = trying to kill himself 24/7, 10 year old = knows he should not walk into the street.
It is more like telling parents to pay as much attention to their 2 year old kid as to their 1 year old kid.

AP today only represents a slight step ahead of other autosteering systems (according to the naysayers here, it is worse in other aspects). Perhaps the proper analogy is that some people think their 2 year old kid is a 10 year old kid (mismatched expectation). I use "some" because apparently the vast majority of people are able to use AP responsibly just fine, while there are a few who complain (which reads to me primarily as trying to skirt personal responsibility).
 
It seems likely, in the incident this thread is about, that the driver inadvertently disabled the autopilot before the collision and the EBA did what it was supposed to do: it dropped the speed while effectively warning the driver of the crash. If the autopilot failed, the logs should show that. I have inadvertently disabled the autopilot myself, but the double bong and the dashboard indicators greying out warned me at the time. You can reasonably argue that the indicators are too small and the bongs too quiet at speed, which makes the indication not obvious enough, but for me it works OK.

In the first month I had my car the autopilot was a lot more flaky than it is now. I don't use it much on single carriageway roads, but on highways it is utterly invaluable in my book and I use it all the time. I find I'm both paying much more attention to what's going on around me and more relaxed than without it, because the car is doing the bulk of the high-workload computing to keep the car in lane and at the correct speed. I have done around 8k miles on autopilot in 6 months and not once has it failed to pull up automatically in these conditions. I do monitor the conditions (weather, lane markings, idiot drivers, etc.) and am ready to take over at all times, particularly in those conditions where I think the system may have an issue. I am amazed at how often it copes perfectly.

I drive with one hand on the wheel the majority of the time. I don't know about the US, but in the UK the system will nag if you don't, at an interval that seems to vary with speed. It starts with a gentle bong and a message in the dash, building up to a bong plus muting the sound, and then finally a loud warning with muted sound and graphics asking you to take over, which you'd have to be asleep to ignore.
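The escalation described above can be sketched as a small state function. To be clear, the stages and the speed-dependent interval below are illustrative assumptions for the sake of the sketch, not Tesla's actual firmware behavior, whose timings are not public:

```python
def nag_stage(seconds_hands_off: float, speed_mph: float) -> str:
    """Return the warning stage after driving hands-off for a while.

    The interval model (shrinking as speed rises, floored at 10 s) and
    the stage names are hypothetical, chosen only to mirror the
    escalation the post describes.
    """
    interval = max(10.0, 60.0 - 0.5 * speed_mph)
    if seconds_hands_off < interval:
        return "none"
    elif seconds_hands_off < 2 * interval:
        return "gentle chime + dash message"
    elif seconds_hands_off < 3 * interval:
        return "chime + audio muted"
    else:
        return "loud warning + muted audio + take-over graphic"

# At 70 mph the assumed interval is 25 s, so the nags arrive sooner
# than they would at, say, 30 mph (45 s interval).
print(nag_stage(100, 70))
```

The point of the model is just that the same hands-off duration maps to a more urgent stage at higher speed, which matches the poster's observation that the nag interval seems to vary with speed.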

IMHO this system dramatically reduces the likelihood of a crash, assuming the driver is paying attention and is ready to take over if necessary. It still reduces driver stress even with this precondition, in my experience, and is a massive boon to the longer-distance driver. It is, after all, a Level 3 system, not Level 4 like the Google cars.
 
No, I haven't used it, but what part of my statement is not true? It is very far from driving the car completely and has a huge number of limitations (not least of which is the subject of this thread: it still needs your undivided attention). This is unlikely to change until 2.0 at least.

What is it about people expressing opinions about things they know little about? Do you think you are THAT smart?

The limitations you mentioned occur when using AP on city streets, and indeed you cannot use AP on city streets without intervening constantly. AP is not supposed to be used on city streets anyway, so it is kind of pointless to talk about it.

On the freeway, the lane keeping and traffic-aware speed keeping work really well. So much so that you can do long-distance trips with all your freeway driving on AP. You only need to take over when entering the freeway, doing the initial merges, and when exiting. If you drive at a reasonable speed (75 mph tops), the system can handle the gentle freeway curves. It handles cars cutting in and out, and it handles stop-and-go traffic (which is really nice, BTW; it takes all the stress out of stop-and-go freeway traffic).

So, once again, the system is good enough in its intended use case that the driver starts to trust it more and more, to the point where checking that text or fumbling for an item in the back seat seems quite safe. Unusual freeway scenarios are by definition not very common, but they can confound AP and result in an accident if the driver isn't paying attention in those few seconds.

Don't get me wrong. When AP is used as intended, with the driver always looking forward (and BTW, the user interface of AP actually DISCOURAGES this way of driving, because the initial hold-the-steering-wheel nag is SILENT and consists of small lettering on the IC display, requiring the driver to frequently glance at it), it is great: it takes significantly less brain power, to the point that you arrive noticeably less tired than when driving manually.

It's just that the system is both too good, sucking you into trusting it more than it should be trusted, and not good enough to actually be trusted.

I think that is why other automakers are waiting to roll out their AP. They don't want to fall into the same danger zone Tesla is in now.
 
... What really matters is data, and Tesla has already announced that analysis of their data shows that drivers using autopilot are half as likely to have an accident as those not using autopilot.

Which I totally do not believe. Elon did say this (AFAIK it isn't an official Tesla statement), but it all depends on how you slice and dice the statistics.

Freeways are almost twice as safe as non-freeway roads. The vast majority of people only use AP on freeways. So the question becomes whether Elon's statistic compared freeway driving to freeway driving, or all-road driving to AP driving. I strongly suspect the latter, since the former would be hard to do. A simple comparison of AP versus non-AP would then show something like a 50% decrease in accidents just because freeways are safer, not because of AP.
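The confounding argument above is easy to demonstrate with made-up numbers. The rates and mileage shares below are purely hypothetical, not Tesla data; they are chosen only so that freeways are twice as safe as other roads, as the post assumes:

```python
# Hypothetical accident rates, per million miles (illustrative only).
FREEWAY_RATE = 0.5
OTHER_RATE = 1.0   # non-freeway roads: twice the freeway rate

# Assume AP is only ever used on freeways and has NO safety effect at all.
ap_rate = FREEWAY_RATE

# Non-AP miles are a mix of freeway and non-freeway driving.
# The 40% freeway share is an arbitrary assumption.
freeway_share = 0.4
non_ap_rate = freeway_share * FREEWAY_RATE + (1 - freeway_share) * OTHER_RATE

# A naive "AP vs non-AP" comparison still shows a big reduction,
# even though AP contributed nothing in this model.
reduction = 1 - ap_rate / non_ap_rate  # 0.375 with these numbers
print(f"apparent reduction with zero real AP effect: {reduction:.1%}")
```

As the freeway share of manual miles shrinks toward zero, the apparent reduction approaches the full 50%, which is exactly the road-mix effect the post is worried about: the comparison only means something if it is freeway-to-freeway.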

Then you have to factor in the accidents that happened while not using AP where the user thought AP was on.

And then you have to truly analyze the data to ferret out the accidents that occurred when AP had been on just a few seconds before the accident. Those might be counted as non-AP accidents, but in actual fact reliance on AP, just before the car turned it off, was the root cause.

So, yeah, show me a real analysis and then I'll be convinced.
 
Just wanted to ask you, as a non-AP driver: you're 100% OK with other cars on the road using experimental features that completely control the car (accel, brake, and steering), with your life at risk if it malfunctions?

I'm perfectly OK being on the freeway in a non-AP car with other people using AP, because human beings make mistakes as well. Until you have statistical data showing that people driving on AP are more likely to cause accidents than regular human drivers, I'm not sure why you're bothered so much.

You're the most tolerant person of risk to innocents that I've ever seen.

I never, ever thought Tesla would release AP in a state where it could cause an accident if the driver was not 100% attentive and ready to "save the day" instantaneously, making perfect split-second decisions at any moment required. I really think Tesla may go out of business from the lawsuit and related publicity if anybody in a non-Tesla auto dies due to an AP failure (or the Tesla driver, depending on your view).

Are you a trial lawyer with experience bringing large lawsuits against automakers?

AP today only represents a slight step ahead of other autosteering systems

According to the one comparison test which actually exists (Car and Driver February 2016), Autopilot is not just slightly ahead of other systems - it causes errors at half the rate of the next best system when dealing with poor roads. And that was on an autopilot whose firmware was multiple generations older than what we have today.

What is it about people expressing opinions about things they know little about? ...the system is both too good, sucking you into trusting it more than it should be trusted, and not good enough to actually be trusted...I think that is why other automakers are waiting to roll out their AP. They don't want to fall into the same danger zone Tesla is in now.

I dunno - you keep saying Tesla is in a danger zone and yet you have no data to back up your hypothesizing. Data that would be useful would show that, mile for mile, humans as a group are more likely to have an accident while using AP than while not using AP. Yes, of course it's possible that someone looks away just when AP encounters some corner case it can't handle. It's also possible that someone looks away while steering manually, doesn't pay attention, and has an accident.
 
What is it about people expressing opinions about things they know little about? Do you think you are THAT smart?

The limitations you mentioned occur when using AP on city streets, and indeed you cannot use AP on city streets without intervening constantly. AP is not supposed to be used on city streets anyway, so it is kind of pointless to talk about it.

On the freeway, the lane keeping and traffic-aware speed keeping work really well. So much so that you can do long-distance trips with all your freeway driving on AP. You only need to take over when entering the freeway, doing the initial merges, and when exiting. If you drive at a reasonable speed (75 mph tops), the system can handle the gentle freeway curves. It handles cars cutting in and out, and it handles stop-and-go traffic (which is really nice, BTW; it takes all the stress out of stop-and-go freeway traffic).

So, once again, the system is good enough in its intended use case that the driver starts to trust it more and more, to the point where checking that text or fumbling for an item in the back seat seems quite safe. Unusual freeway scenarios are by definition not very common, but they can confound AP and result in an accident if the driver isn't paying attention in those few seconds.

Don't get me wrong. When AP is used as intended, with the driver always looking forward (and BTW, the user interface of AP actually DISCOURAGES this way of driving, because the initial hold-the-steering-wheel nag is SILENT and consists of small lettering on the IC display, requiring the driver to frequently glance at it), it is great: it takes significantly less brain power, to the point that you arrive noticeably less tired than when driving manually.

It's just that the system is both too good, sucking you into trusting it more than it should be trusted, and not good enough to actually be trusted.
I've read enough about autopilot to know all of that. My point is that it is not self-driving, because it doesn't even accomplish the "on-ramp to off-ramp" goal that Elon was aiming for. It can't change lanes to reach its destination (for example at freeway junctions), and if curves are sharp enough it will deactivate. So far it serves exactly the same function as other systems: ACC and autosteering/lane keeping. It does a very good job of it, but that doesn't mean it's really doing self-driving.

I think that is why other automakers are waiting to roll out their AP. They don't want to fall into the same danger zone Tesla is in now.
None of the automakers are waiting to roll out their AP. They already had theirs even before Tesla did. The only difference is that theirs have a nag timer or their autosteering is not as smooth. As pointed out already, outside the US, AP also has a similar nag timer. Are you advocating that Tesla add the same to the US version?
 
According to the one comparison test which actually exists (Car and Driver February 2016), Autopilot is not just slightly ahead of other systems - it causes errors at half the rate of the next best system when dealing with poor roads. And that was on an autopilot whose firmware was multiple generations older than what we have today.
Yes, the error rate is much lower, but my "slight" comment refers more to additional features beyond just lane keeping. It does have the turn-signal lane-change feature that I forgot to mention, but essentially it remains an ACC/lane-keeping system like the rest. And it is a system that requires the driver's full attention (there are systems being developed that don't).
 
Another overestimation of Tesla's accident avoidance system, as a 5-day-old Model X crashed into a building while attempting to park:

The owner stated:

"...plus it should have sensed the building and stopped.."

I agree that in a better accident avoidance system, manufacturers should program it to avoid such minor parking lot accidents, but Tesla's system is not there just yet!
 
But that won't make it any easier on Tesla's PR machine. The car SHOULD have been able to stop itself and prevent the crash. Just a shame that it's apparently not capable of that.

Sorry it's been a few days, but to add to your point: Tesla's fine print, exclusions, and "beta" status are not going to get very far with a jury when confronted with a case involving bodily injury or even death at the hands of Autopilot. Tesla is advertising the feature as if it's a done deal; putting a beta label on something doesn't absolve you of responsibility and liability.
 
Please pass this feedback along to Tesla directly.

Also, try the easter egg. It might help as a workaround for now.

I did 4 months ago. I did get a quick reply that they would consider a change. But no change noticed.

To the person who asked about traffic lights: they are positional, so easy to tell except when it's very dark. In that case, I either go slow or follow. I can tell the color difference when close or when it changes.
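The positional reading described above amounts to a tiny lookup: on a standard vertical signal the red lamp is at the top, yellow in the middle, and green at the bottom, so the lit position alone tells you the state even when the colors are hard to distinguish. A minimal sketch of that convention:

```python
# Standard vertical signal layout, top to bottom (US/Vienna Convention
# ordering); index 0 is the top lamp.
VERTICAL_SIGNAL = ("red", "yellow", "green")

def read_signal(lit_position: int) -> str:
    """Return the signal state from which lamp is lit (0 = top)."""
    return VERTICAL_SIGNAL[lit_position]
```

This also explains the poster's darkness caveat: at night the signal housing can be invisible, so there is no reference frame to judge which position is lit, and the trick breaks down.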
 
Other companies such as Mercedes, Volvo, Google... will accept liability if you buy their system.

Volvo, Mercedes And Google Accept Liability For Autonomous Cars

The downside is you might have to wait for years for them to perfect the system.

So it is nice that Tesla has released its beta system to the public. The problem is the public may not realize that it is still an imperfect system that does not work in certain situations, so you use it at your own risk.

That said, whenever something goes wrong, it is valuable to communicate it so beta drivers can be educated and try to avoid that scenario in future.

The article says they will accept liability should an accident occur due to the technology. Duh. In other words, they will obey the law? I imagine Tesla would also be liable if it is their car's fault. The article title is misleading.

The problem appears to be dishonest owners who won't accept responsibility, and are falsely claiming it was the car's fault. Nothing new about this, just now there is proof (logs) they aren't telling the truth. Shame on these drivers who hurt us all.
 
The article says they will accept liability should an accident occur due to the technology.

I don't see anything wrong with the statement.

It means:

If I consent to use the technology, it will take over my responsibilities and liabilities.

On the other hand, if I refuse to use the technology such as sabotaging it by blindfolding the system with mud, thick ice/snow... then it is no longer the technology's responsibilities and liabilities, it's now mine.
 
I don't see anything wrong with the statement.

It means:

If I consent to use the technology, it will take over my responsibilities and liabilities.

On the other hand, if I refuse to use the technology such as sabotaging it by blindfolding the system with mud, thick ice/snow... then it is no longer the technology's responsibilities and liabilities, it's now mine.
I think his point was that it is not a blanket assumption of liability as soon as the system is active, as many have assumed; it still depends on the circumstances (same as it is today in a legal sense: the automaker is liable if the accident was caused by a failure of the technology).

Probably we would need to see the fine print on what is actually covered when the actual cars are out. A few PR statements don't really give enough detail.
 
...The problem appears to be dishonest owners who won't accept responsibility...

I think an advantage of technology is to decriminalize human behaviors.

In the past, an elevator operator had to skillfully and manually control the speed and location. An argument might break out if it landed between floors or landed on a wrong floor and someone might have to be dishonest and lie ("Operator, What are you doing? Why are you stopping here? I said I wanted thirty, not thirteen....")

Google and others are working hard so that someday you will be able to read e-mails, tweet, be pleasantly distracted, and do lots of things that are criminalized right now, and they hope to make those the norm with technology.
 
This was caused by the #1 problem of autopilot: not knowing whether or not it is engaged. Twice now I've been cruising on the freeway thinking AP was on when it wasn't. There are many events that cause AP to turn off once you turn it on.

The only indication Tesla gives you that AP has turned off is a relatively quiet chime (and a very small icon that turns from blue to grey). This isn't enough of a warning that AP is no longer active. Tesla needs to address this now.

Yeah, I'm thinking that not just some little icon, but the entire driver's instrument cluster ought to get a colored tint overlay, sort of like what we see in sci-fi shows when the shields are up. Maybe blue or green. And why not? Semi-autonomous driving is a big deal. Might as well show it off a bit more with a snazzier indication that it's on.

-- Ardie
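The whole-cluster tint suggested above boils down to mapping an engagement state to an unmissable display color. The states, colors, and opacities below are hypothetical choices for the sketch, not anything Tesla has implemented:

```python
from enum import Enum

class ApState(Enum):
    ENGAGED = "engaged"
    DISENGAGING = "disengaging"  # e.g. driver torque or brake detected
    OFF = "off"

def cluster_tint(state: ApState) -> str:
    """Return an RGBA overlay for the instrument cluster (hypothetical)."""
    return {
        ApState.ENGAGED: "rgba(0, 120, 255, 0.25)",      # blue wash: AP active
        ApState.DISENGAGING: "rgba(255, 160, 0, 0.35)",  # amber: handing back
        ApState.OFF: "rgba(0, 0, 0, 0)",                 # no tint: manual
    }[state]
```

The design point is that the OFF state renders as fully transparent, so the absence of the tint, visible in peripheral vision, is itself the "AP is no longer active" warning that the quiet chime and small icon fail to provide.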
 
I think it's wonderful how Tesla is using its owners as beta testers for critical safety features. Wait until one of these cases ends up in court, that fine print of Tesla's isn't going to get them very far.
People keep saying this, but the exact same fine print used by other manufacturers seems to have protected them just fine so far: no one has posted an example of a successful lawsuit against ACC or CC where statements in the manual describing limitations were deemed insufficient to free the automaker from liability. As for the "beta" label, Tesla isn't relying on that, but rather on the statements in the manual (and what you click through to activate the system). Tesla has essentially the same legal risk for the ACC/autosteer system as other automakers.
 