Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Laguna Beach, CA accident, title claims "autopilot" involved.

You need to either accept this crash risk or not use the product.

AEB, FCW, LDW, and corrective blindspot tech are safety features. Autosteering without aggressive FCW (Forward Collision Warning) is going to produce a steady stream of crashes.

People already fail to watch the road in cars with no autosteering. Autosteering just makes it easier/safer to ignore the road for longer. Given how humans are, it's not hard to understand why these crashes occur.

Airbags are a safety feature too. We know what happens when they fail to function as designed.

Perhaps every automaker should put small print on page 297 of the manual referring to them as Beta-bags and suggest that it is the driver's responsibility to avoid any collisions.
 

Ugh... When you enable the functionality you are provided a confirmation of what you are enabling and from there you get to decide whether you want to use it or not... No one is forcing you to enable the feature, no one is forcing you to use it. All you have to do is simply pay attention while driving, that's literally it. Pay attention...

Why is it so confrontational to simply say "pay attention"?

Jeff
 

You know what, none of this matters anyway. These vehicles will be fully autonomous in 2019 and these incidents will fade into history as simply growing pains.
 
Numerous reasons, such as:
- they didn't RTFM or didn't remember all the sections related to AP and its limitations. This is why I suggested a mandatory multi-part audio tutorial for each separate key fob before the driver is even allowed to activate AP. Perhaps there should be periodic refreshers or small tips (e.g. "did you know that AP cannot reliably detect stopped vehicles?" "did you know AP cannot read traffic lights?")
- the software is a moving target (updates all the time and some capabilities get better or worse)
- most of them don't actively participate in Tesla forums like TMC, esp. in AP-related threads, to keep up with the improvements, regressions, limitations, etc.
- it worked fine most of the time for them
- environmental conditions (weather, lighting, road surface, lane markings, etc.) change, which can cause AP to misbehave or behave differently than expected
- they became bored or inattentive due to AP
- due to lack of attention/situational awareness, they are not able to context switch and provide the correct response to avoid the accident in time

Maybe Tesla should REQUIRE a hands-on training session for everyone who buys a car with AP. Maybe Tesla needs a huge lot of land where all the scenarios where AP can and cannot be entirely trusted are laid out, including poor lane markings, merging lanes, speed signs, stop signs, construction cones, overpasses... sort of like a movie set. The reason is most people don't read the owner's manual.

Is it fair to ask this question: if it were a car without AP, would this accident have happened, since the driver would've been paying MORE attention to the road ahead? OTOH, he could've been texting and still hit the police cruiser, with or without AP.
 

Agreed. Make owners have to take a one hour online AP training followed by a quiz to be certified for AP before AP is activated. Easy. Airline pilots have to get a special certification added to their pilot’s license to become “instrument rated” — why not human drivers of semi-autonomous and autonomous cars?

This said, I was interviewed at length for an upcoming Fox News Radio piece, to air on 5/31/18, about Tesla and AutoPilot safety. The reporter drove my car and went into AP, and I taught her how it's safe IF used as directed. Managed to turn around a skeptic! Will post here once it's live at 6am EST.
 
Stupid people doing stupid things and not paying attention. This is why we can't have nice things. They're running this story in Oklahoma every half hour on channel 9.2 in OKC, but not a peep about the Model 3 crash in Washington state where the family of 4 was safe and fine. Nor anything about Tesla pushing the Model 3 brake improvement over the air and getting a recommendation from Consumer Reports. Media bias? Agenda? I'm about to put on a tin foil hat...
 
Again, why? Isn't solid green always a left-turn yield on green?

No.
Some are binary: red arrow = stop, green arrow = go.
Some are trinary: red arrow = stop, green arrow = go, solid green (no red arrow) = yield on green.

The sign is a helpful reminder. I have an arrowless intersection in my city that could really do with such a sign. Have had people fail to yield there plenty of times.
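The binary/trinary distinction above can be sketched as a toy lookup table (the state names here are hypothetical, not any real signal-controller API):

```python
# Toy sketch of the left-turn signal logic described above.
# Binary heads show only arrows; trinary heads add a solid green
# that means "yield to oncoming traffic before turning".

BINARY = {
    "red_arrow": "stop",
    "green_arrow": "protected turn, go",
}

TRINARY = {
    **BINARY,
    "solid_green": "yield to oncoming traffic, then turn",
}

def left_turn_action(head, state):
    """Return the driver action for a signal state; unknown states mean stop."""
    return head.get(state, "unknown state: stop")

print(left_turn_action(TRINARY, "solid_green"))  # yield to oncoming traffic, then turn
print(left_turn_action(BINARY, "solid_green"))   # unknown state: stop
```

The point of the sketch: on a binary head, a solid green simply never appears, so drivers trained only on arrows have no rule for it, which is exactly why the reminder sign helps.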
 
It rained in So. Cal. this morning and I was reminded how poorly the auto-wipers work on AP2 cars. Tesla has not yet solved that relatively simple machine learning problem, so I’ve lowered my expectations for the car to drive itself.
 

Well, the delivery guys can also do a better job. My AP safety briefing was exactly this:

"When you turn on autopilot for the first time, do it on a road that you know in good weather. It will take time for you to build trust in it".


I expected to hear all about parked cars, traffic lights, divided highways etc., but apparently in his mind the primary issue with AutoPilot is that people don't trust it enough.
 
So casualty and property loss are necessary--welcome to the world of free simulations!

Sarcasm aside, I strongly believe that Level 3 or 4 does not make sense. Cars should go straight to Level 5 without steering wheels.

Exactly what Elon will be doing next year, laughing in the face of doubters. I can't wait to see driverless Teslas zipping around.
2019 is almost here, the future is here!!!
 
As always, at the end of the day, this is driver error. The driver is always in control and responsible no matter what driver aids are engaged.
Exactly correct. Even if AP swerves suddenly into oncoming traffic or into a truck next to you, the driver is always responsible.

But cases like this really do point out how poorly designed and maintained some areas of our roads are.
I mean, who puts parking on a narrow exit ramp with a line that leads to the parked cars and, just to make sure you don't miss them, an arrow that points right at them???

BINGO! The real world is very far from perfect. Even official rules of the road leave many ambiguities, as they were meant to be interpreted by humans, not computers. A human doesn't have a problem making the choice between "depart the lane" vs. "hit a parked police car". Recently Elon tweeted:
[Image: Elon's tweet that humans are underrated]


So he realized that even in a much smaller and much more tightly controlled environment such as a factory floor (as compared to the entire live road system of the world), it is extremely difficult if not impossible to replace humans. In a factory he can mark things clearly and set and follow strict rules; it doesn't rain or snow in there, you know ahead of time of any "construction", and of course it's a much smaller area than the world. In the real world, roads are not marked clearly, are sometimes marked ambiguously or not at all, and other times you have to actually drive in the wrong lane because of construction or an accident. In a completely automated factory, robots would deal only with robots; cars will have to deal with humans, animals, weather, etc.

So given all that, isn't true FSD a far greater challenge than factory automation (you know, to achieve what he tweeted a while back - see below)?
[Image: Elon's 2016 tweet about Summon]
 

Hey now, stop making sense. Everyone here is going to bed tonight dreaming that tomorrow, yes tomorrow is the day that FSD is turned on in their vehicle and you will be sooooooo jealous.
 
Exactly correct. Even if AP swerves suddenly into oncoming traffic or into a truck next to you, the driver is always responsible.
I don't completely agree. I do think the current crop of accidents are likely all driver-attentiveness issues, but I also believe there are ways autopilot could fail that would be impossible for even a fully attentive driver to correct. That would be the fault of AP.
 