
Why you should not ignore Autopilot warnings to keep your hands on the wheel

This post is SO misinformed and, frankly, such BS. I could make a post about how easy it is to die in a Bentley, a Honda, or a Chevy by letting the car drive off a cliff with my hands off the wheel.

AutoPilot is NOT full autonomous driving. It is designed to have the driver REMAIN ATTENTIVE and keep their hands on the wheel at ALL times ready to take over in a split second if the car is headed somewhere wrong.

In this case, the (moron) driver was using his hands to hold the camera to make this stupid video rather than paying attention to the road. He ignored multiple visual and audio alerts to put his hands on the wheel and had to drop the camera when a crash nearly happened.

This is an example of what NOT to do. It’s foolish and immature and threatens to hurt the entire move toward (eventual/safe) autonomous driving.

AP is safely used hundreds of thousands of times a day by people who read the manual and follow the warnings. All it takes, however, is for one daredevil like this who wants to score “likes” on YouTube to ruin it for all the rest of us!

STOP DOING THIS! If you’re going to make AutoPilot “show off” videos, do it with a Dash Cam or a dash-mounted GoPro or cell phone, NOT with a handheld camera. You’re inviting trouble!
 
That's pretty much how I imagine the 101 accident happening.

That video is crazy because I would think AP would be programmed to avoid chevrons and not just continue driving.
I think we should all be required to do one of these before our cars will start. You know, to lend the neural net programmers/trainers a hand.

[image attachment]
 
What I see:

Autopilot working as designed by following the most clearly marked white line on the highway. IT IS KNOWN that Autopilot, as of right now, does not read road markings other than lane lines: no stop lines, no median markings, no differentiation between outside lines, dashed lines, solid lines, double-solid lines, or anything like that. It is also known that Autopilot, as of right now, has difficulty detecting stationary obstacles.

This is what happens when you don't pay attention. Obviously, this was done for demonstration purposes (with a terrible clickbait video title), but all the responsibility in this video rests with the driver; the car had steered into the median zone for at least three full seconds before the driver took control.

Is this what happened in the tragic accident in California? Quite possibly, and it only further underscores that Autopilot is still not a fully autonomous system. You're advised not to take your hands off the wheel, and though I know most, if not all, of us do, you still have to maintain awareness of the road around you. If you let Autopilot almost kill you, that is on you.
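
To put the "follows the clearest line" point above in concrete terms, here's a toy sketch I made up. It is not Tesla's code and the numbers are invented, but it shows why a system that just tracks the highest-confidence paint line can end up steering toward a freshly painted gore-point line instead of the faded dashes of the real lane:

```python
# Toy sketch only (nothing to do with Tesla's real software): a lane keeper
# that simply steers toward the most confidently detected paint line has no
# idea what a gore point, chevron, or median marking means.

from dataclasses import dataclass

@dataclass
class DetectedLine:
    name: str                # label used only for this sketch
    confidence: float        # 0..1, how clearly the camera sees the paint
    lateral_offset_m: float  # where the line sits relative to the car

def pick_reference_line(lines: list[DetectedLine]) -> DetectedLine:
    # Follow the clearest paint; no semantic check of what the line means.
    return max(lines, key=lambda line: line.confidence)

# At a gore point, a freshly painted solid line peeling off toward the barrier
# can out-score the faded dashes of the actual travel lane:
candidates = [
    DetectedLine("faded lane dashes", confidence=0.55, lateral_offset_m=-1.7),
    DetectedLine("bright gore-point solid line", confidence=0.90, lateral_offset_m=1.2),
]
print(pick_reference_line(candidates).name)  # -> bright gore-point solid line
```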
 
You're not wrong, of course. But this should probably be one of the next things the AP programmers take on. It's a known limitation of the system, but one that should be possible to overcome with the current hardware, at least in situations like the one above, where the car drives over chevrons, which is never an acceptable lane (rough sketch of the idea below).

I think the reason we're seeing such interest in this accident is that there's a difference between AP failing to prevent an accident (not seeing a fire truck after the vehicle ahead moved out of the way, or hitting a semi) and AP contributing to an accident by steering the driver into a life-threatening situation.
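
To make the chevron idea concrete, here's the rough shape of the check I'm imagining. The paint classes and the function are purely invented for illustration and say nothing about how Tesla's software actually works:

```python
# Entirely hypothetical: assumes a perception output that classifies paint
# types, which is exactly what the post above says Autopilot does NOT do today.

from enum import Enum, auto

class PaintType(Enum):
    LANE_DASH = auto()
    LANE_SOLID = auto()
    CHEVRON = auto()    # hatched gore-area markings
    STOP_LINE = auto()

def lane_is_drivable(paint_under_path: list[PaintType]) -> bool:
    # A candidate lane that requires driving over chevrons is never acceptable.
    return PaintType.CHEVRON not in paint_under_path

assert lane_is_drivable([PaintType.LANE_DASH])
assert not lane_is_drivable([PaintType.LANE_DASH, PaintType.CHEVRON])
```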
 
Attached is a satellite photo of the 101 fatal crash site. In the lower right, just above the "B" in Bayshore, you can see the barrier at the end of the wall where the collision occurred. Just above it is the lane for the "flyover" ramp. Below it is the main highway's left lane. You can see how the pair of solid lines marking a pavement-level "virtual island" could be mistaken for a traffic lane.

[Attachment: Tesla Crash.jpg]
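
Purely as a hypothetical, the geometry itself could probably be sanity-checked. All the thresholds below are made up, but a "lane" that keeps widening past any plausible lane width is a decent hint that the system is tracking a gore area rather than a travel lane:

```python
# Hypothetical width sanity check; every number here is invented.
# At a gore point the two solid lines diverge, so the apparent "lane" between
# them keeps widening, while a real travel lane stays roughly 3.0-3.7 m wide.

def looks_like_gore_area(widths_m: list[float], max_lane_width_m: float = 4.5) -> bool:
    """widths_m: estimated lane width sampled at increasing distances ahead."""
    steadily_widening = all(b >= a for a, b in zip(widths_m, widths_m[1:]))
    return steadily_widening and widths_m[-1] > max_lane_width_m

print(looks_like_gore_area([3.6, 3.7, 3.6, 3.7]))  # normal lane -> False
print(looks_like_gore_area([3.6, 4.2, 5.1, 6.3]))  # diverging gore point -> True
```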
 
Suppose we change the semantics. Suppose we call it Lane Keeping Assist (LKAS) rather than Autopilot (let's set aside the speed-assist aspect for now). With that semantic change it takes on a completely different and more realistic meaning. It also changes the expectation. Tesla is not the only automaker with LKAS, and any car with LKAS will drive into a gore-point barricade just as easily, because we know LKAS is not self-driving. It could happen with any LKAS-equipped manufacturer. So why is Tesla put on the defensive here, and is it natural to assume that Autopilot is self-steering? It is not; it's really just LKAS, a good one. The Nissan Leaf will have ProPilot LKAS. BMW/VW/MB have LKAS. Chevy trucks will have LKAS this year.
 

I think it comes down to marketing. Most other car companies don't market their lane keep assist as a way to help with driving. Tesla started with that, including the video below with Elon. I know this is not how the system behaves in real life, but the initial marketing pitch for Autopilot really started this misperception of Autopilot's capabilities.

BTW, note the discussion at the end on how the car stopped for a stationary car. That's another marketing feature that has not become reality.

 

Or he can just look up for a split sec....
 
Also plenty of warnings where you should have taken control.

Except the screen flash/pulse warnings are just about having your hands on the wheel, right?

I.e., it's not the car warning that it doesn't know where the lane is and that the driver needs to take over, correct?

So I don't see how these warnings are relevant to the topic. It wouldn't change anything for a distracted driver who doesn't notice the car veering off course.
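
To spell out what I mean: as I understand it, the nag is keyed off whether the car senses torque on the steering wheel within some time window, not off how confident it is about the lane. Conceptually it's something like this toy sketch, where every threshold and escalation step is made up for illustration:

```python
# Toy sketch of a hands-on-the-wheel nag; the thresholds and escalation steps
# are invented for illustration and are not Tesla's actual values.

def nag_level(seconds_since_torque_sensed: float) -> str:
    if seconds_since_torque_sensed < 15:
        return "none"
    elif seconds_since_torque_sensed < 30:
        return "visual flash on the instrument cluster"
    elif seconds_since_torque_sensed < 45:
        return "audible chime"
    else:
        return "escalate: slow down / disengage"

for t in (5, 20, 35, 60):
    print(t, "->", nag_level(t))
```

Nothing in that logic knows the car is drifting toward a barrier, which is why those warnings don't help a driver who isn't watching the road.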
 

True, but it's a moot point. With AP as it currently stands, the driver should not be distracted.