Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

A Public Letter to Mr. Musk and Tesla For The Sake Of All Tesla Driver's Safety

But I personally vote for geofencing AP and excluding two lane, undivided roads (for now). It shouldn't be used there anyway, so owners won't be losing any supported functionality.

So you don't want auto-pilot to EVER be used on these types of roads? Tesla's path for creating a working autonomous driving system is to have drivers train the system. That means at some point the system needs to be in control and have drivers correcting any issues. Geofencing prevents this.

Thank you kindly.
 
Fatigue is also a big factor, since it's a pretty long and stressful drive (especially coming back home at night, after a long day at the beach, trying to see the road past everybody's headlights).
This is an underappreciated value of auto-steer: by reducing the visual demand on the driver during "routine" driving -- particularly the need to watch the center line and thus stare into the headlights of oncoming cars -- the driver's visual acuity is spared for those times when it really matters. This benefit hadn't occurred to me until I was surprised to find my wife -- who's generally pretty skeptical about AP -- using it at night on a two-lane road. When I asked her she said she actually felt safer driving at night on that sort of road with AP "having her back", as it were.
 
Do you mean that while you cannot predict what, exactly, AP will do, you can predict that it is going to do something wacky and unexpected?

Yes, that's what I meant. If you spot something ahead you know will confuse AP then you can disable it and take full control.

If you think everything ahead looks normal and you're using AP the way Tesla recommends, but it gets confused by something it senses and suddenly steers in a different direction, then it doesn't matter how attentive you are being: there is a chance AP could put you in a dangerous position, and that's not a good situation. But is that scenario actually happening?
 
This appears to be a contradiction. Do you mean that while you cannot predict what, exactly, AP will do, you can predict that it is going to do something wacky and unexpected?
I think you can predict when the risk rises by trying to anticipate what might confuse a machine vision system. So on interstates that would obviously include construction areas / lane shifts, faded markings, bad skid marks, long underpasses and big trucks. On two lane roads that would include shaded areas entering a turn, stripy tree shadows, right hand exit lane coming off a left hand turn, change in appearance of surface due to road repairs, a line of oncoming cars, and (especially) hill crests. I like to use AP with my hands off the wheel, but I put a hand on or very near the wheel when circumstances like these arise.
 
So you don't want auto-pilot to EVER be used on these types of roads? Tesla's path for creating a working autonomous driving system is to have drivers train the system. That means at some point the system needs to be in control and have drivers correcting any issues. Geofencing prevents this.

While we don't have very clear insight into how fleet learning works, this is very unlikely to be a correct representation. In your defense, it's often cited in AP threads - use AP to train it - but that's a form of reinforcement learning, which is rarely deployed in commercial environments. More likely, the training occurs from taking driver feedback while AP is disabled. Taking sensor data along with the driver's maneuvers is a much quicker and more efficient way to train a model. It also is a better way to build the high precision maps that Elon has referenced.
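To make the distinction concrete, here's a rough sketch in Python of what that kind of "shadow mode" collection could look like: the system logs sensor features alongside the driver's actual steering input as labeled training pairs, and only while AP is disengaged. Every field name here is invented for illustration; nobody outside Tesla knows what their actual pipeline records.

```python
# Hypothetical sketch of shadow-mode data collection: while the human is
# driving manually, log perceived road features together with the driver's
# steering command as supervised (features, label) training pairs.
# All names and fields are invented; this is not Tesla's actual code.

from dataclasses import dataclass

@dataclass
class Sample:
    lane_offset_m: float      # perceived lateral offset from lane center
    curvature_1pm: float      # perceived road curvature (1/m)
    driver_steer_deg: float   # what the human actually did (the label)

log: list[Sample] = []

def on_frame(lane_offset_m: float, curvature_1pm: float,
             driver_steer_deg: float, autopilot_on: bool) -> None:
    # Only record when the human is driving: their maneuvers are the labels.
    if not autopilot_on:
        log.append(Sample(lane_offset_m, curvature_1pm, driver_steer_deg))

on_frame(0.3, 0.001, -2.5, autopilot_on=False)  # manual driving: logged
on_frame(0.1, 0.000, -0.5, autopilot_on=True)   # AP engaged: not logged
print(len(log))  # 1
```

The point of the sketch is that a model trained this way learns from human driving whether or not AP is ever engaged on a given road, which is why geofencing wouldn't necessarily starve the training data.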

I think disabling AP on those roads does nothing to set back the future abilities of AP. Remember that Elon also referenced the thousands of human driver experts that were training the system prior to AP being released.

I continue to stand by my assertion that if Tesla says not to use it on these roads, they should just go ahead and disable it there.
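For what it's worth, the gating I'm suggesting is trivial to express if the car already has a map-derived road classification. This is purely a sketch of the idea; the road-class names and the two-lane rule are my assumptions, not anything from Tesla:

```python
# Hypothetical sketch of road-class gating ("geofencing") for auto-steer:
# allow the feature only on divided, multi-lane roads, per the owner's
# manual guidance. Class names and rules are illustrative assumptions.

ALLOWED_ROAD_CLASSES = {"motorway", "divided_highway"}

def autosteer_available(road_class: str, lanes_per_direction: int) -> bool:
    """Allow auto-steer only on divided roads with 2+ lanes per direction."""
    if road_class not in ALLOWED_ROAD_CLASSES:
        return False
    return lanes_per_direction >= 2

print(autosteer_available("motorway", 3))            # True
print(autosteer_available("two_lane_undivided", 1))  # False
```

The hard part obviously isn't the check itself; it's having map data accurate and fresh enough to trust it, which may be exactly why Tesla hasn't done this.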
 
This appears to be a contradiction. Do you mean that while you cannot predict what, exactly, AP will do, you can predict that it is going to do something wacky and unexpected?
This is why I don't use AP that often. Where I live (Southern California) it's hard not to run into one of those moments. On one very common 40-minute route down the 101 from Ventura to L.A., I encounter freeway construction, faded lines mixed with freshly painted ones, emergency cones, tractor trailers on my right, and so on at least 6-8 times each way. Knowing that it has acted squirrelly in situations like that means it's not worth turning on, and it wouldn't be relaxing.
 
I'm back to wondering whether it's possible for a Tesla driver to look ahead at a change in the road surface, markings, street furniture etc. and be able to predict that AP will be confused by it enough to react unpredictably.

Definitely. I'd say about 75% (very, very roughly) of the time Autopilot misbehaves for me, I've already spotted the potential trouble and am ready to take over. Sometimes it surprises me, but usually not. There are certain situations where it fails fairly often, like extremely faded lines or cresting a sharp hill.

When I asked her she said she actually felt safer driving at night on that sort of road with AP "having her back", as it were.

I think a lot of people don't realize that it's not a question of Autopilot or human driving, but that you can do both at once. You don't have to disengage your brain when you engage Autopilot. If I'm cruising on a wide highway in the daytime with light traffic and good lane paint, I'll certainly disengage some. But if I'm on a little two-lane road at night, I'm fully engaged, even if Autopilot does the grunt work.

The real question is whether Autopilot can lull you into disengaging when you shouldn't. I don't think it does for me, and I don't think it has to for any conscientious, responsible driver.
 
[O]n interstates that would obviously include construction areas / lane shifts, faded markings, bad skid marks, long underpasses and big trucks. On two lane roads that would include shaded areas entering a turn, stripy tree shadows, right hand exit lane coming off a left hand turn, change in appearance of surface due to road repairs, a line of oncoming cars, and (especially) hill crests.
Now I'm not disagreeing that any of these things are issues for AP, but do you really think it is reasonable for a mass-market product to expect its users to be constantly on the lookout for such a large (and certainly not even exhaustive) set of factors? At some point here the whole "well, the driver is ultimately responsible" argument starts to get a little bit silly -- you could use it to argue against any number of now-mandatory safety features, saying "well, just be on the lookout for ... ... ... ... and you'll be fine -- just remember, you're the one driving".
 
Well, lovely. Now this thread is an article (of sorts): Model X Driver Says Tesla Drivers Are "Lab Rats" - Gas 2. I'm dismayed (but not surprised) that a clickbait title was used.
The author misses that Tesla did quickly contact the driver, and fails to note that this specific road type was not how AP was meant to be used.
Thanks Bonnie. The author of that article gets a few things correct and a lot of things incorrect, showing he didn't spend enough time reading through this thread, let alone doing much other research before writing his misinformed opinions.

There are so many websites seeking attention that articles like that inevitably get put online with stupid titles, as you noted. Nothing anyone can do about it but make clear their criticisms and their point of view on the incident in question. This thread contains a lot of useful information about the incident, but the author of that article failed to take note of it.

The author's wife sounds like the smarter one of the two...
 
Now I'm not disagreeing that any of these things are issues for AP, but do you really think it is reasonable for a mass-market product to expect its users to be constantly on the lookout for such a large (and certainly not even exhaustive) set of factors? At some point here the whole "well, the driver is ultimately responsible" argument starts to get a little bit silly -- you could use it to argue against any number of now-mandatory safety features, saying "well, just be on the lookout for ... ... ... ... and you'll be fine -- just remember, you're the one driving".

Wait. What exactly are you disagreeing with here? The entire act of driving is to be constantly on the lookout. Driving with autopilot engaged doesn't change that need to maintain situational awareness, but it still reduces the workload for the pilot.
 
Now I'm not disagreeing that any of these things are issues for AP, but do you really think it is reasonable for a mass-market product to expect its users to be constantly on the lookout for such a large (and certainly not even exhaustive) set of factors? At some point here the whole "well, the driver is ultimately responsible" argument starts to get a little bit silly -- you could use it to argue against any number of now-mandatory safety features, saying "well, just be on the lookout for ... ... ... ... and you'll be fine -- just remember, you're the one driving".

What's so crazy about that? Drivers already have to look out for construction zones, faded markings, and much more. Autopilot reduces the amount of stuff you need to watch out for, it doesn't increase it.
 
On an S test drive in June the DS had me put AP on and we drove right through a construction zone with cones, curves, traffic, etc.
Looking back on it, I don't think I would use it in those situations myself. Of course, as a test driver I had my hands at 10 and 2 o'clock, nervous and waiting to take over :) The car did just fine, including stopping for traffic from 40 mph and then stop-and-go, albeit for only about 1/4 mile. I disagree that Tesla should disable AP in these situations, as the car had it under control. Of course, I have about 2 minutes of AP experience and cannot wait to use it every day.
 
Wait. What exactly are you disagreeing with here? The entire act of driving is to be constantly on the lookout. Driving with autopilot engaged doesn't change that need to maintain situational awareness, but it still reduces the workload for the pilot.
If I'm not using AP I'm never going to have to override it, but if AP might do something I would never do on my own then I take on an additional burden of not only driving safely but also ensuring that the (other) nut "at the wheel" doesn't do anything dangerous.
 
What's so crazy about that? Drivers already have to look out for construction zones, faded markings, and much more. Autopilot reduces the amount of stuff you need to watch out for, it doesn't increase it.
See previous reply. What, exactly, do you need to "watch out for" about faded markings? Speaking for myself, lane markings are only one of many clues as to where to drive, and many roads I drive on frequently have little-to-no markings, so they're among the least significant things in my driving environment.

Something I wish the AP (particularly, auto-steer) system would "learn" is that very often the best thing to do is "nothing", as in "keep doing what you were doing before, or some reasonable extrapolation of that". I guarantee you that's exactly what your brain is doing 99% of the time you're driving. Most of the complaints I've heard about auto-steer are of the form "it did something unexpected for no apparent (or, at least, good) reason", not "it should obviously have done X but instead it did nothing".
 
Now I'm not disagreeing that any of these things are issues for AP, but do you really think it is reasonable for a mass-market product to expect its users to be constantly on the lookout for such a large (and certainly not even exhaustive) set of factors? At some point here the whole "well, the driver is ultimately responsible" argument starts to get a little bit silly -- you could use it to argue against any number of now-mandatory safety features, saying "well, just be on the lookout for ... ... ... ... and you'll be fine -- just remember, you're the one driving".
You raise a very good point! Yes, there's a bit of a paradox here, or internal tension. Personally, I drive with AP as if I'm a beta tester: I'm not complacent when I use it, and I pay as much attention to the road as I would when steering, although I focus on different things. I use AP because it's interesting, not because it's relaxing (although if I were in a stop-and-go commuting situation it probably would be relaxing).

The paradox is that AP is supposed to take a load off the driver in some sense, but if you relax to the point of becoming defocused, you expose yourself to finding yourself in a situation where an accident has occurred and people are reminding you that it's your fault (although I think it really was Mr. Pang's fault, because AP shouldn't be used on that road at all). I think people who use AP and relax are effectively making a bet, and accept this risk because AP is statistically reliable in normal conditions on the right roads. Most of them will win the bet.

Given all this, is it really suitable to be a mass-market product at this point? Good question, although for selfish reasons I hope it does not become further restricted.
 
put AP on and we drove right thru a construction zone
I'm not sure I'd recommend this to anyone, but I think it may actually be true that AP can navigate these conditions better than many drivers. The lines in construction zones are typically freshly painted (when there are deviations) and the car has no "expectation" about where the lanes should go (which might disagree with reality), so it just calmly follows whatever weird twists and turns have been added. TACC is really good at dealing with the unpredictable speeds encountered in such areas and isn't distracted by all of the visual chaos around it like a human driver might be.
 
While I consider Pang's account completely absurd, there is an element of truth that I inadvertently discovered yesterday. While driving on the highway with AutoPilot, the lane I was in slowly expanded to two lanes (a carpool lane splits off from the left lane) and the Model X didn't know what to do.

So as expected, the car showed the "take over immediately" notification, beeped at me continuously, and started to slow the car. I jiggled the steering wheel to disable Auto Steer, and the warning went away. Then all of a sudden my Model X which had slowed to about 60 mph took off without warning, accelerating to 75 mph!

This ridiculous bit of illogic was because Traffic-Aware Cruise Control (TACC) was set to 75 mph, so when Auto Steer became disabled after it slowed the car, the Model X tried to quickly accelerate back to the TACC speed. Surprised the heck out of me, and I felt lucky there weren't any other cars around me as I tried to understand what was going on.
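To show why this behavior, surprising as it is, follows mechanically from treating Auto Steer and TACC as separate layers, here's a toy sketch of the speed logic as I experienced it. The numbers and the structure are my guesses for illustration only, not Tesla's actual control code:

```python
# Toy sketch of the Auto Steer / TACC interaction described above:
# the two are separate layers, so turning Auto Steer off leaves TACC
# free to command its set speed again. Logic and numbers are
# illustrative assumptions, not Tesla's real controller.

def target_speed(tacc_set_mph: float, current_mph: float,
                 autosteer_on: bool, autosteer_fault: bool) -> float:
    if autosteer_on and autosteer_fault:
        # "Take over immediately": Auto Steer bleeds off speed
        return min(current_mph, tacc_set_mph) - 5
    # TACC alone: command the set speed, even if the car just slowed
    return tacc_set_mph

# Auto Steer faulted and has slowed the car to 60 mph...
print(target_speed(75, 60, autosteer_on=True, autosteer_fault=True))   # 55
# ...driver jiggles the wheel, Auto Steer turns off, TACC resumes 75 mph
print(target_speed(75, 60, autosteer_on=False, autosteer_fault=True))  # 75
```

In other words, nothing "malfunctioned" in my incident: the TACC set point simply survived the Auto Steer disengagement, which is exactly the trap a surprised driver could fall into.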

So perhaps Mr. Pang grabbed the steering wheel due to the "take over now" warning and vehicle slowing, and then TACC kicked in and accelerated him right into those wooden posts? I now say that's possible, considering my experience last night.

And why didn't the Model X apply emergency braking after the first collision? My 2010 Infiniti FX50 would stop completely if it detected an imminent collision, so why can't a 2016 Tesla stop completely if it actually has a collision? Very strange - I don't know that Pang's or Tesla's stories are 100% accurate... the truth is somewhere in the middle, I'm guessing.

Why does Tesla even allow Auto Steer to function in areas where their own manual says it shouldn't be used? While the driver should be expected to have some common sense, it's supremely stupid that Tesla says to use it only on a divided highway but lets me use it on the two-lane, undivided country road leading to my house.