Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Another tragic fatality with a semi in Florida. This time a Model 3

Status
Not open for further replies.
Bottom line: If you stop paying attention and/or holding the wheel (for whatever reason), are you safer in a car with AP than in a car without?
There is no other car company I am aware of whose CEO promotes, through his own actions, not holding the wheel. If you see a Tesla crashing, do you first think there was a driving error or an omission of driver input?
At this rate, it's more dangerous to be a traffic participant near a Tesla, as in most cases the driver will not be driving, and in a significant share of those cases they are texting, sleeping, or otherwise engaged.
The smart software avoids some incidents where the human response time is insufficient, allowing them to skew the numbers a bit. But taking their hands off the wheel is not making the traffic experience any safer for a Tesla driver or their surroundings.
 
There is no other car company I am aware of whose CEO promotes, through his own actions, not holding the wheel.
Demonstration != promotion.
See also: eating your own dog food.
If Musk does not show confidence in the future release of software, that is a worse message.
"Hey everyone, here is the software I was using with my hands on the wheel, go ahead and go hands free with it."

If you see a Tesla crashing, do you first think there was a driving error or an omission of driver input?
If it was an at-fault accident, then I know there was a driver error, which includes lack of input.

At this rate, it's more dangerous to be a traffic participant near a Tesla, as in most cases the driver will not be driving, and in a significant share of those cases they are texting, sleeping, or otherwise engaged.

So you are saying, in most cases, Tesla drivers are not paying attention?
Separating that questionable statistic from my question, if you are in traffic next to a driver who is texting, sleeping, or otherwise engaged, you are saying you would be safer if their car did not have AP? So that it would not attempt to stay in lane and match speed with traffic?

The smart software avoids some incidents where the human response time is insufficient, allowing them to skew numbers a bit. It's not making the traffic experience any safer for a Tesla driver or their surroundings to take their hands off the wheel.

If it is not making things safer for other drivers, where are the reports of cars with AP on hitting other cars? I can find a virtually unlimited number of cases of non-AP cars doing so.

Thus far, it seems AP is most dangerous to inattentive drivers, but at a lower rate than inattentive drivers of non AP cars.
 
If it was an at-fault accident, then I know there was a driver error, which includes lack of input.

I think it is extremely disingenuous to claim "lack of input" as a cause of driver error while relying on a product called Auto Pilot.

If the product were called, demoed, promoted, or hyped as "advanced lane keep assist" or "stay in lane alert," it would be easier to argue that the driver failed to react to alerts or assist-system warnings. But if a product is billed as an Auto Pilot when it cannot reliably perform the task of auto piloting, there is a fair amount of misadvertising and misrepresentation going on by the seller.

If Elon glorifies and showcases the behavior that gets his customers killed (rarely, but still)...
... that's just not cool. No matter how crafty the disclaimers are.


So you are saying, in most cases, Tesla drivers are not paying attention?

I can't speak for others, but I know I am not paying as much attention to the road when I engage AP, vs. drive without it.

That's the whole point of engaging AP - to offload the immediate steering and speed control responsibility so that you can relax and focus on something else. If my intent is to be fully in control of the vehicle, I am not engaging AP.

Disclaimers or not, that's the reality.


Separating that questionable statistic from my question, if you are in traffic next to a driver who is texting, sleeping, or otherwise engaged, you are saying you would be safer if their car did not have AP? So that it would not attempt to stay in lane and match speed with traffic?

This is a toss-up, but depending on the error rate of AP, I would like to believe that the average texting or radio-selecting driver is more accident prone than the current Tesla AP software. I don't have any data to back this up, and Tesla is not releasing raw data to validate their promotional claims, but I would like it to be true.

The trouble with the recent events is that they undermine our confidence in the AP.

And that is why this thread is going on for so long, with no sign of tapering off.

Thus far, it seems AP is most dangerous to inattentive drivers, but at a lower rate than inattentive drivers of non AP cars.

Attentive driver > AP > Inattentive driver.

?
 
And somehow we're made to believe that this guy is good for the company. He does what got his customers killed, on TV. Countless vloggers are copying it. The disclaimer on the screen is for suckers to read. And even if they state you shouldn't, they do it themselves, get hundreds of thousands of views, and some 10% in thumbs up.
This guy wants to make the system eliminate certain kinds of accidents so it looks safer on paper, but for now all he's accomplished is a cult following of people who try all the time to activate AP. And AP keeps trying, often backing out of a move at the last second.
It’s not hands off the wheel that will kill you, it’s eyes off the road that gets you killed.
 
I can't speak for others, but I know I am not paying as much attention to the road when I engage AP, vs. drive without it.

That's the whole point of engaging AP - to offload the immediate steering and speed control responsibility so that you can relax and focus on something else. If my intent is to be fully in control of the vehicle, I am not engaging AP.

Thank you for not speaking for the rest of us. I still pay attention, even more so, because I can pay more attention to my immediate surroundings instead of having to worry about my exact speed and lane position. So yes, YOU should definitely NOT use AP, since you acknowledge that you will be a hazard to yourself and others. As for the rest of us responsible people, we will use AP and still ensure that we can maintain full control of the vehicle.
 
The trouble with the recent events is that they undermine our confidence in the AP.
Paradoxically this actually makes autopilot safer. As long as these accidents are publicized enough it will remind people to pay attention while using it. It’s when people start having confidence in the system that it becomes dangerous.
 
The radar used by Tesla gives you a speed reading on the object, along with a flag. The flag says whether the object is static, moving toward you, moving away from you, or moving sideways. I have plenty of examples to demonstrate that.
Also, I would suspect that the semi was not moving super slowly into the car's path like that over 8 seconds. In fact, probably quite quickly.
That's a great explanation. I gave up trying to get the point across. You do a better job anyway. Clearly (and no surprise), you know the physics of AI and radar better than most everyone on here.
As a side note, I saw somewhere that a poster was making the wrong assumption that because the Tesla shows an object at 0 speed, it must be stationary. They aren't realizing the Tesla AI is tracking and processing the relative speeds of other vehicles. OK, maybe you can clarify that one better than me also ;)
 
I think it is extremely disingenuous to claim "lack of input" as a cause of driver error while relying on a product called Auto Pilot.
Airline autopilot does not do what you think it does. AP already does more than standard one- or two-axis systems.


That's the whole point of engaging AP - to offload the immediate steering and speed control responsibility so that you can relax and focus on something else.

No, it is to reduce the driver's workload by handling speed and lane keeping. This frees the driver to be more aware of the overall situation.

The trouble with the recent events is that they undermine our confidence in the AP.

Sure, however it may also indicate excessive belief in current AP performance.

Attentive driver > AP > Inattentive driver.

Sure, but what percentage of time are drivers attentive?
Even for attentive drivers, when checking blind spots, they are not looking ahead. When looking ahead, they are not checking blind spots. When checking their speed, changing the radio, checking the map, or sneezing, they are not focused on the road at all.
When attentive, they can override AP if it is messing up. When inattentive, AP may save them.
 
That's a great explanation. I gave up trying to get the point across. You do a better job anyway. Clearly (and no surprise), you know the physics of AI and radar better than most everyone on here.
As a side note, I saw somewhere that a poster was making the wrong assumption that because the Tesla shows an object at 0 speed, it must be stationary. They aren't realizing the Tesla AI is tracking and processing the relative speeds of other vehicles. OK, maybe you can clarify that one better than me also ;)

There is vehicle-relative speed and there is ground speed. AP cares about vehicle-relative speed: objects with a relative speed of 0 have the same ground speed as the car. Everything that is part of the scene that is not a moving object has a ground speed of 0 and a relative speed that matches the car's velocity (ignoring bearing). To remove this clutter, the radar filters out returns where the relative speed of the object (from the Doppler shift) matches the speed of the vehicle.

A car going 50 would see an overpass as an object coming toward it at 50 and filter it out. It would see a car going 50 in the same direction as having 0 relative speed and pass that target on to the AP software.
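The filtering idea above can be sketched in a few lines. This is purely illustrative, assuming a made-up list of (name, closing speed) returns; the function name, tolerance, and data are mine, not anything from Tesla's actual radar stack:

```python
def filter_static_clutter(returns, ego_speed, tolerance=1.0):
    """Drop returns whose closing speed matches the ego vehicle's own
    speed (i.e. objects fixed in the world, like overpasses); keep the
    rest as candidate moving targets.

    returns   -- list of (object_id, closing_speed) tuples, closing_speed
                 in m/s, positive = approaching
    ego_speed -- ego vehicle ground speed in m/s
    """
    return [
        (obj_id, closing)
        for obj_id, closing in returns
        if abs(closing - ego_speed) > tolerance
    ]

# A car doing ~22 m/s (about 50 mph): an overpass closes at 22 m/s and
# is filtered out; a lead car at the same ground speed closes at 0 m/s
# and is kept; an oncoming car closes at 44 m/s and is kept.
returns = [("overpass", 22.0), ("lead_car", 0.0), ("oncoming", 44.0)]
print(filter_static_clutter(returns, ego_speed=22.0))
# prints [('lead_car', 0.0), ('oncoming', 44.0)]
```

The catch the thread keeps circling: a truck stopped across the lane also closes at exactly the ego speed, so a filter like this throws it away along with the overpasses.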
 
There is vehicle-relative speed and there is ground speed. AP cares about vehicle-relative speed: objects with a relative speed of 0 have the same ground speed as the car. Everything that is part of the scene that is not a moving object has a ground speed of 0 and a relative speed that matches the car's velocity (ignoring bearing). To remove this clutter, the radar filters out returns where the relative speed of the object (from the Doppler shift) matches the speed of the vehicle.

A car going 50 would see an overpass as an object coming toward it at 50 and filter it out. It would see a car going 50 in the same direction as having 0 relative speed and pass that target on to the AP software.
I like that explanation. :D
 
Sorry this is so long and done from my phone.

After sleeping on this entire subject, here is where I land on this. All the technical jargon on this aside.

For about a year after getting my car, and after a few dead-stopped-fire-engine accidents on highways, is when this amateur Tesla owner figured out this car will not be stopping, attempting to stop, or maybe even maneuvering around stopped objects on highways at speeds higher than 35 mph, or some threshold I am still trying to pin down. I had gone a year not knowing this. I thought my car did everything. All the sheepish expressions when asking a Tesla employee a question are all coming into view now in my memory.

For two years prior to buying, and for years after, up until yesterday before I knew Jeremy was killed, I had written essays and recommended people either buy a Tesla or seriously consider it as their next car. To date I have zero referrals. There are 5 Teslas in my neighborhood and I know probably 3-4 others around my circle that own Teslas. Did I influence any of that? Either way, I now almost feel it's my duty as another human being to inform them completely that these cars are not going to save your life. In fact, they might give you a false sense that they will save your life.

I am certainly, from this point forward, shutting my mouth about helping this cause, for fear I will have blood on my hands. What makes me think this way? If I had been able to sit down and, in just a few minutes, let Jeremy know some facts he would have found enlightening, I'm sure, being a software engineer, he would have taken that info and at least investigated for himself. Would it have changed the outcome we have here? Nobody knows.

I know for sure there are many aspects of at least the Model S that cause undue loss of life. How about the blue Model S fire in Ft. Lauderdale? People reported trying to open the car door. The handles were not out, or did not pop out, upon impact. They sat and watched that person burn to death. Had they even broken the window (as one Tesla employee suggested), with all the heat and the rush, do you think they would have found the special release handle, located on the door up by the stationary wing window, that we all know about?
The one we all grab daily, if you're an owner of a Model S. Think about it. It's not easily understood.

On my first long trip in the car, I showed the other "emergency release" to my daughter, since she rides in the back seat. I wanted her to know, in the event of a crash where the electronics shut down (the back doors depend 100% on electricity), how to get out, so as to perhaps save her life. She could barely do it, after struggling to find it under her seat. So that's it: special releases for people to exit a badly damaged or burning vehicle. In the Model S, it's just the rear passengers who have to know that. Now, if any Model S owner just read this and you never knew it, well, you need to get your manual out on your computer and read it front to back, and do it again in another week.

So my point is this: this car, this entanglement of tech and not-so-tech. Before a layman, laywoman, person, or child uses it, rides in it, or drives it, you would have to hold classes on every aspect of the car, then do it again, before you could safely say: I informed, I educated, I have done my part to make sure everyone understands what to do in an event, i.e. stopped fire engines, 90-degree-facing semis, accidents, fire, and on and on. In fact, the Model S is the only car I have ever owned where you would have to do that. Dare I say, there are other Teslas where it's a requirement; I just have not read the manual and done the in-depth study on those vehicles like I have here. Still learning.

Now, Jeremy was competitive. He was also a jealous personality. He was human. How do I know this? The camera quote in my previous posts. He, like so many humans, fed his ego on being ahead of you on your tech; he was a software engineer, after all. I cannot help but think he kept up with us through Facebook, and with those posts of my red Tesla Model S, well, he was going to do one better with a red Model 3. You see, we are the marketing team for Tesla. The whole universe is looking at our cars. Last night I must have had at least 6-10 people I noticed at lights looking at my car. Those are the ones I noticed; probably well over 1,000 in the whole trip. These are special cars, and they take special understanding. Even then, I am not convinced they are that special any longer. I'm sorry for myself, for others, and for the planet, because the dream of an electric car has been mine from a very early age. Unfortunately, Tesla has taken it too far, too fast, and made it so human beings are not able to fully understand what they have gotten into here. Elon Musk wants to win; the cost of that is steep. The cause I get; the speed at which he is trying to do it, that's business.

I in no way blame myself for anything, but I do feel it important to educate. Perhaps I have found my next calling going forward. Thanks for reading, and safe travels in whatever you're driving. Have a great weekend.
 
Airline autopilot does not do what you think it does. AP already does more than standard one- or two-axis systems.
No, it is to reduce the driver's workload by handling speed and lane keeping. This frees the driver to be more aware of the overall situation.
Yup, that is exactly how I use AP, both on the ground and in the air. It lets me focus on other things around me that I normally would not be able to give as much attention to without AP.
And yeah, my King Air does a lot of very impressive AP stuff that helps a lot.
 
Paradoxically this actually makes autopilot safer.

Not really a paradox: not trusting AP is objectively safer because that is the way it is designed to be used, the human supervisor is the backup when things start going wrong.

The real paradoxical trick is how to train oneself to stay effectively engaged in the task of supervising, while the system naturally lulls one into overestimating its competence, thus tending to relax too much, which leads directly to crunchy-time.
 
Not really a paradox: not trusting AP is objectively safer because that is the way it is designed to be used, the human supervisor is the backup when things start going wrong.

The real paradoxical trick is how to train oneself to stay effectively engaged in the task of supervising, while the system naturally lulls one into overestimating its competence, thus tending to relax too much, which leads directly to crunchy-time.
The paradox is that making autopilot better may actually make it less safe.

What's crazy to me is that the three autopilot related deaths in the United States have all been engineers.
 
The funny thing about the debates around the name Autopilot, and whether it conveys a false sense of its capabilities to the general non-pilot public, is that Tesla has actually one-upped the name game.

What people buy today when they get the package with NoA and Summon is now called Full Self Driving. If people think the AP name is misleading, the FSD name is going to be even more misleading for years to come, as I seriously doubt Tesla will be allowing eyes-off/hands-off usage for a very long time. Sure, sure, there is some fine print and a disclaimer that "Full Self Driving" isn't actually anything of the sort, but with the way Tesla is marketing it, people are going to misunderstand.

Sad as it is, we may see more of these types of events, not fewer. There was a fellow who recently bragged here about using AP to keep him safe while driving when he had 5 microsleep events, instead of pulling over and taking a nap like he should have.
 
Sorry but I haven’t read this entire thread to see what others have said.

I use Autopilot in all kinds of driving situations: highway, city, back country roads at 60 mph with windy curves. Every second of the time, I'm paying attention exactly the same as if it weren't even turned on. My hands are ready to take over at a moment's notice, like a trigger-happy cowboy ready to fire. Even though it's very good in most situations, I just don't trust it. It's tried to take me into the other lane on a slight curve, and many times on a 90° turn. It's jerked hard left and right more than a few times. Every time I'm close to a large object like a semi, I'm ready, not only for the mistakes it might make but also for the mistakes of other drivers.


Even with all of that, it's very useful and takes the stress out of driving. If all users had this mentality, I feel these crashes due to its faults wouldn't occur at all. People get too comfortable and complacent. With a system like Autopilot, the negative outcomes can be deadly, as we've seen more than a few times. Tesla needs to be more repetitive and detailed about its capabilities, or lack thereof. The warnings you do get just don't hammer it home enough.
 