Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Another tragic fatality with a semi in Florida. This time a Model 3

Status
Not open for further replies.
I eat pizza while I have Autopilot on. Worst-case scenario, I have to drop my pizza to grab the steering wheel if something happens.
Damn you eat pizza in your Tesla? I love pizza, but it's not allowed in my car, makes the steering wheel all gooey and the mozzarella is hard to get out of the carpet!

But seriously, it all comes down to personal responsibility: the driver must be responsible for everything that happens with the car while it is under his/her control.
 
"FSD" doesn't mean anything other than that it is the name for an option package. It does not mean literally Full Self Drive.
And yet, that's literally what Musk said in the January earnings call:

"We already have full self-driving capability on highways. So from highway on-ramp to highway exit, including passing cars and going from one highway interchange to another, full self-driving capability is there"
 
  • Helpful
Reactions: Kant.Ing
And yet, that's literally what Musk said in the January earnings call:

"We already have full self-driving capability on highways. So from highway on-ramp to highway exit, including passing cars and going from one highway interchange to another, full self-driving capability is there"
Yeah, and I wish he hadn't said that, but I do think EAP has come a long way in a short period of time. Let's be real, however: no matter how good it gets, there will still be accidents like a firetruck or other truck coming into the path with little warning. In some cases no human or AI driver will be able to avoid a collision.
 
We have a new M3, and I just saw behavior that suggests the truck perhaps was not identified by AP. A truck with a flatbed trailer turned left about 50 m in front of me while AP was engaged, and my car immediately slowed when the truck cab entered my lane. But as soon as the cab was out of the way and only the flatbed trailer was across my lane, the car sped up as if things were all clear. I decided it was time to manually brake rather than test the theory/hope that the trailer was visible to AP. I would have thought this problem would have been fixed after that first incident in FL last year. But it seems not.

This and many other "unusual" situations already encountered in less than 2 months of M3 ownership make me skeptical of claimed FSD any time soon. I love the car! It has amazing capabilities. But the hype bothers me.
 
  • Informative
Reactions: OPRCE
We have a new M3, and I just saw behavior that suggests the truck perhaps was not identified by AP. A truck with a flatbed trailer turned left about 50 m in front of me while AP was engaged, and my car immediately slowed when the truck cab entered my lane. But as soon as the cab was out of the way and only the flatbed trailer was across my lane, the car sped up as if things were all clear. I decided it was time to manually brake rather than test the theory/hope that the trailer was visible to AP. I would have thought this problem would have been fixed after that first incident in FL last year. But it seems not.

This and many other "unusual" situations already encountered in less than 2 months of M3 ownership make me skeptical of claimed FSD any time soon. I love the car! It has amazing capabilities. But the hype bothers me.
I think you have to be in control. I'd never rely on the AP to do what you think it should do. If you see a problem coming, take command. AP is not smarter than a human; it's just quicker, IF it understands the problem.
 
  • Like
Reactions: Msjulie and mongo
I think you have to be in control. I'd never rely on the AP to do what you think it should do. If you see a problem coming, take command. AP is not smarter than a human; it's just quicker, IF it understands the problem.
In addition, no one has the new neural net software yet, so only the older software is running. The demonstrations at Investor Day were impressive, but that software has not been rolled out yet.
 
Looks like Lenny was onto something:

Family of Jeremy Banner are getting lawyered-up with a view to suing Tesla/Musk for AP-assisted fatality.
Yeah, good luck with that.
It states clearly in the manual that Autopilot should not be engaged on streets with crossing traffic.
It also says the driver should be able to take control.
Jeremy engaged it where it should not have been engaged, and did not take any action to avoid hitting the truck. They will lose.
 
Yeah, good luck with that.
It states clearly in the manual that Autopilot should not be engaged on streets with crossing traffic.
It also says the driver should be able to take control.
Jeremy engaged it where it should not have been engaged, and did not take any action to avoid hitting the truck. They will lose.


Yup.

Same deal with the person who died the same way in a Tesla a couple years ago.

Family got $0.00 for it, because it was the driver's fault; the NHTSA report was pretty clear on that.
 
Yeah, good luck with that.
It states clearly in the manual that Autopilot should not be engaged on streets with crossing traffic.
It also says the driver should be able to take control.
Jeremy engaged it where it should not have been engaged, and did not take any action to avoid hitting the truck. They will lose.

I think you are missing the point, which is that, whatever the legal outcome, the horrible PR from each AP fatality piling up contradicts Tesla's "safest car" and "imminent FSD" narratives in the popular mind, thus helps depress sales/stock price, and also increases the risk of regulatory intervention.
 
  • Like
Reactions: afadeev
I think you are missing the point, which is that, whatever the legal outcome, the horrible PR from each AP fatality piling up contradicts Tesla's "safest car" and "imminent FSD" narratives in the popular mind, thus helps depress sales/stock price.

Thinking a bit about the specifics of Autopilot, it is easy to wonder why Tesla does not offer some form of training to highlight the areas of its automation functionality that users should be aware of.

We are fortunate, on this forum, to have one another, and we frequently exchange thoughts, questions, and concerns. The complexities and differences of the Tesla platform are such that anyone would benefit from such an initiative.

By way of example, the airline industry has recently been pushing back on changes by manufacturers, requiring training where there have been significant enhancements or modifications to the operation of aircraft. It is probably a good idea for Tesla to consider adding ways to inform owners and train them in its use.

It would be a great enhancement to add an annotated video to the digital owner's manual. As new features get added, these could be hot-linked to the release notes.

Early in our ownership, Tesla emailed us about a Q&A and training gathering at the local showroom and service center, and we joined about 20 other new Tesla owners. It was offered as a workshop for new owners who had not yet taken delivery of their vehicles. It was adorable watching the children of the owners answer questions from other adults in the room - and there were many questions. Given the rate of enhancements to the software, existing owners would likely benefit from similar workshops. Since the closing of the showroom, I have not heard of other workshops.
 
I think you are missing the point, which is that, whatever the legal outcome, the horrible PR from each AP fatality piling up contradicts Tesla's "safest car" and "imminent FSD" narratives in the popular mind, thus helps depress sales/stock price, and also increases the risk of regulatory intervention.
No, I get that.
Listen, Autopilot needs to be able to see a tractor-trailer crossing in front of it and stop if needed. The fact that it still can't even 3 years after the Brown death is concerning.
The fact that people continue to trust such a system and die is just stupid.
 
Thinking a bit about the specifics of Autopilot, it is easy to wonder why Tesla does not offer some form of training to highlight the areas of its automation functionality that users should be aware of.

We are fortunate, on this forum, to have one another, and we frequently exchange thoughts, questions, and concerns. The complexities and differences of the Tesla platform are such that anyone would benefit from such an initiative.

By way of example, the airline industry has recently been pushing back on changes by manufacturers, requiring training where there have been significant enhancements or modifications to the operation of aircraft. It is probably a good idea for Tesla to consider adding ways to inform owners and train them in its use.

It would be a great enhancement to add an annotated video to the digital owner's manual. As new features get added, these could be hot-linked to the release notes.

Early in our ownership, Tesla emailed us about a Q&A and training gathering at the local showroom and service center, and we joined about 20 other new Tesla owners. It was offered as a workshop for new owners who had not yet taken delivery of their vehicles. It was adorable watching the children of the owners answer questions from other adults in the room - and there were many questions. Given the rate of enhancements to the software, existing owners would likely benefit from similar workshops. Since the closing of the showroom, I have not heard of other workshops.

Yes, definitely agree Tesla should be working on proactive mitigation/education strategies.

e.g. a little cartoon video should pop up on the MCU, say every 20th time the car is unlocked, or upon request, explaining exactly where the AP weaknesses in the current software version lie, so owners are regularly reminded and those perhaps borrowing the car for a while can be thoroughly educated as to where the pitfalls lie. This would be a lot better investment than grating our nerves with fart apps and Atari rubbish.
 
Yes, definitely agree Tesla should be working on proactive mitigation/education strategies.

e.g. a little cartoon video should pop up on the MCU, say every 20th time the car is unlocked, or upon request, explaining exactly where the AP weaknesses in the current software version lie, so owners are regularly reminded and those perhaps borrowing the car for a while can be thoroughly educated as to where the pitfalls lie. This would be a lot better investment than grating our nerves with fart apps and Atari rubbish.

Unless, of course, Tesla adds a driving simulation to the list as part of onboard training. As Tesla adds 3D render engines (e.g. Unity and Unreal) to its cars, this could be very useful. It would be a real hit with new drivers, and the kids.
 
  • Like
Reactions: OPRCE
The problem for Tesla is that the ad-hoc strategery of trying to pin all the blame on its deceased customers is rapidly wearing thin; therefore it is in its interest to intelligently pre-empt before more severe consequences are imposed.

You used that word. I do not think it means what you think it means:
Ad hoc: made or happening only for a particular purpose or need, not planned before it happens.

Tesla has always said the driver is in charge, thus it cannot be ad hoc.

I'm not so sure about that. The manual says Autosteer should only be used on "highways and limited-access roads". The accident happened on SR7/US highway 441 which probably meets that definition.

Full warning:
Warning: Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic. Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present. Never depend on Autosteer to determine an appropriate driving path. Always be prepared to take immediate action. Failure to follow these instructions could cause damage, serious injury or death.

Additional autosteer warning:
Warning: Autosteer is not designed to, and will not, steer Model S around objects partially or completely in the driving lane. Always watch the road in front of you and stay prepared to take appropriate action. It is the driver's responsibility to be in control of Model S at all times.

TACC:
Warning: Traffic-Aware Cruise Control cannot detect all objects and, especially in situations when you are driving over 50 mph (80 km/h), may not brake/decelerate when a vehicle or object is only partially in the driving lane or when a vehicle you are following moves out of your driving path and a stationary or slow-moving vehicle or object is in front of you. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death. In addition, Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately.

Mongo's bottom line: autopilot does more to protect people from themselves than pretty much any other car, and definitely more than a car without such features.
 
  • Like
Reactions: afadeev
Reading all the restrictions/limitations/warnings involved in operating AP/FSD safely, considering that you need to give the system your undivided attention while keeping your hands on the wheel 100% of the time, and also expecting unexpected actions from the car, I wonder if it is even worth investing in this system today.
 
  • Like
Reactions: afadeev
You used that word. I do not think it means what you think it means:
Ad hoc: made or happening only for a particular purpose or need, not planned before it happens.

Tesla has always said the driver is in charge, thus it cannot be ad hoc.

We agree on the word's definition, and the manual's legalese in itself has always been clear enough; what is not clear is that Musk had a coherent strategy, planned in advance, for dealing with the aftermath of AP's perception gaps. This is where things IMHO become ad hoc: more of an embarrassed rear-guard action to cover for his history of mixed messaging conflating FSD with AP in the minds of non-experts.

Mongo's bottom line: autopilot does more to protect people from themselves than pretty much any other car, and definitely more than a car without such features.

I think it is arguable both ways at the moment, but we cannot know definitively until Tesla opens its data to an independent third party/authority for a proper analysis.
 
  • Helpful
  • Like
Reactions: afadeev and mongo
Mongo's bottom line: autopilot does more to protect people from themselves than pretty much any other car, and definitely more than a car without such features.

Reading all the restrictions/limitations/warnings involved in operating AP/FSD safely, considering that you need to give the system your undivided attention while keeping your hands on the wheel 100% of the time, and also expecting unexpected actions from the car, I wonder if it is even worth investing in this system today.

Therein lies the problem with Tesla's "AP is safer" argument: the disclaimers absolve AP/EAP of any responsibility to work properly under any and all conditions, yet Tesla and its fans keep arguing that AP is still safer than not using it - even though none of us, outside of Tesla, have seen the data to back up that safety claim.

Thus that claim is propagated on faith, not data. And the discussion crosses into the domain of religious convictions, outside of scientific evaluation, with all the vitriol that comes with that territory.

Those two arguments are logically and morally in contradiction with each other.
AP can't be safer if the company is telling you it isn't expected to work properly... just about ever.

It's mostly a matter of time before a sympathetic jury puts a price tag on that fallacy.

My personal, statistically unsubstantiated, belief is that AP is a helpful driver aid tool that improves safety on average, but is deadly under certain poorly documented and ever evolving circumstances.


P.S.: The more I drive with AP ON, and the more I read these forums, the better I build up my personal heuristics on when I expect AP to fail. I wish Tesla were honest and upfront about sharing those corner cases with the owners of its products, but I also understand why doing so would be bad publicity for the company, and an ego-deflating event for Elon. Thus, I'm not holding my breath.
 
  • Like
Reactions: dhrivnak and OPRCE