Another tragic fatality with a semi in Florida. This time a Model 3

This was a tragic accident that hit close to home for me. I’m on that road every day. I feel very badly for the driver and all involved.

But three salient points:

(1) Autopilot (as opposed to TACC alone) should not have been engaged on this road. It is not a limited-access, exit/entry-ramp type of highway. There are numerous private drives and public and private streets intersecting the road, and frequent traffic signals. The M3 manual could not be clearer that this is not a road suitable for Autopilot. I don’t mean to cast blame on the driver, but there are conservative rules for safe Autopilot driving, and bad things can happen when Tesla’s guidance is not followed.

(2) We shouldn’t even be discussing Autopilot in connection with this crash. Autopilot keeps the car in lane. That’s it. It does not stop the car. We should be talking about the performance of the automatic emergency braking (AEB) system. AEB is supposed to bring the M3 to a full stop with TACC engaged if it sees an obstacle. Even without TACC engaged, it is designed to at least slow the car if it sees an obstacle. Without any driver inputs, AEB should have intervened here. It did not. The M3 simply did not see the semi blocking the roadway. It is easy to understand why: the stainless steel trailer literally blended into the grey roadway surface. We can and should focus on the AEB performance in this difficult edge case; no need to confuse the public with Autopilot talk.

(3) The driver had ten seconds after engaging Autopilot — 1000 feet at 68 mph — to see, process and respond to the semi blocking all lanes. He should easily have seen the tractor pulling the trailer across from a quarter mile away or more. He needed only 133 feet to panic stop. It doesn’t matter if his hands were on or off the wheel for that 10-second period, if he was not looking at where he was going — or not seeing what was in front of him. I believe it may have been as hard for him to see as for M3’s cameras.

Like I said, a terrible tragedy. Sympathies to the victim and family.
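
For anyone who wants to sanity-check the figures in point (3), here is a quick back-of-envelope sketch. The constant 1.15 g deceleration is my own assumption, chosen because it roughly reproduces the 133-foot figure; it is not from the preliminary report.

```python
# Back-of-envelope check of the timing in point (3) above.
# Assumption (mine, not from the report): constant speed before braking and
# a constant ~1.15 g deceleration during the panic stop.
MPH_TO_FPS = 5280 / 3600                 # 1 mph = ~1.467 ft/s

speed_fps = 68 * MPH_TO_FPS              # ~99.7 ft/s
sight_time_s = 10
sight_distance_ft = speed_fps * sight_time_s     # ~997 ft, i.e. "about 1000 feet"

decel_g = 1.15                           # assumed peak braking, in g
decel_fps2 = decel_g * 32.174            # ft/s^2
stop_distance_ft = speed_fps ** 2 / (2 * decel_fps2)   # ~134 ft
stop_time_s = speed_fps / decel_fps2                   # ~2.7 s

print(f"distance covered in {sight_time_s} s at 68 mph: {sight_distance_ft:.0f} ft")
print(f"panic-stop distance at {decel_g} g: {stop_distance_ft:.0f} ft")
print(f"panic-stop time: {stop_time_s:.1f} s")
```

In other words, even granting several seconds of reaction time, the driver had far more distance available than a panic stop would have required.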
 
So, my uninformed and probably wrong thought about this is: 1. If the radar OR the visual detectors see a NEW object in the scene, no matter what its speed, it's obviously something that needs to be taken into account. 2. If the radar is Doppler-only, i.e. there is no "return" from a non-moving or slow-moving object for whatever reason, then the cameras have to do the work.

In other words, "Hey, I've got a new, fairly large contact dead ahead. It wasn't there a second ago, I need to stop/slow down." I'm sure this is harder than it sounds from a programming perspective.
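
As a toy illustration of that rule (every name, field, and threshold below is hypothetical; this is a sketch of the idea above, not Tesla's actual logic):

```python
# Toy sketch of the "new, large contact dead ahead" rule described above.
# All names and thresholds here are made up for illustration.
from dataclasses import dataclass

@dataclass
class Track:
    source: str                # "radar" or "camera"
    distance_m: float          # range to the object, metres
    width_m: float             # estimated lateral extent, metres
    age_s: float               # how long this track has existed
    closing_speed_mps: float   # positive = we are closing on it

def should_brake(track: Track) -> bool:
    """Flag a newly appeared, large, in-path object even if it is nearly
    stationary (i.e. would give little or no Doppler return)."""
    is_new = track.age_s < 1.0
    is_large = track.width_m > 1.5
    time_to_collision_s = (track.distance_m / track.closing_speed_mps
                           if track.closing_speed_mps > 0 else float("inf"))
    return is_new and is_large and time_to_collision_s < 4.0

# Example: a trailer-sized object appears 80 m ahead while closing at ~30 m/s.
trailer = Track(source="camera", distance_m=80, width_m=15,
                age_s=0.5, closing_speed_mps=30)
print(should_brake(trailer))   # True -> slow down / stop
```

Even this toy version hints at the hard part: deciding "new, large, and actually in my path" reliably from noisy returns, without phantom-braking for overpasses and roadside signs, is exactly where it gets difficult.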
 
I believe it may have been as hard for him to see as for M3’s cameras.
According to surveillance video in the area and forward-facing video from the Tesla, the combination vehicle slowed as it crossed the southbound lanes, blocking the Tesla’s path.
The truck was visible to the Tesla's forward-facing cameras.

no need to confuse the public with Autopilot talk.
To me this is like saying that fentanyl is not a problem since it can be used safely (it's a legal prescription drug after all!). It's very important to let the public know that Autopilot requires full attention.
 
This was a tragic accident that hit close to home for me. [...] Like I said, a terrible tragedy. Sympathies to the victim and family.
You are right on the money.
 
(1) Autopilot (as opposed to TACC alone) should not have been engaged on this road. It is not a limited-access, exit/entry-ramp type of highway. There are numerous private drives and public and private streets intersecting the road, and frequent traffic signals. The M3 manual could not be clearer that this is not a road suitable for Autopilot.
This is true, but Tesla is a bit schizophrenic in that regard. About two years ago they explicitly enabled the use of Autopilot on local roads (before that it had been blocked by geo-fencing). They also recently enabled a red-light warning feature that works only when Autopilot is enabled but only makes sense on streets where you shouldn't be using Autopilot. So on one hand the manual says "don't do it", on the other hand it's "come and try our latest feature".
(2) We shouldn’t even be discussing Autopilot in connection with this crash. Autopilot keeps the car in lane. That’s it. It does not stop the car.
Well, Autopilot (or the TACC component, to be more precise) can actually stop the car when you're approaching stopped vehicles at an intersection. But apparently this only works under specific, limited circumstances which are not documented or predictable for the customer, and may change without warning in a future firmware update.
We should be talking about the performance of the automatic emergency braking (AEB) system.
Very true. Ideally the car should have slowed down or stopped whether Autopilot was active or not.
 
The paradox is that making Autopilot better may actually make it less safe.

What's crazy to me is that the three people killed in Autopilot-related crashes in the United States have all been engineers.

1. I think once Firetruck Super-Destruction [ironic FSD] mode is eliminated within the next year on HW3, the safety of AP/FSD will greatly increase even if the human supervisors become somewhat more careless. i.e. removing all the modes of *treacherous failure* will push it over a tipping point. However it looks as if AP on HW2.0 will remain as it now is permanently, though most owners will hopefully upgrade once the merits of FSD are amply demonstrated.

2. The statistics may be skewed by the reality that a lot of engineers tend to buy Teslas? Engineers per se are not given to overly trusting complex machines, as they know that materials/designs fail and bugs exist.
 
1. I think once Firetruck Super-Destruction [ironic FSD] mode is eliminated within the next year on HW3, the safety of AP/FSD will greatly increase even if the human supervisors become somewhat more careless. i.e. removing all the modes of *treacherous failure* will push it over a tipping point. However it looks as if AP on HW2.0 will remain as it now is permanently, though most owners will hopefully upgrade once the merits of FSD are amply demonstrated.
Maybe for interstates, but I'm much more concerned about "automatic driving on city streets," where the possibility of third-party casualties is much higher.
2. The statistics may be skewed by the reality that a lot of engineers tend to buy Teslas? Engineers per se are not given to overly trusting complex machines, as they know that materials/designs fail and bugs exist.
Engineers definitely tend to buy Teslas; over 10% of my coworkers have them. Still, I'm surprised and a bit disappointed.
 
This is true, but Tesla is a bit schizophrenic in that regard. About two years ago they explicitly enabled the use of Autopilot on local roads (before that it had been blocked by geo-fencing). They also recently enabled a red-light warning feature that works only when Autopilot is enabled but only makes sense on streets where you shouldn't be using Autopilot. So on one hand the manual says "don't do it", on the other hand it's "come and try our latest feature".
Well, Autopilot (or the TACC component, to be more precise) can actually stop the car when you're approaching stopped vehicles at an intersection. But apparently this only works under specific, limited circumstances which are not documented or predictable for the customer, and may change without warning in a future firmware update.
This is because Tesla insists on using us, its customers, as test engineers. Instead of releasing a product that is fully tested and capable, they release it as a beta and let us test it on the streets, in public. Yes, it allows them to gather info and tweak their product faster, but in my opinion it also increases the risk for us. Now, I am not a Tesla hater by any means. I love their product, but I'm just not so hip on how they have approached the testing and refinement of AP. I just feel it is too risky and people take the product for granted. You can't watch YouTube and not see a multitude of examples of this.
 
I bet he had his foot on the accelerator a little, which overrides cruise control and automatic braking. My experience in the Model 3 (including with crossing traffic in front of me) is that automatic braking is too sensitive, and it's very unlikely the car would not have slowed for crossing traffic unless he was pressing the accelerator.
 
I bet he had his foot on the accelerator a little, which overrides cruise control and automatic braking. My experience in the Model 3 (including with crossing traffic in front of me) is that automatic braking is too sensitive, and it's very unlikely the car would not have slowed for crossing traffic unless he was pressing the accelerator.
That would have been noted in the preliminary report for sure.
 
I bet he had his foot on the accelerator a little, which overrides cruise control and automatic braking. My experience in the Model 3 (including with crossing traffic in front of me) is that automatic braking is too sensitive, and it's very unlikely the car would not have slowed for crossing traffic unless he was pressing the accelerator.
What terrifies me the most is when people post things like this. Too many people think that if the software stops 1,000 times for a semi crossing in front of their path, that means there's a 100% chance it will stop the 1,001st time. @bwilson4web thinks that because he fell asleep 5 times while using Autopilot and survived, it's safe to do so a 6th time (and he's an engineer! :eek:).
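
A rough way to put numbers on that intuition, as a sketch rather than anything based on Tesla data: treat each crossing-semi encounter as an independent trial with some unknown per-event failure probability. A long run of successes only bounds that probability; it doesn't make it zero, and the risk keeps accumulating with exposure.

```python
# Hedged sketch, not Tesla data: each encounter is modelled as an independent
# trial with an unknown per-event failure probability p. The "rule of three"
# gives a rough 95% upper bound on p after n failure-free trials.
n_successes = 1000
p_upper = 3 / n_successes                  # ~0.003

def prob_at_least_one_failure(p: float, trials: int) -> float:
    """Probability of at least one failure across `trials` independent events."""
    return 1 - (1 - p) ** trials

print(f"95% upper bound on per-event failure rate: {p_upper:.3f}")
for trials in (1, 100, 1000):
    print(f"risk of at least one failure over {trials} more encounters: "
          f"{prob_at_least_one_failure(p_upper, trials):.1%}")
```

The exact numbers don't matter; the point is that 1,000 good outcomes are weak evidence about the 1,001st, and the exposure adds up.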
 
It took about a year after getting my car, and a few dead-stopped fire engine accidents on highways, for this amateur Tesla owner to figure out that this car will not stop, attempt to stop, or maybe even maneuver around stopped objects on highways at speeds higher than 35 mph, or whatever the actual threshold is; I am still trying to find that answer. I had gone a year not knowing this. I thought my car did everything. All the sheepish expressions when I asked Tesla employees questions are coming into view now in my memory.

This is of great concern, especially after reading about this accident versus a similar one a few years ago: in between, the AP system has been completely overhauled (different hardware, different software). So does it mean 3 years of Tesla R&D still cannot solve the same problem?
 
This is of great concern, especially after reading about this accident versus a similar one a few years ago: in between, the AP system has been completely overhauled (different hardware, different software). So does it mean 3 years of Tesla R&D still cannot solve the same problem?

It certainly looks like all R&D towards a solution has been invested in HW3/FSD hard- and software, which we have yet to see heavily tested in action. For HW2.x AP it seems essentially nothing has changed re. crossing semis in 3 years, yes, and this is likely to remain the position.
 
This is of great concern, especially after reading about this accident versus a similar one a few years ago: in between, the AP system has been completely overhauled (different hardware, different software). So does it mean 3 years of Tesla R&D still cannot solve the same problem?
In their defense, no one else has solved the problem either without resorting to expensive LIDAR sensors.
 
AP does not remove the responsibility for driving from the human.
No automaker has solved bad driving.
You are correct, a point I have made in various posts across this forum. Tesla does, however, make their customers and the public in general too trusting of a product that shouldn't be allowed on public roads. The general public is frankly too stupid to heed the warnings of the owner's manual and the manufacturer, and insists on testing the system to its limits for the sake of YouTube clicks, or inadvertently by just thinking the system is more capable than it is. Most people are not as educated on its limitations as the members of this forum, who of course represent a small fraction of Tesla owners.

Tesla is known for horrible communication with its customers and for not updating its owner's manual in a manner that truly reflects how the system operates. How can they, when the system changes constantly? Tesla needs to fix these glitches in their labs, on test roads, and with their engineers, not on public roads with our lives.

As I have said before, I am not anti-Tesla, but this crap frustrates me coming from a company I want to succeed and a product I intend to own in the near future. But you had better believe that I will not use AP till this kind of crap gets worked out. For the record, I do not believe this case to be the fault of the system, but I do think it is the fault of Tesla for not getting the limitations through the heads of its customers. There needs to be better monitoring in place for the car to recognize those who choose to abuse its options.
 