
Another tragic fatality with a semi in Florida. This time a Model 3

Doesn't matter if the airbags deploy if you are literally decapitated.

Well, you have two eyeballs until your skull gets crushed, so why the guy never braked is a real issue. I don't want to come off as rude or disrespectful to the family, but I bet this guy was going 90 mph on Autopilot and not paying attention to the road. Just a guess, based on the damage and loss of life to those involved. I don't expect my car to automatically stop at an intersection or when a GIANT TRUCK TRAILER is about to smash into my skull. I commute often and cross paths with big rigs every day. Usually I slow down and go behind them, or move into the passing lane and go around them. I use proven defensive driving techniques to avoid damage. It is unfortunate that this driver did not. The marketing doesn't say: please violate the speed limit (probable), engage Autopilot (probable), and don't prepare to take evasive action when a road hazard is present (obvious).

This truck is big enough to see from all the way down this huge stretch of flat land, so why he didn't slow down or brake is beyond me. The tragedy here, IMO, is that it was avoidable. If the driver was convinced that AP would autobrake for him, then he must not have been driving the car very long, or read the visual warnings when enabling Autopilot, or even skimmed through the manual. I find it hard to blame the truck driver in this instance unless it turns out he ran the stop sign or stalled at the intersection. But even if he did, he will never admit it.

The problem for me is that this stretch of road so obviously invites speeding that there should most certainly be a traffic light there. This is the type of road where you would be tempted to mash the accelerator.


I have intentionally kept my posts non-personal and non-emotional, and am trying to discuss the matter from a vehicle standpoint without going into gory details. As for the airbags, the question was raised as a functional aspect of the car's operation. It's something people have discussed in light of the roof and pillars coming off the car and what the expected "behavior" should be. A comparison of the two recent Tesla underride accidents seemed interesting from that standpoint. The more we learn about our cars, the better-informed drivers we will be.

Until we have more info on the speed and timing of the vehicles, granted, we are just speculating here. However, I still don't see how the truck driver, with a full view of a car with headlights on coming towards him on a major 4+ lane highway, could think he could pull out safely and cross the highway to the median before traffic reached him. Maybe he misjudged the highway speed, maybe he never looked again to his left when pulling out, and, as you said, maybe there were other factors. However, what car driver would pull out like that knowing they could get hit by oncoming traffic, be killed, and injure others? We have deadly accidents on Hwy 17 all the time from drivers who misjudge either traffic or their car and cross over lanes of traffic to turn onto the roadway on the other side. Their fault totally. There are no stop signs for traffic on the highway, but there are at the side streets. Just like at this junction on SR441.

I hope the thinking was not that, because he was in a huge vehicle and therefore personally safer, he simply didn't care if oncoming traffic couldn't brake in time. I've seen cars do stupid, dangerous things like passing trucks and cutting in front of them without a safety margin, but a truck pulling out into oncoming traffic is no less responsible, if a life is lost, than the car that pulled in front of a truck and caused an accident would be.
 
I hope the thinking was not that, because he was in a huge vehicle and therefore personally safer, he simply didn't care if oncoming traffic couldn't brake in time. I've seen cars do stupid, dangerous things like passing trucks and cutting in front of them without a safety margin, but a truck pulling out into oncoming traffic is no less responsible, if a life is lost, than the car that pulled in front of a truck and caused an accident would be.
This is something we can both agree on. And as someone said in a previous post, side guards on trailers are mandatory in Europe and save lives. This could be mandated here, and why it's not is something that, if our government weren't so dysfunctional, would be looked into. Also, yes, it's possible the truck driver was used to "getting his way" and tried to cut across the intersection. These are things the investigators have to figure out. This is why highway cams (it appears there are few or none in Florida) and other equipment such as dashcams are important in cases like these, as is taking precise photographs and measurements at the scene and recovering any possible data from both vehicles.
 
We have deadly accidents on Hwy 17 all the time from drivers who misjudge either traffic or their car

because he was in a huge vehicle and therefore personally safer, he simply didn't care if oncoming traffic couldn't brake in time.

I suspect a little of A, a little of B. Think about this: Truck driver has a lot of rush hour traffic to contend with. He has to deal with traffic in both directions. He wants to get a clear spot and get across the southbound lanes, then deal with the northbound lanes.

Let's say he looks left at southbound traffic, sees a big opening, says, ok, I'm going to go for it, get across these lanes, and starts rolling, knowing that he can stop mostly clear of traffic in the center median (he'd likely block the u-turn/left-turn lane). Then he looks right (at northbound traffic) as he starts, notices he's probably going to need to wait for an opening in northbound traffic, and commits to clearing the southbound lanes, but slows his roll a bit (otherwise he'd have to jam on his brakes to stop for northbound traffic). He figures he has "so much time" that he can still fully clear the southbound lanes. He may have misjudged the oncoming traffic, which is typically doing 50-60 mph but here may have been doing 80-90 mph (speculation). That greatly reduces the available time.
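As a rough back-of-the-envelope illustration in Python of how much that misjudgment matters (all numbers here are my own assumptions, not measurements from this crash):

MPH_TO_FPS = 5280 / 3600  # 1 mph is about 1.467 ft/s

def seconds_to_cover(gap_ft, speed_mph):
    # Time in seconds for a vehicle at constant speed to close a gap of gap_ft.
    return gap_ft / (speed_mph * MPH_TO_FPS)

gap_ft = 1200  # assumed sight-line gap when the truck starts rolling
for mph in (55, 70, 85):
    print(f"{mph} mph -> {seconds_to_cover(gap_ft, mph):.1f} s to close {gap_ft} ft")
# ~14.9 s at 55 mph vs ~9.6 s at 85 mph: roughly a third less time than expected.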

But he does not have time; the Tesla was going much faster than he thought. He may STILL not have been worried: he would be "used to" an oncoming southbound vehicle changing into the right lane and slowing to avoid the back of his truck as it cleared the intersection. He's not used to a vehicle proceeding at 80 mph, speed unchecked, without worrying about the large obstacle directly in its path.

Based on where the Tesla impacted, it does look unlikely that the truck had even cleared the right-hand southbound lane. Unless, perhaps, the Tesla detected the solid object of the rear wheels and veered (the wrong way) to avoid it? I'm not sure whether it will do this, although some winter Autopilot videos suggest that it might - or that could have been driver intervention; there is no way for me to know. It's not necessary for it to have this capability for the argument above to generally apply.

This is all speculative of course. We're dealing with an accident here - a lot of unusual factors occurred, so I don't think we should think about how people "normally" drive. It's not "normal" for people to EXPECT other traffic to slow for them - though it happens all the time, it's not good driving. So that may have been a factor. The Tesla driver might just have been sleepy or distracted and traveling too fast. He might have been using autopilot. It's possible that lighting conditions could have been a factor - maybe the light was very dim, actually (I know the stories give the approximate time of collision, but I don't know the exact light at that time in that place in Florida). Maybe the truck didn't have its lights on, and without side markers might have been slightly less obvious? One way or another the Tesla driver didn't see the truck, presumably (the truck didn't bolt across the street out of nowhere, most likely).
 
I'm aware that trucks, especially 18-wheelers loaded down with fresh produce and a long trailer, take a long time to get moving from a stop. Southbound SR441 at that intersection essentially had 4 lanes: a right-turn lane, two straight-through lanes, and a left-turn lane. And that is before you reach the roadway that sits between the opposing highway lanes of traffic. Looking at the overhead photos of the truck and the intersection, it seems to me that the truck's length, cab and trailer, would cover more ground than just the two straight lanes - possibly three or four; it's really hard to tell.

My parents' best friend owned a cab and hauled cross-country for many years. I'm aware that getting a truck like that up to speed requires working through a lot of gear changes. I drove a five-speed stick car for a number of years, and you have to go through the gears - no way around it without stripping something. We have no idea how experienced the driver was with this truck, or even how long he had held his commercial license. Long-time truckers pride themselves on their ability to shift gears efficiently, but that takes experience.

I wonder how long it will be before any preliminary information comes out.
 
And as someone said in a previous post side guards on trailers are mandatory in Europe and save lives. This could be mandated here and why its not I think is something that if our govt wasn't so dysfunctional would be looked into.

I’ve seen this stated quite a few times, but it contradicts the 2016 NTSB report from the original Model S underride accident. The report states that Euro spec underride guards are only designed for ~35mph and would not have been effective in that collision.
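For a sense of scale (my own arithmetic in Python, not the NTSB's): impact energy grows with the square of speed, so a guard designed around a ~35 mph impact faces several times that energy at highway speed.

def energy_ratio(v_actual_mph, v_design_mph=35):
    # Kinetic energy scales with v^2, so this is the multiple of the design-case energy.
    return (v_actual_mph / v_design_mph) ** 2

print(energy_ratio(70))  # 4.0 - a 70 mph impact carries ~4x the energy of a 35 mph one
print(energy_ratio(55))  # ~2.5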

Not sure what my point is, but felt it’s worth mentioning that the NTSB disagrees that underride guards are effective in this type of collision.

I suppose in the specific case of autopilot, the underride guards may increase the likelihood that the system takes evasive action.
 
I suspect a little of A, a little of B. Think about this: Truck driver has a lot of rush hour traffic to contend with. He has to deal with traffic in both directions. He wants to get a clear spot and get across the southbound lanes, then deal with the northbound lanes.

Let's say he looks left at southbound traffic, sees a big opening, says, ok, I'm going to go for it, get across these lanes, and starts rolling, knowing that he can stop mostly clear of traffic in the center median (he'd likely block the u-turn/left-turn lane). Then he looks right (at northbound traffic) as he starts, notices he's probably going to need to wait for an opening in northbound traffic, and commits to clearing the southbound lanes, but slows his roll a bit (otherwise he'd have to jam on his brakes to stop for northbound traffic). He figures he has "so much time" that he can still fully clear the southbound lanes. He may have misjudged the oncoming traffic, which is typically doing 50-60 mph but here may have been doing 80-90 mph (speculation). That greatly reduces the available time.

But he does not have time; the Tesla was going much faster than he thought. He may STILL not have been worried: he would be "used to" an oncoming southbound vehicle changing into the right lane and slowing to avoid the back of his truck as it cleared the intersection. He's not used to a vehicle proceeding at 80 mph, speed unchecked, without worrying about the large obstacle directly in its path.

Based on where the Tesla impacted, it does look unlikely that the truck had even cleared the right-hand southbound lane. Unless, perhaps, the Tesla detected the solid object of the rear wheels and veered (the wrong way) to avoid it? I'm not sure whether it will do this, although some winter Autopilot videos suggest that it might - or that could have been driver intervention; there is no way for me to know. It's not necessary for it to have this capability for the argument above to generally apply.

This is all speculative of course. We're dealing with an accident here - a lot of unusual factors occurred, so I don't think we should think about how people "normally" drive. It's not "normal" for people to EXPECT other traffic to slow for them - though it happens all the time, it's not good driving. So that may have been a factor. The Tesla driver might just have been sleepy or distracted and traveling too fast. He might have been using autopilot. It's possible that lighting conditions could have been a factor - maybe the light was very dim, actually (I know the stories give the approximate time of collision, but I don't know the exact light at that time in that place in Florida). Maybe the truck didn't have its lights on, and without side markers might have been slightly less obvious? One way or another the Tesla driver didn't see the truck, presumably (the truck didn't bolt across the street out of nowhere, most likely).
You're overthinking this. I've seen plenty of truck drivers with this attitude: "OK, there's an opening big enough for me to get started across the road. Yeah, there's some traffic a ways off, but I'm a big-ass truck, so they'll just have to stop and wait for me. F them, I can't sit here all day."
 
You're overthinking this. I've seen plenty of truck drivers with this attitude: "OK, there's an opening big enough for me to get started across the road. Yeah, there's some traffic a ways off, but I'm a big-ass truck, so they'll just have to stop and wait for me. F them, I can't sit here all day."

That's totally possible. In fact, it's kind of what I'm saying. There is a continuum of possibilities in terms of the exact timing and circumstances.

In any case, unless you have a death wish, the drivers of the oncoming vehicles have a responsibility to anticipate and to yield. Or just use the right hand turn lane to get around the truck like a normal human. I wonder if that's programmed into the imminent-release FSD yet?
 
I’ve seen this stated quite a few times, but it contradicts the 2016 NTSB report from the original Model S underride accident. The report states that Euro spec underride guards are only designed for ~35mph and would not have been effective in that collision.

Not sure what my point is, but felt it’s worth mentioning that the NTSB disagrees that underride guards are effective in this type of collision.

I suppose in the specific case of autopilot, the underride guards may increase the likelihood that the system takes evasive action.

Yep, them being there might make the truck more visible, such that AEB would trigger. That could take the ~70 mph impact down to ~45 mph, so it might make a big difference.
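Hypothetical numbers in Python, just to show why even partial braking matters (the deceleration and distance below are assumptions, not figures from this crash):

MPH_TO_FPS = 5280 / 3600
G = 32.2  # ft/s^2

def impact_speed_mph(v0_mph, braking_distance_ft, decel_g=0.7):
    # Speed remaining after braking at decel_g over braking_distance_ft (v^2 = v0^2 - 2*a*d).
    v0 = v0_mph * MPH_TO_FPS
    v_sq = v0 ** 2 - 2 * decel_g * G * braking_distance_ft
    return max(v_sq, 0.0) ** 0.5 / MPH_TO_FPS

print(impact_speed_mph(70, 140))  # ~44 mph if hard braking starts ~140 ft before impact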
 
I have new data on this matter. Does not look very good.

backstory: a semi ran a red light in front of me and the car thought it could drive under it!
[Attached image: trailer.png]


25 seconds of video from 5 cams: video-360-full - Streamable

4K video download from 5 cams: Box
 
I have new data on this matter. Does not look very good.

backstory: a semi ran a red light in front of me and the car thought it could drive under it!
View attachment 386133

25 seconds of video from 5 cams: video-360-full - Streamable

4K video download from 5 cams: Box

Interesting, but I'm not familiar with the meaning of the drivable yellow line. I ask because, before the truck arrives, it clearly shows you could also drive straight across the road into the post, even though there are no obstacles to confuse it, and that would be outside of the drivable surface. So does it REALLY think you can drive under the truck? To be clear, I would not be surprised if it did; just wondering.

As a comparison, how does the picture look as you go under a low overpass?

EDIT: I agree that, looking at the drivable surface, it seems like some of the area under the truck is still considered fine for the Model 3. It might just fit under there.
 
I have new data on this matter. Does not look very good.

backstory: a semi ran a red light in front of me and the car thought it could drive under it!
View attachment 386133

25 seconds of video from 5 cams: video-360-full - Streamable

4K video download from 5 cams: Box

Clearly the neural network detected it, so wouldn't it brake for it? Plus, the radar data is showing it as stationary, so it definitely knows it's there.

I should probably watch more of your videos to see what the yellow ribbon does when cross traffic gets in the way.
 
I have new data on this matter. Does not look very good.

backstory: a semi ran a red light in front of me and the car thought it could drive under it!
View attachment 386133

25 seconds of video from 5 cams: video-360-full - Streamable

4K video download from 5 cams: Box


I wonder if, because the trailer is mostly white, it blends with the sky in the background and so gets ignored? This accident and the 2016 one both had white trailers. Would the same be true if the trailer were red, for example?
 
Some clarifications:
The yellow ribbon: this is path-planner output - where the car would drive on Autosteer unless changing lanes. The green shadow: the space the car considers drivable.
Also, this is a point-in-time snapshot - who knows how things would have changed if I had chosen to approach that trailer (esp. at high speed).

What this shows is that, at some point on approach, the car considers the idea of driving under that trailer reasonable. It does not prove this is what happened in that other case (hopefully an AP snapshot was created to show what happened), but it recreates some of the preconditions, which makes this worth considering.
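To make the vocabulary concrete, here is a toy Python model of the overlay elements being described - NOT Tesla's actual data structures, just an illustration of the three pieces (path ribbon, drivable area, detected objects with the "in the way" asterisk):

from dataclasses import dataclass, field

@dataclass
class DetectedObject:
    obj_id: int
    kind: str         # e.g. "truck"
    in_the_way: bool  # shown as an asterisk after the id in the overlay

@dataclass
class Frame:
    planned_path: list   # the yellow ribbon: points the planner intends to follow
    drivable_area: set   # the green shadow: ground cells considered drivable
    objects: list = field(default_factory=list)

# A frame like the screenshot: the trailer is detected as a truck, but it is not
# flagged as "in the way". In this toy, (0, 40) is a ground cell under the trailer,
# yet it is still in drivable_area and on the planned path.
frame = Frame(
    planned_path=[(0, 0), (0, 20), (0, 40)],
    drivable_area={(0, 0), (0, 20), (0, 40)},
    objects=[DetectedObject(obj_id=42, kind="truck", in_the_way=False)],
)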
 
Some clarifications:
The yellow ribbon: this is path-planner output - where the car would drive on Autosteer unless changing lanes. The green shadow: the space the car considers drivable.
Also, this is a point-in-time snapshot - who knows how things would have changed if I had chosen to approach that trailer (esp. at high speed).

What this shows is that, at some point on approach, the car considers the idea of driving under that trailer reasonable. It does not prove this is what happened in that other case (hopefully an AP snapshot was created to show what happened), but it recreates some of the preconditions, which makes this worth considering.

I don't think this tells us anything because it's just a path planner.

It probably doesn't take into account vehicles in the way. Sure, normally you wouldn't see the path planner extend under a vehicle, but that's just a drawing issue: you can't draw under something that's low.

On lifted pickups and semis like this, I would expect to see the path planner extend under the vehicle, unless the planner specifically took vehicles in the way into account. Ideally the path planner would change color, or have an end point, to show that the path is blocked, so you could tell regardless of whether it was a low or a high vehicle.
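That "end point" could be as simple as clipping the planned path at the first point that falls inside any detected object's footprint. A Python sketch of the idea only - the geometry and boxes below are invented:

def point_in_box(pt, box):
    # box = (xmin, ymin, xmax, ymax) in the same ground-plane coordinates as pt
    x, y = pt
    xmin, ymin, xmax, ymax = box
    return xmin <= x <= xmax and ymin <= y <= ymax

def clip_path_at_obstacles(path, boxes):
    # Keep the prefix of the path up to (not including) the first blocked point.
    clipped = []
    for pt in path:
        if any(point_in_box(pt, box) for box in boxes):
            break
        clipped.append(pt)
    return clipped

path = [(0, d) for d in range(0, 200, 10)]          # straight ahead, 10 m steps
trailer_box = (-15, 60, 15, 64)                     # trailer crossing ~60 m ahead
print(clip_path_at_obstacles(path, [trailer_box]))  # ribbon ends at (0, 50)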

I'll have to look through your videos to find one with a lifted 4x4 that crosses in front at a stop light.

I'm afraid your one from Paris won't work for this because I don't think lifted 4x4's are all that popular in Paris.
 
It probably doesn't take into account vehicles in the way
This truck is never marked as "in the way" at that intersection - the vehicles that are in the way have an asterisk after their id (see the same truck at the 0:09 mark in the video, when it enters my lane).

Also, it's not just the path planner - note that the area is marked as drivable (it's even worse on the wide and narrow camera views, where the semi is at times detected as driving away from us: narrow2 - Streamable).

Anyway, it's a single inconclusive data point. People were asking for it. I hope Tesla comes up with a good explanation eventually.
 
I wonder if, because the trailer is mostly white, it blends with the sky in the background and so gets ignored? This accident and the 2016 one both had white trailers. Would the same be true if the trailer were red, for example?

But it is not ignored! The bounding box has the trailer correctly identified as a stationary truck; nevertheless, the path-planning algorithm decides that is a legitimate place to drive, apparently because the semantic space identifier partially paints the area underneath it as green drivable surface.

This is a logical conflict (or Superbug) as the car should simply never be permitted to drive through a bounding box.


PS: Thanks for posting this important evidence @verygreen

3. It would be interesting to see the output of a @verygreen video simulating this scenario, as I cannot imagine the vision system would have been painting the road under the trailer as a green drivable space, at least not for more than a few meters beyond the impact point.

4. If that is the case, it raises the question of why AP would be programmed to allow the car to continue driving at full tilt into an undefined space, independent of the readings [or rather lack thereof] from the radar.

5. OTOH, if it was able to paint a drivable path for another 100m under the truck, such that no braking was required at that point, the irony is that perhaps even a soft plastic side-skirt on the trailer, sufficient to obstruct the view through it, may have been enough for AP to brake the M3 safely to a full stop.
 
This truck is never marked as "in the way" at that intersection - the vehicles that are in the way have an asterisk after their id (see the same truck at the 0:09 mark in the video, when it enters my lane).

Also, it's not just the path planner - note that the area is marked as drivable (it's even worse on the wide and narrow camera views, where the semi is at times detected as driving away from us: narrow2 - Streamable).

Anyway, it's a single inconclusive data point. People were asking for it. I hope Tesla comes up with a good explanation eventually.

Thanks for the info on the asterisk.

The asterisk is clearly shown when you make the turn, and the truck is in front of you. When you opted for the left lane the asterisk went away.

I'm not sure how the AP computer actually uses the asterisk. The asterisk comes on and off at different times depending on the camera (whether it's the main, narrow, or fisheye). It's interesting that the fisheye camera indicates the truck as being in the way as it crosses in front of you. I think that's indicative of Tesla's overall problem in labeling semis: a lot of the time it will try to label them as two separate trucks.

The neural network definitely gets the area under the trailer wrong. Normally it shows a boundary line at vehicles; here it doesn't, which is why the area under the trailer is shown as green.
 
This is a logical conflict (or Superbug) as the car should simply never be permitted to drive through a bounding box.

Or, better stated, it should never be permitted to even think about driving through a bounding box.

Thus nothing under a bounding box should ever be painted green as a driveable space for the path-planner to consider choosing as a route.
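In code terms, the fix being argued for is roughly "subtract the ground under every bounding box from the drivable area before the planner ever sees it." A purely illustrative Python sketch with an invented grid (nothing here is Tesla's implementation):

def remove_cells_under_boxes(drivable_cells, boxes):
    # drivable_cells: set of (x, y) ground cells; boxes: (xmin, ymin, xmax, ymax) footprints
    def blocked(cell):
        x, y = cell
        return any(xmin <= x <= xmax and ymin <= y <= ymax
                   for (xmin, ymin, xmax, ymax) in boxes)
    return {cell for cell in drivable_cells if not blocked(cell)}

drivable = {(0, y) for y in range(0, 100, 5)}   # cells straight ahead of the car
trailer = [(-10, 60, 10, 63)]                   # trailer crossing ~60 m out
print(sorted(remove_cells_under_boxes(drivable, trailer)))
# the cell at y=60 is gone, so nothing under the trailer is painted green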

With this evidence it now seems more plausible that, in Banner's case, the system wrongly concluded "keep going, no brakes required": the lane lines would have been tracking (predicted) straight ahead under the perpendicular trailer; a considerable distance underneath it may have been painted green as drivable space in the crucial timeframe; the radar returned no reading because the obstacle was essentially stationary; and the bounding box from the vision system (presuming it was recognised) was not considered a sufficient impediment.

This certainly seems like negligent system design/programming to me and IMHO would support a suit by his dependents against Tesla for their contribution to his death.
 