
Tesla 3 crashes into overturned truck on highway

If you analyze the video frame by frame, it is obvious the Model 3 started slowing down after it passed the truck driver. Perhaps regen braking. The distance traveled between frames grows smaller and smaller after it passes the driver. It is also obvious to me for another reason: if the Model 3 hadn't slowed down, the damage would have been much worse.

Not seeing it. If you're looking at the rear view, it's probably an illusion caused by the curvature of the road. In the front view, it looks to me like the same number of frames pass as the shadow of the Model 3 crosses each white road line.

As for the severity of damage... it probably helps that it looks like it hit a truck full of containers of some kind of fluid. But look how far it pushes a full truck after impact.
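
For anyone who wants to go beyond eyeballing it, counting how many frames the Model 3's shadow takes to travel from one lane stripe to the next gives a rough speed estimate. Here is a minimal Python sketch; the frame rate and stripe spacing are assumed values for illustration, not measurements from the clip:

# Rough speed estimate from "frames per lane stripe" counted in the video.
FRAME_RATE_FPS = 30.0      # assumed frame rate of the uploaded clip
STRIPE_SPACING_M = 10.0    # assumed distance between lane-stripe start points

def speed_kmh(frames_per_stripe: float) -> float:
    # Speed implied by covering one stripe spacing in the given number of frames.
    seconds = frames_per_stripe / FRAME_RATE_FPS
    return STRIPE_SPACING_M / seconds * 3.6

print(speed_kmh(10))  # 10 frames per stripe -> 108 km/h
print(speed_kmh(12))  # 12 frames per stripe -> 90 km/h

If the frames-per-stripe count really does grow after the car passes the truck driver, the estimate falls accordingly; if it stays flat, the car wasn't slowing.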
 
I would recommend looking at this review and analysis of the above-mentioned incident:



DRIVER COMPLETELY UNINJURED; WALKS AWAY UNHARMED
However, they don’t note the most important thing, which is that the Tesla driver is reportedly uninjured.

The local media reported (translated):

It can be seen from the footage that the impact force was so great that even the truck shook. It is understood that the Tesla driver was unharmed. He told the police that the driver-assistance system was turned on at the time and that the car was not in a self-driving state. There was no drunk driving involved; the relevant statements have been completed, and the two parties now have to deal with the subsequent compensation matters.

Electrek
If this is “the most important thing”, why wasn’t that the headline? “Tesla driver in Taiwan walks away unharmed after running into overturned truck”. That’s pretty amazing! I guess a positive story about Tesla safety isn’t as salacious as one where Autopilot takes the blame for what happened.
 
I would recommend looking at this review and analysis of the above-mentioned incident:


The white smoke is not from braking. If you look at the 25-second mark (of the video posted on page 2 of this thread), you can see the car behind the Tesla actually kicks up the white "substance" on the road a little as well. The second car seems to have been paying attention and not driving full speed towards the overturned truck, so there isn't as much.
 
Driver stated that Autopilot was off and TACC was on.

Anyone who owns a Tesla may have noticed one key fact about TACC: *if* the driver has their foot even slightly on the accelerator pedal, the dash shows a "cruise will not brake" message.

If Autopilot had been on, the car would definitely have detected the lorry driver standing in the traffic lane and braked or tried to avoid him. It did not budge an inch or make any attempt to brake at that point.

The puff of 'something' came from a patch of something on the road, because both the Tesla and another car produced the same kick-up at the same point. It might have been a patch of something from the upturned lorry, or even whatever caused the lorry to overturn in the first place (I'd like to see more video of how on earth the lorry ended up on its side, as that could be useful).

I therefore suspect the driver was simply not alert, doing something else whilst cruising along with his foot slightly on the accelerator, and so made no attempt to avoid the lorry driver or to brake until just before impact. The airbags didn't go off because of the very unusual nature of the accident: the roof of the lorry absorbed some of the impact energy and didn't provide the sudden, sharp deceleration needed to trigger them.



Busted!

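For readers who haven't seen that behaviour, the interlock being described works roughly like the sketch below. This is not Tesla's actual code, just an illustration of what owners report, and the names and threshold are made up:

def cruise_braking_allowed(cruise_engaged: bool, accel_pedal_pct: float) -> bool:
    # While the driver presses the accelerator, TACC/Autopilot will not apply
    # the brakes and the cluster shows "cruise will not brake".
    PEDAL_OVERRIDE_PCT = 0.0  # assumed: any pedal input at all disables braking
    if not cruise_engaged:
        return False
    return accel_pedal_pct <= PEDAL_OVERRIDE_PCT

print(cruise_braking_allowed(True, 3.0))  # False: foot resting on the pedal
print(cruise_braking_allowed(True, 0.0))  # True: foot off, cruise may brake

If the driver's foot really was resting on the pedal, automatic braking was effectively switched off the whole way to the impact.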
 
...I would recommend looking at this review and analysis of the above-mentioned incident...

I would NOT recommend that article.

It advises looking at the car's logs, yet the article is then filled with speculation without ever reviewing those logs.

One of the worst examples is the claim that AEB was activated "5 seconds before, 4 seconds before, and immediately before impact", based on nothing more than the white smoke.

In this thread, we have established that the smoke is not from braking, because we also see a big white plume at the collision and white material spilled on the road at that moment.

The following car also created a smaller plume of white smoke, and in the rear-view video clips there are no brake lights on either the Tesla or the following car, even though you can see the following car's right turn signal while the white smoke appears:

[image: bqST7xl.jpg]
 
Driver stated that Autopilot was off and TACC was on.

Anyone who owns a Tesla may have noticed one key fact about TACC: *if* the driver has their foot even slightly on the accelerator pedal, the dash shows a "cruise will not brake" message.

If Autopilot had been on, the car would definitely have detected the lorry driver standing in the traffic lane and braked or tried to avoid him. It did not budge an inch or make any attempt to brake at that point.

I therefore suspect the driver was simply not alert, doing something else whilst cruising along with his foot slightly on the accelerator, and so made no attempt to avoid the lorry driver or to brake until just before impact. The airbags didn't go off because of the very unusual nature of the accident: the roof of the lorry absorbed some of the impact energy and didn't provide the sudden, sharp deceleration needed to trigger them.



Busted!


It won't brake with your foot on the accelerator during Autopilot either. I use the accelerator a lot since the speed limit data is outdated or wrong, and there are some areas here in Europe where it slows way down on curved roads. TACC uses the same vision stack and will phantom brake and so on, same as Autopilot. Think of TACC as Autopilot minus Autosteer.

So there are two possible causes for this:
a) the driver did indeed have his foot on the accelerator
b) the AP algorithms failed to detect the truck, since an overturned truck is definitely an edge case.
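
Those two causes would be easy to tell apart from the vehicle log. Here is a sketch of the check, with hypothetical field names since the real log format isn't public:

def likely_cause(log_sample: dict) -> str:
    # Hypothetical fields: accelerator position (%) and whether the perception
    # stack ever registered the overturned truck as a lead obstacle.
    if log_sample["accel_pedal_pct"] > 0:
        return "a) driver override: pedal pressed, so TACC/AP would not brake"
    if not log_sample["lead_object_detected"]:
        return "b) perception miss: truck never registered as an obstacle"
    return "object detected but no braking: would point at planning/AEB logic"

print(likely_cause({"accel_pedal_pct": 4.0, "lead_object_detected": False}))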
 
I would recommend looking at this review and analysis of the above-mentioned incident:



DRIVER COMPLETELY UNINJURED; WALKS AWAY UNHARMED
However, they don’t note the most important thing, which is that the Tesla driver is reportedly uninjured.

The local media reported (translated):

It can be seen from the footage that the impact force was so great that even the truck shook. It is understood that the Tesla driver was unharmed. He told the police that the driver-assistance system was turned on at the time and that the car was not in a self-driving state. There was no drunk driving involved; the relevant statements have been completed, and the two parties now have to deal with the subsequent compensation matters.

Electrek
If this is “the most important thing”, why wasn’t that the headline? “Tesla driver in Taiwan walks away unharmed after running into overturned truck”. That’s pretty amazing! I guess a positive story about Tesla safety isn’t as salacious as one where Autopilot takes the blame for what happened.

If that were tire smoke from emergency braking, why did the car not pitch forward as load transferred onto the front wheels? Review the video: the attitude of the Model 3 in the longitudinal plane remains absolutely stable as it passes over the patch producing the spray, which is impossible under hard braking with a sprung suspension; ergo, there was no significant braking at this point.
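
The pitch argument can be put in rough numbers: under braking, load shifts to the front axle by about m·a·h/L, which is what compresses the front springs and makes the nose visibly dip. A quick sketch with ballpark Model 3 figures and an assumed hard-braking deceleration:

M_KG = 1850.0        # approximate Model 3 mass (assumed ballpark)
CG_HEIGHT_M = 0.46   # approximate centre-of-gravity height (assumed ballpark)
WHEELBASE_M = 2.875  # Model 3 wheelbase
G = 9.81

def front_load_shift_n(decel_g: float) -> float:
    # Longitudinal load transfer onto the front axle while decelerating.
    return M_KG * decel_g * G * CG_HEIGHT_M / WHEELBASE_M

print(round(front_load_shift_n(0.8)))  # ~2300 N (~235 kgf) extra on the nose at 0.8 g

That much extra load on the front springs produces an obvious nose dip, which is exactly what the video doesn't show.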


If Autopilot had been on, the car would definitely have detected the lorry driver standing in the traffic lane and braked or tried to avoid him.

That is very far from definite: in my experience, at higher speeds pedestrians and bicycles have only about a 60% chance of showing up on the instrument cluster (IC).

And AFAIK no public release of Tesla AP yet steers around human obstacles.
 
I think we can agree overturned trucks are an edge case that any FSD vehicle should be able to avoid. I've never come across one in my decades of driving, or at least not one that wasn't already blocked off by cones or police. That said, at least for freeway driving, how many 9s are we at? Tesla claims 1 accident per 4.68M miles in Q1, which is biased toward relatively safe driving conditions. How would one convert that into a percent-safe figure? If you count that 1 accident as an entire mile of failure, then 1 - 1/4,680,000 works out to 99.99998% of miles without an accident. So does that mean six 9s isn't enough?
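
Working that arithmetic through, taking the Q1 figure at face value and counting each accident as one failed mile:

import math

ACCIDENTS = 1
MILES = 4_680_000  # Tesla's reported Q1 rate: one accident per 4.68M Autopilot miles

p_fail = ACCIDENTS / MILES        # ~2.1e-7 failures per mile
p_clean = 1 - p_fail
print(f"{p_clean:.7%}")           # 99.9999786% of miles without an accident
nines = -math.log10(p_fail)       # "number of nines" of per-mile reliability
print(round(nines, 2))            # ~6.67

So per-mile this is roughly six and a half nines, under driving conditions that already skew safe; the open question is how many nines a door-to-door robotaxi would actually need.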
 
This safety discussion has me wondering, what level of safety do we as consumers demand, and at what cost? Drivers routinely drive faster than the speed limit, even though it's demonstrably less safe. Yet we do the calculation, and decide it's worth the risk.

Testing out "FSD" stop light detection (with AP, not just TACC) has my car going just 5 mph over the "detected" speed limit on a 4 mile stretch to the freeway from my house. The detected speed limit is 5 mph under the actual limit. Most drivers driver 10 mph over the speed limit, so it's a bit frustrating be driving so slowly down the street, even though it's the posted, and (presumably) safer speed limit.

I can see tolerating a slow driving communal robotaxi you don't own. But what about being in a personally owned self-driving car? Do you want it to be more aggressive? Does the driver take responsibility for overriding the defaults? Would different car makers have different driving styles, like BMW could have theirs cut people off and never use the blinkers (j/k).

Maybe a compromise would be, if there's a "driver" in the driver seat, they can tell the car to drive faster and be more aggressive (Mad Max Mode), but have to take on liability. Would that still be a Level 4 / 5 car?
 
...Would that still be a Level 4 / 5 car?

The idea of a robotaxi is that the owner can sleep at home while a Tesla does all the work that current Uber/Lyft drivers are doing.

There will be no drivers for robotaxi.

It will recognize and brake for stationary objects and will avoid this kind of accident easily.

But that's an idea to be proven in the future.

Right now, the manual clearly states that:

"Warning: Navigate on Autopilot may not recognize or detect oncoming vehicles, stationary objects"...
 
The idea of a robotaxi is that the owner can sleep at home while a Tesla does all the work that current Uber/Lyft drivers are doing.

There will be no drivers for robotaxi.

True, but I wasn't talking about a robotaxi. I was talking about using my own car to self-drive me somewhere I want to go. If I'm instructing the car to break the law, wouldn't I be liable for, say, speeding? Though, point taken, it's still Level 4 / 5 if it doesn't require me to do anything.
 
True, but I wasn't talking about a robotaxi. I was talking about using my own car to self-drive me somewhere I want to go. If I'm instructing the car to break the law, wouldn't I be liable for, say, speeding? Though, point taken, it's still Level 4 / 5 if it doesn't require me to do anything.

Yes. Current FSD is still beta.

Right now, FSD is Level 2, which still requires a licensed driver. So the driver is still responsible.

Level 3 was touted by Audi in 2017: the car would take over most of the monitoring, so its drivers wouldn't have to watch for an overturned truck on the road the way Tesla drivers do. Audi also planned to take on the liability if its cars hit an overturned truck. However, it has since backed out and keeps the system at Level 2, leaving drivers liable.

Audi gives up on Level 3 autonomous driver-assist system in A8

Level 4 still requires human controls (steering wheel, foot pedals...), so my guess is that if there's a human driver in an L4 car, the human is still responsible for using those controls to override the L4 system's mistakes.

Level 5 does not need human controls (steering wheel, foot pedals...), so there's no driver's seat in the car, thus no driver, and thus no way to give a human driver a ticket for failing to override the system with controls that don't exist. The assumption is that in an L5 Tesla, Tesla itself would get the ticket, because there is no driver, only a Tesla algorithm that got the car into a ticketable situation.

Tesla did not commit to taking on liability for robotaxi mishaps during the Autonomy Day presentation, but it did say "maybe".
 
I know that you knew that car was under the bridge, where there's more contrast. Therefore, your response was disingenuous. Thanks.

Now, after that same car has passed the bridge and is no longer in shadow, you can still see at least its right brake light. Its left brake light is washed out, overexposed in the sun. For the other cars and trucks, you can still see their brake lights whether they are in the shadow or not:

[image: ynECw21.jpg]



But all the talk about white smoke, brake lights, AEB, Autopilot, FSD, and the lack of any deaths in this collision still comes down to one thing: drivers just can't rely on AEB, Autopilot, or FSD to avoid a collision with a stationary object.

At this beta level, yes, it can still collide.

Don't use the smoke, the brake lights, the automation, the lack of deaths, or anything else as justification.

So please do the right thing: drivers should monitor the environment.

Once the automation reaches another level, 3 and above, drivers won't have to monitor the environment as much. But that's the future, not now.
 
Level 5 does not need human controls (steering wheel, foot pedals...), so there's no driver's seat in the car, thus no driver, and thus no way to give a human driver a ticket for failing to override the system with controls that don't exist. The assumption is that in an L5 Tesla, Tesla itself would get the ticket, because there is no driver, only a Tesla algorithm that got the car into a ticketable situation.

My hypothetical question is this: let's say, sometime in the future, I'm in my own car, in the driver's seat, with fully autonomous driving on a freeway with a posted speed limit of 65 mph. If I roll the jog wheel up to 80 mph, and the Level 5 car dutifully complies and accelerates to 80 mph, would I be liable for a speeding ticket? It seems a bit unfair to put Tesla on the hook for my decision to have the car speed.


To make it interesting, let's go down the slippery slope.

Hypothetical scenario B. You are a passenger in a Tesla robotaxi, in the back seat. To be a safer and more realistic driver, the robotaxi decides to drive at 70 mph to keep up with the flow of traffic. A CHP officer decides to pull over the car. Clearly, as a passenger, you would not be liable for a ticket; I suppose the car itself would get the ticket, and Tesla would have to pay for it.

Hypothetical scenario C. You are a passenger in your own privately owned Tesla, in the back seat. The Tesla decides to drive at 70 mph to keep up with the flow of traffic. A CHP officer decides to pull over the car. Again, as a passenger, you would not be liable for a ticket; I suppose the car would get the ticket and Tesla would have to pay, since the Tesla system was driving even though you own the car.

Hypothetical scenario D. You are a passenger in your own privately owned Tesla, in the driver's seat. The Tesla decides to drive at 70 mph to keep up with the flow of traffic. A CHP officer decides to pull over the car. You are in the driver's seat, but you make the case to the officer that you are a passenger and should not be liable for a ticket. How the hell would that work? Hopefully you could play back the cabin camera (assuming you have a Model 3 or Model Y) to show you weren't driving.


Note: Getting pulled over and ticketed for going 5 mph over the speed limit is unlikely, but it does happen (it happened to me). Maybe the cop doesn't like Teslas, or it's the end of the month and they haven't written as many tickets as usual, or you got pulled over anyway for not having a front license plate.
 
The white smoke is not from braking. If you look at the 25-second mark (of the video posted on page 2 of this thread), you can see the car behind the Tesla actually kicks up the white "substance" on the road a little as well. The second car seems to have been paying attention and not driving full speed towards the overturned truck, so there isn't as much.

The white smoke could be both: the Tesla passing through some dust on the road as well as braking, which would explain why the dust cloud is larger there than at other places.