
Autopilot nearly allows side collision on highway

And since the guy who got killed did go under the semi, there clearly was some kind of deficiency that needs to be corrected for

The deficiency was the driver using AP in a situation the manual explicitly states it's not intended to be used in (roads that have cross-traffic and are not limited-access highways).

Nothing to do with the car though.

Was there a technical failure with the Tesla?

Nope. AP is explicitly not intended to deal with cross-traffic.

As anybody who reads the manual would know.

The accident was 100% user/driver error.

Same as the previous drive-under death in 2016 was (and the NHTSA report on that accident makes that clear)
 
While driving to work on the highway with my Base AP, there have been a few times where I was about to pass a bus or trailer truck that was riding right on the edge of its lane, and AP did nothing to give it some room; it stayed centered in its own lane.

Tesla needs to have a set of sensors at the top edge of the door, where it meets the roof, to have a higher line of sensor detection, since there seems to be a problem with detecting big truck trailers that sit up off the road.

Hey Tesla, hire me to be an outside technical consultant for a million bucks a year and stock options. I'll tell you everything you're doing wrong.
 
The deficiency was the driver using AP in a situation the manual explicitly states it's not intended to be used in (roads that have cross-traffic and are not limited-access highways).

Nothing to do with the car though.



Nope. AP is explicitly not intended to deal with cross-traffic.

As anybody who reads the manual would know.

The accident was 100% user/driver error.

Same as the previous drive-under death in 2016 was (and the NHTSA report on that accident makes that clear)

I do agree that this is fully on the driver, no argument there whatsoever. My point is that, from a discussion standpoint, digging into the technicalities of what happened, it doesn't matter what the manual says. If Tesla is trying to get to Level 5 autonomy then this has to be addressed, and you can't just say "well, the driver didn't read the manual" and "the driver is 100% responsible". I agree with those statements, but that doesn't get down to how Tesla can work to prevent this from happening in the future for Level 4/5 autonomy.

From a VERY basic standpoint, this accident happened because the car did not stop. No matter what, and as the manual you repeatedly quote states, the driver is totally responsible... BUT... there are MULTIPLE systems that COULD make the vehicle stop or slow down: the driver pressing the brake, AEB, TACC, any other collision-avoidance code we don't know about. Now, did any of these systems fail (for whatever reason)? Did they see the truck but think it was a bridge? Did they see the truck, accurately identify it, but not extend the bounding box all the way down to the road surface, thereby thinking there was a path underneath? That we don't know.
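To make the bounding-box question concrete, here's a toy sketch of how a clearance check could treat the space under a mis-boxed trailer as drivable. Every name and number here is invented for illustration; this is not Tesla's actual code:

```python
# Toy illustration only -- invented names/thresholds, not Tesla's actual stack.
# Shows how a clearance check under a detected obstacle could go wrong if the
# bounding box doesn't extend down to the road surface.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str               # what the classifier thinks it sees
    bottom_height_m: float   # estimated height of the object's lower edge above the road

VEHICLE_CLEARANCE_M = 1.5    # hypothetical "we can pass under this" threshold

def is_drivable_underneath(d: Detection) -> bool:
    # If the box's lower edge sits well above the car's roofline, the planner
    # may treat the space underneath as free -- correct for an overpass,
    # fatal for a semi trailer whose box stops at the trailer floor.
    return d.bottom_height_m > VEHICLE_CLEARANCE_M

overpass = Detection("overpass", bottom_height_m=4.5)
trailer_bad_box = Detection("trailer", bottom_height_m=1.6)  # box never reached the road

print(is_drivable_underneath(overpass))         # True -- correct
print(is_drivable_underneath(trailer_bad_box))  # True -- the failure mode in question
```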

While I agree with your frustration about people blaming the system, it is relevant to getting to Level 4/5 autonomy, so get over it and stop repeating RTFM.

They put disclaimers on plastic bags; is adding some holes to bags that can support them not a reasonable thing to do as well? It is reasonable to ask how the car could have done more in the situation and to attempt to mitigate the risk as reasonably as can be done from a business/money point of view.
 
I'm sorry, I cannot get over it; I knew the guy who got killed. I drive a Tesla, and I see a huge, very large elephant in the room, and it unfortunately is coming mostly from Model 3 owners who bought FSD and think it's all a go here. I don't think it's a go for FSD no-hands, no-eyes driving.

I do agree that this is fully on the driver, no argument there whatsoever. My point is that, from a discussion standpoint, digging into the technicalities of what happened, it doesn't matter what the manual says. If Tesla is trying to get to Level 5 autonomy then this has to be addressed, and you can't just say "well, the driver didn't read the manual" and "the driver is 100% responsible". I agree with those statements, but that doesn't get down to how Tesla can work to prevent this from happening in the future for Level 4/5 autonomy.

The neural net is not complete, and neither, dare I say, is the software.

From a VERY basic standpoint, this accident happened because the car did not stop. No matter what, and as the manual you repeatedly quote states, the driver is totally responsible... BUT... there are MULTIPLE systems that COULD make the vehicle stop or slow down: the driver pressing the brake, AEB, TACC, any other collision-avoidance code we don't know about. Now, did any of these systems fail (for whatever reason)? Did they see the truck but think it was a bridge? Did they see the truck, accurately identify it, but not extend the bounding box all the way down to the road surface, thereby thinking there was a path underneath? That we don't know.

The car won't stop in its current configuration because this truck was perpendicular to the vehicle and the vehicle does not know what it is seeing yet. It's not currently programmed to stop for non-moving or slow-moving objects. Frustrating, isn't it?

This truck was just like the fire engines that have been hit over the last few years. Currently these cars do not stop for immovable or barely moving objects. IF they did, they would never drive.
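A rough sketch of why that is: radar-based cruise systems commonly filter out returns whose estimated ground speed is near zero, since at highway speed nearly everything stationary (signs, bridges, guardrails) would otherwise trigger braking. All names and thresholds below are invented for illustration, not Tesla's implementation:

```python
# Illustration only: why a radar-fed system might ignore a stopped object.

def ground_speed(ego_speed_mps: float, relative_speed_mps: float) -> float:
    # Radar measures closing speed; add ego speed to estimate the target's own speed.
    return ego_speed_mps + relative_speed_mps

def should_consider_braking(ego_speed_mps: float, relative_speed_mps: float) -> bool:
    target_speed = ground_speed(ego_speed_mps, relative_speed_mps)
    # Filter out "stationary clutter" -- overpasses, signs, parked cars...
    # and, as a side effect, a trailer sitting across the road.
    return abs(target_speed) > 2.0   # m/s; hypothetical clutter threshold

# Ego doing 30 m/s (~67 mph), object closing at -30 m/s => target speed 0.
print(should_consider_braking(30.0, -30.0))  # False: stationary return filtered out
# Slow lead car doing 5 m/s => closing at -25 m/s.
print(should_consider_braking(30.0, -25.0))  # True: treated as a real lead vehicle
```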

That's where the human comes into play, at the basic standpoint. Your eyes tell you to stop, if you're looking, or as you get closer. Your eyes.

Everyone is going too fast with the process here; the car is not nearly as far along as you have led yourself to believe. Reading helps here.
If you bought FSD, it's not working yet at all. Sorry, it's the truth. Tesla says so and we are all saying so.

Stop blaming the car; it's probably not the car. This guy gave his life for a mistake in judgement, unfortunately. You're driving the car.
 
I do agree that this is fully on the driver, no argument there whatsoever. My point is that, from a discussion standpoint, digging into the technicalities of what happened, it doesn't matter what the manual says. If Tesla is trying to get to Level 5 autonomy then this has to be addressed

Except AP isn't trying to get to level 5. And never will be.

FSD is a different feature set, using different hardware, and a different NN that no consumer in a Tesla is using right now.

While I agree with your frustration about people blaming the system, it is relevant to getting to Level 4/5 autonomy, so get over it and stop repeating RTFM.

But again- it's really not.

Because this system is not trying to get to L4/L5

HW2.x systems are level 2, and won't ever be anything else. That's the whole reason they developed HW3 in the first place.


Now if the HW3 FSD features start being deployed at L3 and higher, and those cars are routinely failing to work in places where they're actually supposed to, then you can discuss flaws in the system that need to be addressed.

But "not working in a place it's not supposed to work" isn't a flaw with the system- it's a flaw with the guy using it.
 
The car won't stop in its current configuration because this truck was perpendicular to the vehicle and the vehicle does not know what it is seeing yet. It's not currently programmed to stop for non-moving or slow-moving objects. Frustrating, isn't it?

This truck was just like the fire engines that have been hit over the last few years. Currently these cars do not stop for immovable or barely moving objects. IF they did, they would never drive.

That's where the human comes into play, at the basic standpoint. Your eyes tell you to stop, if you're looking, or as you get closer. Your eyes.

Everyone is going too fast with the process here; the car is not nearly as far along as you have led yourself to believe. Reading helps here.
If you bought FSD, it's not working yet at all. Sorry, it's the truth. Tesla says so and we are all saying so.

Stop blaming the car; it's probably not the car. This guy gave his life for a mistake in judgement, unfortunately. You're driving the car.

But TACC WILL stop for non-moving/slow-moving objects. Now, why (technically) TACC currently stops is a good question, and the answer could also be used to further its capabilities.
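One plausible mechanism (my guess, not documented behavior) is track history: TACC will brake for a car it watched decelerate to a stop, while something never seen moving looks like roadside clutter. A toy sketch of that distinction, with invented names and thresholds:

```python
# Toy sketch: tracked-while-moving vs. never-seen-moving. Logic is speculative.

class Track:
    def __init__(self):
        self.ever_seen_moving = False

    def update(self, target_speed_mps: float) -> None:
        if abs(target_speed_mps) > 2.0:   # hypothetical "moving" threshold
            self.ever_seen_moving = True

    def brake_for_when_stopped(self) -> bool:
        # A lead car that decelerated to 0 keeps its history; a trailer that
        # was always at 0 never earns it.
        return self.ever_seen_moving

lead_car = Track()
for speed in [25.0, 15.0, 5.0, 0.0]:   # lead vehicle decelerating to a stop
    lead_car.update(speed)

parked_trailer = Track()
parked_trailer.update(0.0)             # only ever observed stationary

print(lead_car.brake_for_when_stopped())        # True: TACC stops behind it
print(parked_trailer.brake_for_when_stopped())  # False: filtered as clutter
```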

Again, you, like others, are missing my point. I am not blaming the car; I am not saying the car is the reason this person died. If that were the case, Tesla would have a nice settlement to pay.

You drop a glass, the glass breaks... did it break because you dropped it (because you didn't pay attention), or did it break because it wasn't strong enough? BOTH. It is your fault you dropped the glass, but that doesn't stop you from finding a way to make the glass stronger, or less breakable.

Edit: fixed a quoting issue.
 
Except AP isn't trying to get to level 5. And never will be.

FSD is a different feature set, using different hardware, and a different NN that no consumer in a Tesla is using right now.



But again- it's really not.

Because this system is not trying to get to L4/L5

HW2.x systems are level 2, and won't ever be anything else. That's the whole reason they developed HW3 in the first place.


Now if the HW3 FSD features start being deployed at L3 and higher, and those cars are routinely failing to work in places where they're actually supposed to, then you can discuss flaws in the system that need to be addressed.

But "not working in a place it's not supposed to work" isn't a flaw with the system- it's a flaw with the guy using it.

So you are saying that there is NO WAY to code TACC/AEB, on the current 2.5 HW, to stop for a semi that has jackknifed across a "limited access highway"?
 
So is EAP a level below FSD?
There is no FSD yet, technically, just the promise of access to it in the future. EAP is a stop-gap Tesla developed after they decided the promise of FSD was "too confusing" and removed it as an option last October. But then they decided it wasn't too confusing after all and went to including AP in all cars and FSD as an add-on this March. So, if you ordered between October and March, you couldn't buy FSD and were only offered EAP for $5k. And now, if you want to purchase FSD, you have to pay the full additional $6k, even though they locked everyone out of that option for about 5 months.
 
Then either your settings for the alert are set to be too sensitive, or you "frequently" wait too long to meaningfully brake or are just following too closely in general.

I had the setting at the max when I first started driving, and then brought it down one level after I got annoyed by the excessive early warnings. In any case, you're fixating on the wrong part. Disregard "frequently". The point is, without any automation engaged, my car is still constantly assessing the environment and able to provide a warning of a potential crash. In the situation I experienced on the highway, though, it failed to either warn me or take any action on its own when AP was engaged.
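For context, forward collision warnings are typically driven by something like time-to-collision, with the Early/Medium/Late setting moving the alert threshold. A simplified sketch with invented numbers (not Tesla's actual tuning):

```python
# Simplified forward-collision-warning logic. Thresholds invented for illustration.

def time_to_collision_s(gap_m: float, closing_speed_mps: float) -> float:
    if closing_speed_mps <= 0:       # not closing; no collision course
        return float("inf")
    return gap_m / closing_speed_mps

THRESHOLDS_S = {"early": 3.0, "medium": 2.2, "late": 1.5}   # hypothetical values

def should_warn(gap_m: float, closing_speed_mps: float, setting: str) -> bool:
    # Alert when the projected time-to-collision drops below the chosen threshold.
    return time_to_collision_s(gap_m, closing_speed_mps) < THRESHOLDS_S[setting]

print(should_warn(gap_m=50.0, closing_speed_mps=20.0, setting="early"))  # True (2.5 s)
print(should_warn(gap_m=50.0, closing_speed_mps=20.0, setting="late"))   # False
```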
 
There is no FSD yet, technically, just the promise of access to it in the future. EAP is a stop-gap Tesla developed after they decided the promise of FSD was "too confusing" and removed it as an option last October. But then they decided it wasn't too confusing after all and went to including AP in all cars and FSD as an add-on this March. So, if you ordered between October and March, you couldn't buy FSD and were only offered EAP for $5k. And now, if you want to purchase FSD, you have to pay the full additional $6k, even though they locked everyone out of that option for about 5 months.


this is...not quite correct.

FSD was always available, you just had to ask for it off-menu.

Nor was EAP a stop-gap developed after FSD was "too confusing" in October; EAP is what Tesla has sold since AP2 was introduced in 2016... and it was only discontinued a couple of months ago.


Prior to a couple months ago you had 2 options:

EAP: Which was all currently existing features (TACC, Autosteer, automatic lane change, NoA, summon, auto park).

FSD: Which got you literally nothing additional but the promise you'd get more features up to and including L5 driving some day... (and a HW3 upgrade, once that was mentioned as existing).



What they did a couple months ago is remove EAP for new purchases and instead now the options are:

AP: TACC and auto-steer in a single lane. Nothing else.

FSD: All the other things that used to be in EAP, plus the promise of more features later (and still the free HW3 upgrade)
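To recap the before/after in one place, here's the mapping as I read it; my own summary of the above, not an official Tesla list:

```python
# My own recap of the package history described above -- not an official list.

before_march = {
    "EAP": ["TACC", "Autosteer", "automatic lane change", "NoA", "Summon", "Autopark"],
    "FSD": ["everything in EAP", "promise of future features (up to L5 some day)",
            "HW3 upgrade once that was announced"],
}

after_march = {
    "AP":  ["TACC", "Autosteer (single lane only)"],
    "FSD": ["automatic lane change", "NoA", "Summon", "Autopark",
            "promise of future features", "free HW3 upgrade"],
}

# e.g. what a new buyer gives up by skipping FSD today:
print(sorted(set(after_march["FSD"]) - set(after_march["AP"])))
```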
 
So you are saying that there is NO WAY to code TACC/AEB, on the current 2.5 HW, to stop for a semi that has jackknifed across a "limited access highway"?

Jackknifed? It would likely depend on the angle of the trailer.


The issue here is that AP/EAP on 2.x (and 1.x) has trouble telling the difference between 'sideways trailer' and 'overpass'- there's been considerable technical discussion on why that's so, so I won't rehash it here.

Typically this is a non-issue because in the places you're intended to use AP/EAP there shouldn't be a sideways trailer, and on the off chance there does happen to be one jackknifed with the trailer at exactly a 90 degree angle to oncoming traffic, the fully attentive driver should see that well ahead of time and take corrective action.


Suggesting the existing system must always detect everything correctly and always take appropriate action, without needing the driver to ever do so, misunderstands the entire nature and point of a level 2 driver aid. If it could do that (or was even meant to do that) it would be an L3 or higher system.


There have been two such accidents total that I'm aware of in a Tesla in the five years AP has existed in one form or another, both on roads AP isn't even supposed to be used on. So it doesn't appear that, within the domain AP is intended for, there's actually a real-life problem. There's barely even one OUTSIDE that domain, statistically, because presumably MOST folks are paying attention and notice things like a trailer blocking the road regardless of AP.
 
Here's my recent situation somewhat similar to what the OP dealt with.

I was on Navigate on Autopilot (NoA), in the center lane originally. NoA detected a slow-moving car in front of me and changed to the left lane to pass; once past, it attempted to move back into the center lane. At this point, a fast-charging, aggressive driver also passed the slow-moving (center lane) car, but in the right lane, and also tried to cut into the center lane once it passed the slow-moving car. My NoA detected the fast-charging car moving into my lane behind me and jerked me back into the left lane.

While the distance between the aggressive driver and me was not as hair-raising as in the OP's video, you can make your own judgment on how you feel about the software. IMHO, NoA did exactly what it was supposed to do. The aggressive jerk back into the left lane was jarring enough that my wife, who was in the passenger seat, asked if I was the one who did it; the answer was no, as I never even saw the fast-charging driver coming up from behind me until he was way ahead of me after the near collision/avoidance.

Right-repeater: [video]
Front-repeater: [video]
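In logic terms, the abort NoA performed might look roughly like this. This is pure speculation on my part, with invented names and thresholds, not Tesla's actual logic:

```python
# Speculative sketch of a lane-change abort check -- invented names/thresholds.

def abort_lane_change(gap_to_rear_car_m: float,
                      rear_closing_speed_mps: float,
                      min_time_gap_s: float = 1.5) -> bool:
    """Bail back to the original lane if a car in the target lane is closing fast."""
    if rear_closing_speed_mps <= 0:
        return False                 # not closing; keep the lane change going
    return gap_to_rear_car_m / rear_closing_speed_mps < min_time_gap_s

# Fast car 15 m behind in the target lane, closing at 12 m/s => 1.25 s time gap.
print(abort_lane_change(15.0, 12.0))   # True: jerk back into the left lane
print(abort_lane_change(60.0, 12.0))   # False: plenty of room, complete the change
```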
 
I've said it before: Tesla should require drivers to review an online Autopilot safety course (through the app or center console) before AP can be activated (per named profile), so drivers are fully aware of its CURRENT limitations.

This reduces Tesla's liability and increases driver safety (there is a large percentage of people who think FSD is already here and working).

Segway/Ninebot requires online training before letting you activate higher speeds... for a little scooter that maxes out at 16mph... you would think Tesla could do the same.
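Mechanically, the gate would be trivial. A hypothetical sketch of the per-profile lock I mean (obviously not Tesla's code; all names invented):

```python
# Hypothetical per-profile gate for Autopilot activation -- not Tesla's code.

completed_course = set()   # named driver profiles that finished the safety course

def complete_safety_course(profile):
    completed_course.add(profile)

def can_engage_autopilot(profile):
    # AP stays locked for this driver profile until the course is done.
    return profile in completed_course

print(can_engage_autopilot("Alice"))   # False: course not yet taken
complete_safety_course("Alice")
print(can_engage_autopilot("Alice"))   # True: AP can now be engaged
```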
 
Stop blaming the car; it's probably not the car. This guy gave his life for a mistake in judgement, unfortunately. You're driving the car.
So if I had a GPS app that kept recommending I go the wrong way down a one-way street, I should be aware of my surroundings and not go down the one-way street. But at some point, that app has a known, dangerous issue: it shouldn't be on the market until they fix it.

Ever see an Uber driver stare at their GPS for your pickup location, and completely miss you waving at them?
Automation bias - Wikipedia
 
Autopilot is far from perfect. If this is any indication of Tesla's Full Self Driving, they have a lot of work to do.
Same situation happened to me, but w/o autopilot. I had a curb on my left, and a lane changer 3/4 adjacent on my right who was moving into my lane at 40 MPH. I too applied the brakes and horn. I know there is a lot of automatic stuff in the car, but I would sure like an automatic beeping of the horn when a) 'the *sugar* is getting real', as observed from cameras and ultrasound; and b) I've just pulled my foot off the accelerator like I was barefoot on a hot iron.

I probably cleared the curb by ~1 inch, and had maybe 2 inches on the random car's side. An earlier (automatic) beep would have brushed off the random car, and I would have had better precision making the most of my lane without impacts.
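Concretely, the trigger I'm imagining would just AND those two conditions together. A sketch with invented signal names, not any real vehicle API:

```python
# Sketch of the auto-horn trigger described above -- all signal names invented.

def auto_horn(encroachment_detected, accel_release_rate, panic_threshold=0.8):
    # a) cameras/ultrasound see another car crowding into our lane, AND
    # b) the driver just snapped their foot off the accelerator.
    return encroachment_detected and accel_release_rate > panic_threshold

print(auto_horn(True, 0.95))   # True: beep early, warn the lane-changer off
print(auto_horn(True, 0.10))   # False: gradual lift, nothing unusual
print(auto_horn(False, 0.95))  # False: no encroachment seen
```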