Welcome to Tesla Motors Club

AP/FSD related crashes

I am guessing you did not read this tweet closely. Green said that the driver *overrode* FSD to change lanes into the obstructed lane. Autopilot would have avoided it; human chose and caused it. You can do this with AP engaged using the turn signal.

I know from my own stupidity that Auto-lane change won't allow you to do a commanded lane change into an obstructed lane.

It won't do anything, and ignores the request.

I use commanded auto-lane changes fairly frequently as they're my favorite part of FSD.
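As a rough illustration of the gating behavior described above — a commanded lane change being silently ignored when the target lane is obstructed — the check could look something like this. The names, data structure, and threshold are invented for illustration; this is not Tesla's actual implementation.

```python
# Hypothetical sketch of gating a commanded lane change on lane occupancy.
# Names, structure, and the 50 m threshold are invented, not Tesla's code.
from dataclasses import dataclass

@dataclass
class TrackedObject:
    lane: str          # "current", "left", or "right"
    distance_m: float  # longitudinal gap from our car
    speed_mps: float   # absolute speed; ~0 means stationary

def lane_change_allowed(target_lane: str, objects: list[TrackedObject],
                        min_gap_m: float = 50.0) -> bool:
    """Refuse (ignore) the request if the target lane is obstructed."""
    for obj in objects:
        if obj.lane == target_lane and obj.distance_m < min_gap_m:
            return False  # obstruction too close: ignore the request
    return True

# A stopped truck 40 m ahead in the right lane should block the request:
objects = [TrackedObject(lane="right", distance_m=40.0, speed_mps=0.0)]
print(lane_change_allowed("right", objects))  # False
print(lane_change_allowed("left", objects))   # True
```

The point of the sketch is just that the refusal is a simple predicate over tracked objects — which makes it puzzling when a large stationary obstacle doesn't trip it.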
 
I am guessing you did not read this tweet closely. Green said that the driver *overrode* FSD to change lanes into the obstructed lane. Autopilot would have avoided it; human chose and caused it. You can do this with AP engaged using the turn signal.

It might be true that the driver chose to change lanes, but the system's job is still to avoid a collision automatically as long as TACC and Autosteer have not been turned off (overridden):

The picture below:

ACC: On 70MPH (Traffic-Aware Cruise Control)
AP: On (AutoSteer)
Steer: 2.2 / 4.3 degrees.
Brake: No

The system steered to the right (positive degrees) to make the lane change automatically; the driver was not required to steer. The driver could take over at any time, but that would disable the system, and the cyan "on" status indicator in the lower right corner would dim to grey.



[Image: WRhyVya.jpg]



The picture below indicates that the driver manually took over, as both TACC and Autosteer were off at that point — probably after realizing too late that the system would not brake automatically in this case:

ACC: Off (Standby) 70MPH (Traffic-Aware Cruise Control)
AP: Off (Unavailable) (AutoSteer)
Steer: -12.9 / -11.2 degrees
Brake: Yes

The driver braked and steered sharply to the left (negative degrees) to avoid hitting the truck, but the car was still carrying momentum at 61 mph.
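The two telemetry frames quoted above can be laid out as simple records, which makes the handover from Autopilot to driver easy to see. The field names here are mine; the values are the ones shown in the screenshots.

```python
# The two telemetry frames quoted above, as simple records.
# Field names are invented; the values come from the screenshots.
frames = [
    {"label": "lane change", "acc": "On", "ap": "On", "speed_mph": 70,
     "steer_deg": (2.2, 4.3), "brake": False},
    {"label": "takeover", "acc": "Off (Standby)", "ap": "Off (Unavailable)",
     "speed_mph": 61, "steer_deg": (-12.9, -11.2), "brake": True},
]

for f in frames:
    who = "Autopilot" if f["ap"] == "On" else "driver"
    direction = "right" if f["steer_deg"][0] > 0 else "left"
    print(f'{f["label"]}: {who} steering {direction}, '
          f'brake={f["brake"]}, {f["speed_mph"]} mph')
```

Read this way, the record shows the system steering right into the lane with no braking, and the driver steering hard left with the brake applied only after both systems had dropped out.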

[Image: fPgY0yg.jpg]


This kind of collision is nothing new; it has happened repeatedly, because Tesla's technology could — and still can — collide with stationary objects.

The latest breakthrough to solve this problem was Pure Vision.

So the question is: is there anything new in Tesla's technology hinting that collision avoidance is progressing well, just in time for the new dedicated robotaxi in 2024?
 
I am guessing you did not read this tweet closely. Green said that the driver *overrode* FSD to change lanes into the obstructed lane. Autopilot would have avoided it; human chose and caused it. You can do this with AP engaged using the turn signal.
Not sure what you are saying, but regardless: if the human commanded a move into a dangerous area, AP should have braked (hard?) and possibly disengaged (before the human did). The text says the human disengaged 2-3 seconds before impact.
 
Yeah this is pretty clearly a mistake on AP's part. The lane change was merely "requested" by the driver with AP still engaged, and AP decided it was ok to do so and then proceeded to ram the car into a large, immobile object blocking the lane. The only disengagement of AP was at the last second by the panicked driver. AP could/should have refused the lane change request (it does this routinely based on safety concerns about nearby moving traffic) if it realized the danger, and AP/TACC should've eventually hit the brakes pretty hard before that collision. This is more evidence (on top of other past incidents) that AP is still terrible at avoiding large immobile objects, which seems like an important thing to fix! For that matter, seeing a human on the shoulder of a freeway (the one waving the flashlight) should've been a big red flag for the FSD computer as well.

All that being said, though: all these ADAS systems in Teslas (TACC, Autosteer, NoA, FSDBeta) are L2 and require an attentive driver at the wheel willing to deal with these kinds of edge cases, so in the real legal and/or ethical sense, it's still the driver who's at true fault. The human mistakes here are reasonable and understandable: it was dark and hard to see, and this is a pretty rare situation to anticipate. Still, it was the human's responsibility to realize what was going on and fix the situation, just like it would be in any manually-driven car. I would hope (but you can never know if you weren't there!) that if it were me, once I noticed the flashlight waving on the shoulder, I'd already have been starting to brake and getting ready to steer hard left (taking up the slack in the reaction time, basically), even if my brain couldn't quite figure out the total situation yet at that point.
 
it's still the driver who's at true fault.
This is likely a big reason why this problem exists to the degree it does.

The manufacturer is free to release features without fully verifying that they work, and can ship features with serious limitations.
The driver is human, so they fall into the traps humans usually do, like no longer paying as much attention as they normally would.
The features work some of the time but not all of the time, so the driver is left with the "is it seeing something I'm not, or is it being dumb?" question when a feature like auto lane change ignores a request.

We seem to be stuck at L2 without much momentum past that. L3 has limited functionality as a traffic-assist-only feature, and L4 seems a long way off for the general masses.

I would have preferred some sharing of liability in an accident where both AP, and the driver failed to prevent an accident.

The approach the Europeans took was to limit functionality, which isn't something I'd want.
 
Beyond the talk about AP/FSD with this crash, I'm more interested in why AEB didn't kick in. There's all this talk about phantom braking, and about AEB triggering on the shadow of a tree, yet it doesn't seem like AEB kicked in while the car was speeding toward the accident. I wasn't there and have no idea what the conditions were really like, but just based on the video, this could have been avoided by an alert driver, who should never have started a lane change after seeing the crashed vehicle and the person warning people away with the flashlight. But as for why AEB didn't kick in and slam on the brakes?
 
After watching the video many times, it appears that the driver wanted to take the upcoming exit as indicated by the sign on the right. Prior to the start of the video, the driver initiated a lane change as he was coming up a hill. At this time, the obstruction in the right lane, the person waving the light and the emergency vehicle on the other side of the truck may not have been visible.

So, the video starts with the car complying with the lane change command. The obstruction is still a bit off, so the car has not yet detected it. Complicating things at this point is an emergency vehicle of some sort on the other side of the truck with a blinding light pointing toward the Tesla. This may have obscured both car and driver from seeing the overturned truck. Once the car has entered the right lane, the blinding light is obscured by the truck but the driver's eyes would not have had time to recover.

As an additional complication, there is a person on the shoulder waving another light. The Tesla would likely see the person but would not understand what he is trying to communicate. The driver was almost certainly distracted by making sure he did not hit him and trying to understand the waving light. By the time he got close enough to figure this out, he saw the truck. But it was too late by then.

So, what about AEB? From the video, it's not obvious whether AEB triggered or not. But AEB is not guaranteed to prevent an accident. It only mitigates the damage by reducing speed. In this case, the car did slow prior to the crash. Whether it was AEB, the driver, or both, is not apparent.

What's really sad is that there are some reflective construction cones in the shoulder. Had the person waving the light thought to place those out in the road like hazard triangles or road flares, he would have done a far better job helping to avoid the accident.
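A back-of-the-envelope calculation shows why even hard, late braking from 70 mph only trims the impact speed rather than preventing the crash. The deceleration value and timing here are assumptions for illustration, not measured data from this incident.

```python
# Back-of-the-envelope: how much hard braking trims speed before impact.
# The ~0.8 g deceleration and the 0.5 s braking window are assumptions.
MPH_TO_MPS = 0.44704
v0 = 70 * MPH_TO_MPS          # initial speed in m/s (~31.3 m/s)
a = 8.0                       # assumed hard-braking deceleration, m/s^2

# Distance needed for a complete stop from v0: v0^2 / (2a)
stop_dist = v0**2 / (2 * a)
print(f"full stop needs about {stop_dist:.0f} m of braking")   # ~61 m

# If braking starts only ~0.5 s before impact, v = v0 - a*t:
v_impact = v0 - a * 0.5
print(f"impact at about {v_impact / MPH_TO_MPS:.0f} mph")      # ~61 mph
```

Under these assumed numbers, half a second of hard braking takes 70 mph down to roughly 61 mph — consistent with the reported impact speed, and a reminder that avoidance has to start tens of meters earlier, not in the last moment.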
 
I'm optimistic detection of this situation will be much easier with FSD V11 and the merged stack. Possible algorithms are:
  1. If I detect people on limited access freeway (dude with a light), then be extra leery.
  2. As evidenced by other crashes, the current highway software doesn't reliably detect stopped vehicles. Current FSD Beta clearly does detect stopped vehicles, so a reasonable theory is that the merged stack will too.
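The two heuristics above could be sketched as a simple caution check. The scene representation, field names, and return values are invented for illustration; this is not Tesla's code.

```python
# Sketch of the two proposed heuristics. The scene dict, field names,
# and labels are invented for illustration, not Tesla's implementation.
def caution_level(scene: dict) -> str:
    """Return 'extra_leery', 'slow_down', or 'normal' for a freeway scene."""
    # Heuristic 1: a person on a limited-access freeway is a red flag.
    if scene.get("pedestrian_on_freeway"):
        return "extra_leery"
    # Heuristic 2: a stopped vehicle in our path demands braking room.
    if any(v["speed_mph"] < 1 and v["in_path"]
           for v in scene.get("vehicles", [])):
        return "slow_down"
    return "normal"

# The crash scene had both cues: the dude with the light alone suffices.
scene = {"pedestrian_on_freeway": True,
         "vehicles": [{"speed_mph": 0, "in_path": True}]}
print(caution_level(scene))  # extra_leery
```

Either trigger firing well before the lane change would have bought the braking distance the car never had.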
 
...I'm more interested in why AEB didn't kick in...
...So, what about AEB? From the video, it's not obvious whether AEB triggered or not. But AEB is not guaranteed to prevent an accident. It only mitigates the damage by reducing speed. In this case, the car did slow prior to the crash. Whether it was AEB, the driver, or both, is not apparent...

The owner's manual:

"Automatic Emergency Braking does not apply the brakes, or stops applying the brakes, when:

You turn the steering wheel sharply.
You press and release the brake pedal while Automatic Emergency Braking is applying the brakes.
You accelerate hard while Automatic Emergency Braking is applying the brakes.
...
Warning
Automatic Emergency Braking is designed to reduce the severity of an impact. It is not designed to avoid a collision."


As @Supcom said, even if Automatic Emergency Braking did activate, its goal is not to avoid a collision — only to reduce its severity.

The manual lists reasons Automatic Emergency Braking might not have activated in this case:

"You turn the steering wheel sharply."

At the last moment, the driver took over the steering wheel manually, which could inhibit activation of Automatic Emergency Braking.

It's not clear how the driver applied the brake, but if "You press and release the brake pedal while Automatic Emergency Braking is applying the brakes", that would disable Automatic Emergency Braking too.

The lesson here is: Don't rely on Tesla Automatic Emergency Braking. It's better to prevent an accident in the first place rather than hoping the Automatic Emergency Braking would come to the rescue.
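Taken together, the manual's inhibit conditions amount to "driver input wins". As a predicate, paraphrasing the quoted manual text (this is an illustration, not actual firmware logic):

```python
# Paraphrase of the owner's-manual AEB inhibit conditions as a predicate.
# This illustrates the quoted text; it is not actual firmware logic.
def aeb_inhibited(sharp_steering: bool,
                  brake_pressed_and_released: bool,
                  hard_acceleration: bool) -> bool:
    """AEB stops (or never starts) braking if the driver overrides it."""
    return (sharp_steering
            or brake_pressed_and_released
            or hard_acceleration)

# In the crash described above, a sharp last-second swerve alone suffices:
print(aeb_inhibited(sharp_steering=True,
                    brake_pressed_and_released=False,
                    hard_acceleration=False))  # True
```

So the driver's evasive steering, by design, may itself have suppressed the braking he was counting on — which is exactly why relying on AEB is a bad plan.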

On the other hand, LIDAR proponents claim that their Automatic Emergency Braking is reliable.

 
The driver of a Tesla operating on Autopilot must stand trial for a crash that killed two people in a Los Angeles suburb, a judge ruled Thursday. There is enough evidence to try Kevin George Aziz Riad, 27, on two counts of vehicular manslaughter for the 2019 crash that left two people dead in Gardena, a Los Angeles County judge said.

Police said the Tesla Model S left a freeway and ran a red light in Gardena and was doing 74 mph (119 kph) when it smashed into a Honda Civic at an intersection on Dec. 29, 2019.
The crash killed Gilberto Alcazar Lopez, 40, of Rancho Dominguez and Maria Guadalupe Nieves-Lopez, 39, of Lynwood, who were in the Civic and were on their first date that night, relatives told the Orange County Register.
Prosecutors said the Tesla's Autosteer and Traffic Aware Cruise Control were active.
 
...AP hits and kills a motorcyclist in Salt Lake City.
What version is the AP? I don't think AP1 does any good for motorcyclists.

AP2/FSD beta is better for motorcyclists, but it can still aim at pedestrians so I won't be surprised if it still aims at motorcyclists.


Will full stack prevent this?...

Collision avoidance seems not to be a priority for Tesla because human drivers are responsible for that task. There's no penalty for Tesla; it's all the human driver's fault.

Based on Tesla's automation history, the next one, the full stack, might not prevent this accident either.
 
Driver not paying attention, AP hits and kills a motorcyclist in Salt Lake City.
Will full stack prevent this? When is it coming?
Article does not state how the police determined that the car was using AP at the time of the crash. Maybe based on the driver's statement. The same driver who claims he didn't see the motorcycle and would be motivated to find another liable party?

The report comes too soon for police to be able to get access to the car's telemetry data from Tesla.
 