
AP accidents

it chose to ignore the obstacle and refused to act to avoid it
Karpathy explained this at CVPR this year when introducing Tesla Vision:

We have a stationary approach again. … We are just approaching this vehicle and hoping to stop. What you see in orange in the legacy stack is that it actually takes quite a bit of time for us to start slowing, and basically what's happening here is that the radar is very trigger happy, and it sees all these false stationary objects everywhere.
Radar by itself doesn't know what actually is a stationary car and what isn't, so it's waiting for vision to associate with it. Vision, if it's not held up to a high enough bar, is noisy and contributes its own sort of error. The sensor fusion stack just picks it up too late. So again, you could fix all that, even though it's a very gross system with a lot of if statements and so on, because the sensor fusion is complicated: the error modes for vision and radar are quite different.
But here, when we just work with vision alone and take out the radar, vision recognizes this object very early and gives the correct depth and velocity. There are no issues, so we actually get an initial slowdown much earlier, and we really simplified the stack a lot.

[Image: cvpr stationary.png]
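
To make the failure mode concrete, here is a toy sketch of the gating Karpathy describes; the structure, names, and thresholds are all illustrative assumptions, not Tesla's actual stack:

```python
# Toy illustration of the fusion-gating failure mode Karpathy describes.
# Structure, names, and thresholds are illustrative assumptions, not Tesla code.
from dataclasses import dataclass

@dataclass
class Track:
    distance_m: float         # longitudinal distance to the object
    radar_stationary: bool    # radar reports ~zero absolute velocity
    vision_confidence: float  # 0..1 confidence that vision sees a real vehicle

def legacy_fusion_brakes(track: Track, vision_gate: float = 0.9) -> bool:
    """Legacy stack: a stationary radar return is treated as clutter until
    vision associates with it at high confidence, so braking starts late
    whenever vision is noisy."""
    if track.radar_stationary and track.vision_confidence < vision_gate:
        return False  # discarded as one more false stationary return
    return True

def vision_only_brakes(track: Track, vision_gate: float = 0.5) -> bool:
    """Vision-only stack: no radar clutter to gate against, so a consistent
    depth/velocity estimate is enough to start slowing early."""
    return track.vision_confidence >= vision_gate

# A stopped truck seen clearly by the camera but flagged stationary by radar:
truck = Track(distance_m=120.0, radar_stationary=True, vision_confidence=0.7)
print(legacy_fusion_brakes(truck))  # False -> fusion picks it up too late
print(vision_only_brakes(truck))    # True  -> earlier initial slowdown
```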
 
1) No, it's an AP2.5 car, not even one with the new hardware.
So would any brave soul do a demo of the latest state of the tech from Tesla: HW3, Pure Vision, approaching the same scenario (a stationary obstacle), even in broad daylight?

2) On the new FSD Beta / Tesla Vision, the system uses full headlights; that was not the case here.
You missed the part where GreenTheOnly tweeted: "the truck was seen by both the radar and also could be clearly observed by the narrow camera from quite a distance away (pictures are spaced 1 second apart)"

The camera was not blind: it recorded the obstacle even though the system was not the most modern, Pure Vision one.
3) Yes, my car with full headlights slows down, but it's an HW3 car.
@Terminator857 was saying "Don't drive faster than your headlights", but I doubt even the most modern Tesla automation can avoid a collision at 75 MPH, given the limited range of the visible light cast by the headlights.
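
A quick back-of-the-envelope check of that doubt, using assumed but typical values (1.5 s reaction time, 0.8 g braking, ~60 m of low-beam reach):

```python
# Back-of-the-envelope stopping distance at 75 MPH vs. low-beam range.
# Reaction time, deceleration, and beam reach are assumed typical values.
v = 75 * 0.44704          # 75 mph in m/s (~33.5 m/s)
t_react = 1.5             # s, typical driver/system reaction time (assumption)
a = 0.8 * 9.81            # m/s^2, hard braking on dry pavement (assumption)

d_react = v * t_react     # distance covered before braking begins
d_brake = v**2 / (2 * a)  # kinematic braking distance v^2 / (2a)
d_total = d_react + d_brake

print(f"reaction: {d_react:.0f} m, braking: {d_brake:.0f} m, total: {d_total:.0f} m")
# -> roughly 50 m + 72 m = ~122 m total, versus ~60 m of typical low-beam
# reach, so at 75 MPH the car is badly "overdriving" its headlights.
```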

This gif was posted by @flyhighboi20, whose car on Pure Vision FSD Beta hit a garbage can in the middle of a dark road on 11/29/2021; by the time the headlights' range reached the garbage can, it was too late, and the impact broke the front bumper fascia:

[GIF: 2021-11-29_19-31-13-garbage-can_trim-short-2.gif]
 
...The first was a Waymo that didn't see a pedestrian on a crosswalk...
What do you expect? There's a human driver in that Waymo.


Waymo has perfected avoiding collisions WITHOUT any human drivers in Chandler, AZ
...The second was an autonomous bus in Canada that went straight into a tree after leaving the road...The bus is equipped with more than two lidars + 4 radars; isn't that better?
What do you expect? They are not as proven as Waymo is. Sometimes it's very difficult to copycat. In addition, there's a human driver in that bus too, which means its level of autonomy is not guaranteed just yet.
 
It might be that this is just a Waymo PR excuse, and they could get away with that in non-California states. But California's reporting is stricter, and there are the car's logs, so covering up for the automation and blaming the human is unlikely.

That's why there should not be a human driver in an autonomous car: so that companies can't just blame human drivers for their automation's failures.
 
It's terrible that this accident occurred, and I hope that those affected are ok. Using the high beams might have helped give more warning. Sometimes you can on highways where there's enough separation between the directional lanes that there is minimal impact on traffic going the other way. I do think that radar should have seen the truck and slowed the car down. Night vision cameras could help too as part of a future update.
 
GreenTheOnly tweeted: "the truck was seen by both the radar and also could be clearly observed by the narrow camera from quite a distance away (pictures are spaced 1 second apart)"

This is correct, but needs clarification.

It was seen by the radar, but discarded upstream, as often happens with stationary objects.

The truck was visible to the human eye in the narrow camera, but that doesn't mean the Tesla neural network detected it. We've seen numerous cases where it doesn't correctly detect things halfway into the lane. In this case it looks like the signal was ignored because vision didn't see it until it was too late.
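
For readers wondering why the radar return gets discarded: radar measures relative (Doppler) velocity, so a stopped truck looks exactly like a bridge, overhead sign, or manhole cover, all of which have zero absolute velocity. A minimal sketch of that classic clutter filter, with invented names and thresholds:

```python
# Minimal sketch of why classic radar ACC stacks drop stationary returns.
# Names and thresholds are illustrative assumptions.
def keep_radar_target(relative_velocity_mps: float,
                      ego_speed_mps: float,
                      tolerance_mps: float = 1.0) -> bool:
    """A return closing at exactly ego speed has ~zero absolute velocity:
    it could be a stopped truck, but equally a bridge, sign, or manhole
    cover, so legacy stacks discard it unless vision confirms it."""
    absolute_velocity = ego_speed_mps + relative_velocity_mps  # closing -> negative
    return abs(absolute_velocity) > tolerance_mps  # keep only moving objects

ego = 33.5                            # ~75 mph in m/s
print(keep_radar_target(-33.5, ego))  # False: stationary truck, discarded
print(keep_radar_target(-10.0, ego))  # True: lead car moving at ~23.5 m/s
```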
 
If vision works, we should see a demo of how a Tesla would avoid crashing into a stationary obstacle at 75 MPH, before turning off the radar and switching to radarless this year.

I feel like regulators are at fault for this.

They allow L2 driving, and yet they don't have any minimal safety-related tests that a vehicle must pass. Things like not crashing into stationary obstacles.

Obviously Vision + Radar with HW2.5 didn't get the job done.

We don't know if Pure Vision with HW3 would get the job done. The current version would probably crash at 70mph where it did its 5mph phantom brake thing... :p
 
It's terrible that this accident occurred, and I hope that those affected are ok. Using the high beams might have helped give more warning. Sometimes you can on highways where there's enough separation between the directional lanes that there is minimal impact on traffic going the other way. I do think that radar should have seen the truck and slowed the car down. Night vision cameras could help too as part of a future update.

High beams wouldn't have been in play due to oncoming traffic in the other lane.
 
vision recognizes this object very early and gives the correct depth and velocity.
Karpathy has so far failed to back this up with any real data.

Teslas are so notorious for running into stationary objects that there is very much a need for Tesla to prove to owners that this issue has been resolved with Pure Vision.

During autonomy day they completely left out medium to long range distances in their vision versus radar comparison.
 
Just last week we had two accidents with autonomous vehicles: a car and a bus.
The first was a Waymo that didn't see a pedestrian on a crosswalk.
The second was an autonomous bus in Canada that went straight into a tree after leaving the road.
Both the Waymo and the autonomous bus are equipped with more than two lidars + 4 radars; isn't that better?

By that logic you're assuming that all accidents everywhere are perception failures.

The very first question to ask is whether the vehicle was in autonomous mode, and the Waymo example wasn't.

The autonomous bus likely was, but I don't believe we have any of the details as to the root cause of that accident.
 
Just last week we had two accidents with autonomous vehicles: a car and a bus.
The first was a Waymo that didn't see a pedestrian on a crosswalk.
The second was an autonomous bus in Canada that went straight into a tree after leaving the road.
Both the Waymo and the autonomous bus are equipped with more than two lidars + 4 radars; isn't that better?
Waymo was in manual drive.
 
they completely left out medium to long range distances in their vision versus radar comparison
Are you referring to this AI Day comparison?

[Image: ai day depth.png]

Isn't that basically the same type of graph – Predicted Longitudinal Depth/Velocity – that I shared earlier from CVPR – Relative Longitudinal Position/Velocity of CIPV [closest-in-path-vehicle]? The AI Day data shows a maximum distance of ~42m while CVPR data goes up to ~180m, which I would think covers medium to long range distances (main camera is "max distance 150m" and narrow "250m").

Sure, it's only one instance in a test environment, and even then the position predictions weren't consistent until 145m. So yes I also think it would be interesting to see how Tesla Vision has improved in more scenarios such as max distance, low light, poor visibility, curves and obstructions.
 
AP/FSD should slow down so that its reaction time fits within the headlights' range?
Tesla Vision actually has existing logic to slow down if it detects poor visibility:
[Image: poor visibility.jpg]


This was going through some thick fog today on the interstate with FSD Beta 10.5, and Autosteer stayed active but dropped down to 59mph from a set speed of 70+. When the fog got even thicker, it showed the big red "take over immediately" warning, leaving just TACC active and slowing down to about 45mph; 7 seconds of visibility at that speed is ~140m, which is close to the main camera's limit.

Tesla could have Autopilot detect more types of poor visibility, but maybe Tesla Vision would have avoided crashing in the first place even with OP's low light scenario.
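
Those numbers check out, and the same kinematics answer the earlier "should AP/FSD slow down to match its headlights?" question. A quick sketch with assumed reaction time and braking values (not Tesla's actual parameters):

```python
import math

# Sanity-check the fog numbers above, and answer the "how slow for your
# headlights?" question. Braking parameters are assumptions, not Tesla's.
def visibility_distance(speed_mph: float, seconds: float) -> float:
    """Distance covered in the given number of seconds at the given speed."""
    return speed_mph * 0.44704 * seconds

print(visibility_distance(45, 7))  # ~141 m: matches the ~140 m estimate

def max_safe_speed_mph(sight_m: float, t_react: float = 1.5,
                       decel: float = 0.8 * 9.81) -> float:
    """Largest speed whose reaction + braking distance fits inside the
    sight distance: solve v*t + v^2/(2a) = d for v."""
    v = decel * (-t_react + math.sqrt(t_react**2 + 2 * sight_m / decel))
    return v / 0.44704

print(max_safe_speed_mph(60))   # ~47 mph for ~60 m of low-beam reach
print(max_safe_speed_mph(140))  # ~82 mph needs ~140 m of visibility
```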
 
Tesla Vision actually has existing logic to slow down if it detects poor visibility...

True.

But normally functioning headlights on a clear, dark night do not slow Autopilot down. It continues to drive at up to 90 MPH with radar or 80 MPH radarless.

I would define driving in the dark on headlights alone, with no aid from streetlights or a full moon, as poor visibility, but Autopilot doesn't.
 
Sure, it's only one instance in a test environment, and even then the position predictions weren't consistent until 145m. So yes I also think it would be interesting to see how Tesla Vision has improved in more scenarios such as max distance, low light, poor visibility, curves and obstructions.

Yeah, we need a lot more data along with comparisons with Lidar.

Lidar should be the gold standard, and then Vision+Radar along with Pure Vision compared with it.

The use of an offset trailer is good, and it's refreshing that we at least have the comparison you posted. But I'd like to see comparisons with other things, like a sofa on the highway at night: things the neural network wasn't trained to identify.
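
If someone ran that lidar-as-gold-standard comparison, the analysis itself is straightforward: treat the lidar range as ground truth and bin the vision depth error by distance. A hypothetical sketch (the sample data is invented purely for illustration):

```python
# Hypothetical sketch of benchmarking vision depth against lidar ground
# truth, binned by range. The data points are invented for illustration;
# this shows the analysis, not anyone's real results.
from collections import defaultdict
from statistics import mean

# (lidar_range_m, vision_estimate_m) pairs for one target, e.g. an
# offset trailer or a sofa on the highway at night:
samples = [(180, 195), (150, 158), (120, 124), (90, 92), (60, 61), (30, 30)]

bins = defaultdict(list)
for truth, vision in samples:
    bin_start = (truth // 50) * 50  # group into 50 m range bins
    bins[bin_start].append(abs(vision - truth) / truth)

for start in sorted(bins):
    print(f"{start:>3}-{start + 50} m: mean abs error {mean(bins[start]):.1%}")
# Expect error to grow with range; the interesting question is whether
# pure vision stays usable at the 100-200 m distances that matter at 75 MPH.
```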
 
Tesla Vision actually has existing logic to slow down if it detects poor visibility:
[Image: poor visibility.jpg]

This was going through some thick fog today on the interstate with FSD Beta 10.5, and Autosteer stayed active but dropped down to 59mph from a set speed of 70+. When the fog got even thicker, it showed the big red "take over immediately" warning, leaving just TACC active and slowing down to about 45mph; 7 seconds of visibility at that speed is ~140m, which is close to the main camera's limit.

Tesla could have Autopilot detect more types of poor visibility, but maybe Tesla Vision would have avoided crashing in the first place even with OP's low light scenario.


My experience with 10.5 was that it was absolutely awful in the rain, and was the first time my car ever slowed down due to weather with just TACC.

I experienced approximately the same weather with 10.6.1 without any such message. Aside from phantom braking and a lack of smoothness during stop-and-go (I have a Performance Model 3), it wasn't that terrible.

10.8 seems a bit smoother in stop-and-go, but luckily traffic wasn't as terrible when I was trying it. With 10.6.1, using Chill mode seemed to help, despite people claiming that setting doesn't make a difference for TACC/AP driving. Just from butt feel, it wasn't as bad.
 
Just last week we had two accidents with autonomous vehicles: a car and a bus.
The first was a Waymo that didn't see a pedestrian on a crosswalk.
The second was an autonomous bus in Canada that went straight into a tree after leaving the road.
Both the Waymo and the autonomous bus are equipped with more than two lidars + 4 radars; isn't that better?
The bus in Canada was also supposed to be limited to 12 miles per hour, was travelling a predefined route lined with smart sensors built into the road, etc., and it had a backup driver.

I don't think you can quantify whether one system is better or worse by the presence of crashes, regardless of setup, because none of the systems are even close to perfect at this point and all will crash. If vehicles can still crash despite more sensors plus smart infrastructure, well, IMO that just suggests autonomous driving tech as a whole still has a looooong way to go.

And more competitors in the space = more + quicker progress
 