
AP accidents

Yes, that's very concerning too... so, what would you suggest?
People engage in critical thinking when viewing videos on the internet. I actually doubt they're driving around at 75mph while only being able to see 20 feet in front of them.
You know I'm an automation complacency concern troll so I can't suggest that they improve stopped vehicle detection, that would just make people trust AP more!
Honestly I'm not sure what Tesla should do. Maybe better driver monitoring would help. The obvious solution is to get rid of AP as we know it and instead have it run in the background and only intervene to prevent crashes or lane departures. I like Autopilot though...

Just in case people missed it, here is a still from the narrow cam.
[attached image: still from the narrow cam]
 
So what can we learn from the statement?

1) Machine vision (video) is not as good as a human's: what's the point here? Should we use Tesla Vision or not?

2) The car and the junk were not invisible, so once they were within the range of the headlights, shouldn't the machine have picked up those cues and performed an evasive maneuver?

3) Shouldn't AP/FSD slow down so that it can react safely within the headlights' range?
One thing we know is that machine vision is a lot better than humans at seeing things in black images:

I think it was just a case of their algorithm not being confident enough. A few thousand more examples and the algorithm would have nailed this situation. It sucks that it’s not perfect yet, but it saves lives and is getting better quickly…

We humans struggle to see the border between #000000 and #010101, but for the computer this is night and day.
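To illustrate the point with a toy example (all values are made up, not taken from the actual footage): two near-black pixels a human can't tell apart are trivially distinct to a computer, and a crude contrast stretch makes the difference visible to us as well.

```python
import numpy as np

dark_road = np.uint8(0x00)   # #000000: black to the human eye
dark_truck = np.uint8(0x01)  # #010101: one 8-bit step brighter, still "black" to us

print(dark_truck > dark_road)  # True -- for the machine this is night and day

# A crude contrast stretch makes the difference visible to humans too:
frame = np.array([[0, 0, 1],
                  [0, 1, 1]], dtype=np.uint8)  # toy 2x3 "night image"
stretched = ((frame - frame.min()) / max(int(frame.ptp()), 1) * 255).astype(np.uint8)
print(stretched)  # the 1s become 255: structure pops out of the black
```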
 
  • Funny
Reactions: Daniel in SD
...I think it was just a case of their algorithm not being confident enough. A few thousand more examples and the algorithm would have nailed this situation. It sucks that it’s not perfect yet, but it saves lives and is getting better quickly…

It sounds like the algorithm needs to get a few more collisions so it can learn its lesson.
 
  • Like
Reactions: DanCar
It sounds like the algorithm needs to get a few more collisions so it can learn its lesson.
Yeah, but hopefully most of these can be from collisions the Tesla was not involved in, sourced from the cloud. Plus some data augmentation from simulation.

That’s the thing with Tesla’s iterative approach to real-world AI. There will always be new edge cases causing accidents, which the bears can point at to show that Tesla FSD still sucks and the bulls can interpret as Tesla making progress. Pick your perspective. But when you judge the system, remember it’s the remaining unknown failure modes that matter, not the past known failures that are now solved.
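A hedged sketch of what "data augmentation from simulation" could look like (this is purely illustrative, not Tesla's pipeline; the function and parameters are invented): synthesize many training variants from one rare real-world frame, so the network effectively sees thousands of examples from a single collected clip.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(frame: np.ndarray, n: int = 8) -> list:
    """Generate n brightness/noise variants of one frame (toy augmentation)."""
    variants = []
    for _ in range(n):
        gain = rng.uniform(0.3, 1.2)            # vary exposure / headlight reach
        noise = rng.normal(0, 4, frame.shape)   # vary sensor noise
        v = np.clip(frame.astype(float) * gain + noise, 0, 255)
        variants.append(v.astype(np.uint8))
    return variants

frame = rng.integers(0, 30, size=(4, 6), dtype=np.uint8)  # stand-in dark frame
print(len(augment(frame)))  # 8 synthetic training variants from one real frame
```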
 
  • Like
Reactions: CyberGus
People engage in critical thinking when viewing videos on the internet. I actually doubt they're driving around at 75mph while only being able to see 20 feet in front of them.
You know I'm an automation complacency concern troll so I can't suggest that they improve stopped vehicle detection, that would just make people trust AP more!
Honestly I'm not sure what Tesla should do. Maybe better driver monitoring would help. The obvious solution is to get rid of AP as we know it and instead have it run in the background and only intervene to prevent crashes or lane departures. I like Autopilot though...

Just in case people missed it, here is a still from the narrow cam.
[attached image: still from the narrow cam]

I absolutely hate freeways that have sections like this where you're staring directly into the lights of oncoming vehicles.

It makes it hard to say how much time the driver had to react.
 
What concerns me about this accident is that it was NoA on HW2.5, and it happened fairly recently (June 2021).

So basically this accident happened on a feature-complete version of EAP that likely won't get many improvements in the future.

We don't know whether Pure Vision would have stopped in time, but we do know it has improvements, or pending improvements, that could well have at least reduced the severity of the accident.
 
  • Like
Reactions: DanCar
What concerns me about this accident is that it was NoA on HW2.5, and it happened fairly recently (June 2021).

So basically this accident happened on a feature-complete version of EAP that likely won't get many improvements in the future.

We don't know whether Pure Vision would have stopped in time, but we do know it has improvements, or pending improvements, that could well have at least reduced the severity of the accident.

As @heltok said, "algorithm". Most likely the system was not "blind".

GreenTheOnly tweeted: "the truck was seen by both the radar and also could be clearly observed by the narrow camera from quite a distance away (pictures are spaced 1 second apart)"

So far, with so many complaints, Vision is no better than the radar version, so it's difficult to speculate that it would have been "sublime" here.

If vision works, we should see a demo of how a Tesla would avoid crashing into a stationary obstacle at 75 MPH before turning off the radar and switching to radarless this year.
 
  • Like
Reactions: DanCar
One thing we know is that machine vision is a lot better than humans at seeing things in black images:

I think it was just a case of their algorithm not being confident enough. A few thousand more examples and the algorithm would have nailed this situation. It sucks that it’s not perfect yet, but it saves lives and is getting better quickly…

We humans struggle to see the border between #000000 and #010101, but for the computer this is night and day.

"It sucks that it’s not perfect yet"

That neatly sums up the problem with AI for critical systems.
 
  • Like
Reactions: DanCar
Could these help in situations like this?
• Thermal cameras, like FLIR
• Removing the IR filters from the cameras, or at least having an IR camera up front
• Radar - probably higher resolution, and the NN actually has to use it
• Much better cameras, like the Sony Starlight cameras - I use those for security purposes and they can see well with just moonlight. They see a lot better than I do.
• LIDAR - I don't see why it would not have worked. It still would not work in fog, heavy rain, or snow.

People like to say cameras work because people use vision. People's vision sucks at night and in heavy rain, fog, and snow, so I do not understand why Elon and others keep going back to that type of statement.

Is it a good system for driving on a bright sunny day? Sure! (except when the camera is blinded by sunlight)
 
Radar - probably higher resolution, and the NN actually has to use it

As the tweet quoted above states, the system was not blind. Its radar registered the obstacle just fine.

It’s then up to the "brain" to decide what to do with that data. In this case, it chose to ignore the data and did not act to avoid the obstacle.

I don't know why, but maybe it's to promote a smoother experience, since people are complaining about phantom braking with FSD beta.

So it's a terrible choice: alive with phantom braking, or dead with a smooth experience free of phantom braking.

Waymo has figured it out, so its riders don't face that kind of terrible choice: they got both safety and smoothness (but confined to Chandler, AZ only).
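For what it's worth, here is a toy sketch of the trade-off being described (not Tesla's actual planner; the function, names, and numbers are all invented): a single confidence threshold trades phantom braking (false positives) against missed stationary obstacles (false negatives).

```python
def should_brake(radar_hit: bool, vision_confidence: float,
                 threshold: float) -> bool:
    """Toy fusion rule: only act on a radar return if vision agrees enough."""
    return radar_hit and vision_confidence >= threshold

# Radar saw the truck, but vision was unsure (values are illustrative):
radar_hit, vision_confidence = True, 0.35

print(should_brake(radar_hit, vision_confidence, threshold=0.2))
# True: the car brakes, but more phantom braking overall
print(should_brake(radar_hit, vision_confidence, threshold=0.5))
# False: smooth ride, and it drives into the truck
```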
 
So what can we learn from the statement?

1) Machine vision (video) is not as good as a human's: what's the point here? Should we use Tesla Vision or not?

2) The car and the junk were not invisible, so once they were within the range of the headlights, shouldn't the machine have picked up those cues and performed an evasive maneuver?

3) Shouldn't AP/FSD slow down so that it can react safely within the headlights' range?
1) No, this was AP2.5, not even the new hardware.
2) On the new FSD beta with Tesla Vision, the system uses the full headlights; that was not the case here.
3) Yes, my car with the full headlights slows down, but it's an HW3 car.
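Rough numbers on point 3 (a back-of-envelope sketch; the reaction time, deceleration, and low-beam range below are generic textbook assumptions, not measurements): at 75 mph, the total stopping distance far exceeds what low beams typically illuminate, which is why slowing down to the headlights' range matters.

```python
MPH_TO_MS = 0.44704

def stopping_distance(speed_mph: float, reaction_s: float = 1.5,
                      decel_ms2: float = 7.0) -> float:
    """Reaction distance + braking distance (v^2 / 2a), in meters."""
    v = speed_mph * MPH_TO_MS
    return v * reaction_s + v**2 / (2 * decel_ms2)

low_beam_range_m = 60  # rough typical low-beam illumination distance

d = stopping_distance(75)
print(f"{d:.0f} m needed vs ~{low_beam_range_m} m visible")  # ~131 m vs ~60 m
```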
 
As the tweet quoted above states, the system was not blind. Its radar registered the obstacle just fine.

It’s then up to the "brain" to decide what to do with that data. In this case, it chose to ignore the data and did not act to avoid the obstacle.

I don't know why, but maybe it's to promote a smoother experience, since people are complaining about phantom braking with FSD beta.

So it's a terrible choice: alive with phantom braking, or dead with a smooth experience free of phantom braking.

Waymo has figured it out, so its riders don't face that kind of terrible choice: they got both safety and smoothness (but confined to Chandler, AZ only).
On safety: last week they had their second accident with a pedestrian at a pedestrian crossing.
 
  • Informative
Reactions: Daniel in SD