Welcome to Tesla Motors Club

FSD Beta Videos (and questions for FSD Beta drivers)

Is anyone aware of any actual FSD beta accidents? There have been many posts regarding "imminent crashes," but since Elon's #1 goal is to avoid hitting pedestrians or other vehicles, I wonder how well FSD beta is doing in that regard. Of course, since drivers are responsible for taking over, any stats would be somewhat meaningless, but it would still be interesting. You can be sure the media would have a field day should an accident be reported.
 
Radar wouldn't have helped for detecting parked vehicles in fog, right? I wonder if the white parked car is easier for vision to detect because the darker wheel wells are more defined. Although maybe it's more about the dynamic range of the multiple Autopilot cameras than what was captured for the video:

[Image: before white.jpg]

[Image: after white.jpg]
 
Same old news. It drifts toward the closed road but does finally mark it as undriveable as the driver disengages. All at only 8 mph.

It takes insufficient avoiding action for this impending collision. Poor path prediction of the other car.
[Attachment: 699395]
Why would you call this an "impending collision"? Did the other driver cut the corner too close? Absolutely. Did FSD move to the right? Yes. Could it have swerved hard to the right, as I suspect many humans would have? Yes. Would I have? Yes, I'd like to think so. But did it need to swerve, given the clearance between the two cars? Perhaps not. This is why we have to be careful, since words matter. "Possible collision" would have been more accurate.
 
Why would you call this an "impending collision"? Did the other driver cut the corner too close? Absolutely. Did FSD move to the right? Yes. Could it have swerved hard to the right, as I suspect many humans would have? Yes. But did it need to swerve, given the clearance between the two cars? Perhaps not. This is why we have to be careful, since words matter.
FSD did not move to the right at all; the driver did. He disengaged to move over. You can hear him say this. It would have been a collision but for the action of the Tesla safety driver.
 
FSD did not move to the right at all; the driver did. He disengaged to move over. You can hear him say this. It would have been a collision but for the action of the Tesla safety driver.


Since he disengaged, how do we know what Autopilot would have done? I'm sure the driver felt he needed to take action, but as we know, we make extremely slow decisions compared to a computer. Saying it would have been an accident if not for the driver's action is not accurate at all, in my opinion. "Could have been in an accident" is accurate. Now if he had been in an actual accident while still engaged with Autopilot, you would have an actual reference that Autopilot didn't work.
 
I'm not really an optimist as far as autonomous cars go, I'm far more of a pessimist about human driving abilities. But many people seem to have a blind spot for that ... post after post about what the car must be and how reliable it must be while more or less ignoring that humans would never be able to reach the levels they are arguing the cars must reach.
Strongly disagree. I think people here vastly underestimate human driving performance. Tesla's data shows 1 collision per 2 million miles (>12mph). Every FSD Beta video published could be literally flawless and it still wouldn't come close to proving that level of performance.
 
Looks like the path prediction comes in a couple frames later than lane line predictions. Here the destination lanes for the left turn disappear while the path still correctly predicts the left turn.
[Image: left no lanes.jpg]


Then the path prediction updated to go right probably because it couldn't find a high confidence lane for the left turn anymore.
[Image: right no lanes.jpg]


Maybe it got confused by the harder-to-see white lane lines on the lighter concrete section of the road?
 
Is anyone aware of any actual FSD beta accidents? There have been many posts regarding "imminent crashes," but since Elon's #1 goal is to avoid hitting pedestrians or other vehicles, I wonder how well FSD beta is doing in that regard. Of course, since drivers are responsible for taking over, any stats would be somewhat meaningless, but it would still be interesting. You can be sure the media would have a field day should an accident be reported.
One good recent example


@ 07:38: merging into a vehicle that's visible in the B-pillar camera until Frenchie saves the day.
 
Strongly disagree. I think people here vastly underestimate human driving performance. Tesla's data shows 1 collision per 2 million miles (>12mph). Every FSD Beta video published could be literally flawless and it still wouldn't come close to proving that level of performance.
Um .. if they were "literally flawless" then yes, by definition they would beat humans, since the human accident rate is not zero. As for underestimating, the human accident rate is well known, so what is it that is being "underestimated"?
 
Strongly disagree. I think people here vastly underestimate human driving performance. Tesla's data shows 1 collision per 2 million miles (>12mph). Every FSD Beta video published could be literally flawless and it still wouldn't come close to proving that level of performance.
You do realize that Tesla's statistic is not actually measuring human driving performance since it includes Tesla's autopilot/NoA assistance?
 
Not an actual accident though, which was the question.
To your initial question: yes, the video was more directed at this:

since Elon's #1 goal is to avoid hitting pedestrians or other vehicles, I wonder how well FSD beta is doing in that regard.
Unless you're asking about the success of FSD Beta inclusive of Tesla's screening process and picking testers who are nimble enough to compensate for the system's mistakes, which definitely seems like a success.

I've probably watched all of the footage posted by testers who have YouTube accounts and will say their screening process seems to be working, they picked good drivers who have stopped many near misses from becoming actual incidents.
 
You do realize that Tesla's statistic is not actually measuring human driving performance since it includes Tesla's autopilot/NoA assistance?
No it doesn't, but it does include active safety features. To me this seems like it sets an ever increasing standard for self-driving cars. Humans+machines have the potential to be far better than machines or humans alone. Also, I think that model of machine-human driving will probably have much higher performance than the supervised FSD model.

They do report the rate on old Teslas without active safety features, and it seems to fluctuate a lot more because of the small sample size (or maybe a seasonal effect?). It's about 1-2 million miles between collisions.

In the 1st quarter, we registered one accident for every 4.19 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.05 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 978 thousand miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.
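As a rough back-of-envelope comparison, the quoted Q1 figures can be normalized against the NHTSA baseline (a sketch using the numbers exactly as reported; note the thread's caveat that these populations differ, so the ratios don't directly measure driver skill):

```python
# Miles between accidents, as reported in the quoted Q1 safety report.
MILES = {
    "Autopilot engaged":            4_190_000,
    "Active safety features only":  2_050_000,
    "No Autopilot, no safety":        978_000,
    "NHTSA US average":               484_000,
}

baseline = MILES["NHTSA US average"]
for label, miles in MILES.items():
    # A ratio > 1 means more miles between accidents than the US average.
    print(f"{label:28s} {miles / baseline:.1f}x the US average")
```

This works out to roughly 8.7x, 4.2x, and 2.0x the NHTSA baseline for the three Tesla populations, respectively.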
 
Um .. if they were "literally flawless" then yes, by definition they would beat humans, since the human accident rate is not zero. As for underestimating, the human accident rate is well known, so what is it that is being "underestimated"?
Obviously if the system itself were "literally flawless" it would be better than humans. What I said is that even if every FSD video in existence were literally flawless it wouldn't prove human level of safety. The sample size is way too small!
You were complaining about people holding FSD Beta to too high a standard, which makes no sense if the goal is greater-than-human performance. We should not see any safety-related flaws in such a small sample size. I thought you were underestimating human performance; now I'm not sure what you were saying.
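To put rough numbers on the sample-size point (the total video mileage below is an assumption for illustration, not a measured figure): if collisions are modeled as a Poisson process, even a body of entirely flawless footage is consistent with a system far worse than the claimed human rate of 1 collision per 2 million miles.

```python
import math

def prob_zero_collisions(miles, miles_per_collision):
    """P(no collisions in `miles`), modeling collisions as a Poisson
    process with the given mean miles between collisions."""
    expected = miles / miles_per_collision
    return math.exp(-expected)

# Assumption: all published FSD Beta footage totals ~50,000 miles.
video_miles = 50_000

# A system at the human baseline (1 per 2M miles) and one 10x WORSE
# (1 per 200k miles) both look "flawless" in that sample most of the time.
p_human = prob_zero_collisions(video_miles, 2_000_000)
p_worse = prob_zero_collisions(video_miles, 200_000)
print(f"P(flawless | human-level): {p_human:.2f}")  # ~0.98
print(f"P(flawless | 10x worse):   {p_worse:.2f}")  # ~0.78
```

So flawless videos barely discriminate between human-level and much worse systems, which is the sample-size objection in quantitative form.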
 
Obviously if the system itself were "literally flawless" it would be better than humans. What I said is that even if every FSD video in existence were literally flawless it wouldn't prove human level of safety. The sample size is way too small!
You were complaining about people holding FSD Beta to too high a standard which makes no sense if the goal is greater than human performance. We should not see any safety related flaws in such a small sample size. I thought you were underestimating human performance, now I'm not sure what you were saying.
I didn't say anything about the sample size. And in fact I have already noted that the statistics from the FSD beta would mean nothing, since the drivers are cherry-picked and are required to maintain a very high vigilance level, way beyond the general population.

And I also said nothing about the FSD standard. I have never argued about not having a high bar for autonomous vehicles. My point has always been that those who argue that FSD will never appear do so using an artificially high standard of acceptance. But acceptance does not mean we should not strive to be better than that in the longer term. Of course we should.
 
I've probably watched all of the footage posted by testers who have YouTube accounts and will say their screening process seems to be working; they picked good drivers who have stopped many near misses from becoming actual incidents.
While I agree these are cherry-picked testers (as they should be), you need to be careful about this claim. You don't KNOW that the near misses would have been actual accidents without the intervention .. that is just speculation. The testers are told to be vigilant, and it's quite possible that in many of these cases the car would have taken appropriate steps, but the tester disengaged before that. (Presumably Tesla can look at the predictions in detail and determine this.) Of course, there are many cases where it's clear the car was way out of line, but it's incorrect to assume that is always the case.
 
Elon says V9.2 is "not great" but adds that the AP team is rallying to improve it as fast as possible. He also adds that they are working on combining highway and city streets, but it requires "massive retraining":


I am a little surprised that Elon would admit that FSD Beta is not great. Usually, he is hyping it up. He seems to be trying to downplay the hype in this tweet.

And I think "massive retraining" implies that Tesla has a lot of work left to do before we see a complete "highway+city" FSD. Elon seems to be trying to lower expectations.