> You don't need 3rd party software for that!

It should be "my Tesla almost hit a train". The whole video is worth watching. 16:15 is terrifying.
> The truth is that driverless vehicles need to be near 100% flawless. Human drivers achieve 1 death per 100 million miles of driving in the US.

Absolutely. I've seen that statistic before: 1.06 deaths per 100 million miles. I read an article that did some impressive statistical math using a metric I'd never heard of, "deaths per million vehicle-years," which it then converted, assuming the average driver covers 12,000 miles per year, to arrive at 1 death per 428 million miles for human drivers.
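For what it's worth, that unit conversion is easy to sanity-check. This is only a sketch of the article's arithmetic; the 12,000 miles/year figure is the article's assumption, not an official number.

```python
# Sketch of the "deaths per million vehicle-years" conversion discussed above.
# The 12,000 miles/year average is an assumption from the cited article.

MILES_PER_VEHICLE_YEAR = 12_000

def deaths_per_100m_miles(deaths_per_million_vehicle_years):
    """Convert a rate per million vehicle-years into deaths per 100M miles."""
    miles_per_million_vehicle_years = MILES_PER_VEHICLE_YEAR * 1_000_000
    return deaths_per_million_vehicle_years / miles_per_million_vehicle_years * 1e8

# Working backwards, "1 death per 428 million miles" implies roughly
# 28 deaths per million vehicle-years:
implied_rate = (MILES_PER_VEHICLE_YEAR * 1_000_000) / 428e6
print(round(implied_rate))                                      # 28
print(round(1e8 / deaths_per_100m_miles(implied_rate) / 1e6))   # 428 million miles per death
```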
Haha, might want to check your math; if that were true, Autopilot would have been banned long ago. Tesla's numbers are the rate of airbag or active-restraint deployment (seatbelt pretensioners), which they say correlates to collisions over 12 mph.

If I do the same math for 2020: there were 6 deaths claimed on AP, and 2 deaths verified on AP. Using Tesla's safety reports for 2020: Q1 = 4.68 million miles on AP, Q2 = 4.53 million miles, Q3 = 4.59 million miles, Q4 = 3.45 million miles. That gives us a total of 17.25 million AP miles in 2020. The 2 verified deaths in 2020 on AP would then be 11.6 deaths per 100 million AP miles in 2020.
In 2019, there were 9 deaths claimed on AP and 7 deaths verified, over 13.55 million miles driven on AP. That gives us 51.7 deaths per 100 million AP miles in 2019.
Since Tesla doesn't give us the total number of AP miles driven in a given year, I had to extrapolate. The pattern would seem to hold, though: fewer deaths per AP mile each year. And you're right, we're not near 1.06 per 100 million miles, but well on our way.
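The back-of-envelope math above can be reproduced in a few lines. The mileage figures are the ones quoted in this thread, so treat the resulting rates as rough estimates, not official numbers.

```python
# Reproducing the rates above from the quarterly AP-mileage figures
# quoted in this thread (rough estimates, not official Tesla rates).

ap_miles_2020 = [4.68e6, 4.53e6, 4.59e6, 3.45e6]  # Q1-Q4 2020
total_2020 = sum(ap_miles_2020)                    # 17.25 million AP miles

rate_2020 = 2 / total_2020 * 1e8   # 2 verified deaths in 2020
rate_2019 = 7 / 13.55e6 * 1e8      # 7 verified deaths in 2019

print(round(rate_2020, 1))  # 11.6 deaths per 100M AP miles
print(round(rate_2019, 1))  # 51.7 deaths per 100M AP miles
```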
My point is that anyone saying that FSD is close to human performance based on personal experience is wrong because it's literally impossible for an individual to determine that within their lifetime.
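That point is easy to quantify: at the average annual mileage assumed earlier in the thread, one expected fatality corresponds to millennia of personal driving.

```python
# Rough illustration of why no individual can observe a 1-per-100M-mile
# event rate firsthand (12,000 miles/year is the assumption used above).

miles_per_death = 100_000_000   # ~1 fatality per 100M miles for human drivers
miles_per_year = 12_000         # assumed average annual mileage

years = miles_per_death / miles_per_year
print(round(years))  # 8333 years of driving per expected fatality
```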
> Since Tesla doesn't give us the total number of AP miles driven in a given year, I had to extrapolate...

Not sure exactly what you're saying. I'm sure the death rate for Tesla vehicles is better than 1.06 per 100 million miles, with or without AP on, because of demographics and because they're much safer than the average car on the road (the average car on the road is much older and doesn't have AEB, lane departure avoidance, etc.).
> I had to extrapolate.

No, this is not extrapolation. As you say, you need to know the number of AP miles, or the number of accidents.
> Not sure exactly what you're saying. I'm sure the death rate for Tesla vehicles is better than 1.06 per 100 million miles...

You're probably right on AP miles - I found a site that did the calcs and estimates, but it was a few years ago, before the Model Y came out. He estimated over 5 billion AP miles around 2021. Given those numbers, yes, Tesla cars on AP are safer than 1.06 deaths per 100 million miles.
If you're talking about FSD without a safety driver then it seems like they've got a long way to go to get to 1 collision (>12mph) per 2 million miles (that's what Tesla claims human drivers of Teslas achieve). My guesstimate would be 1 per 2000 miles if I'm being an optimist, so 1/1000th of the way there!
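The "1/1000th of the way there" figure above, made explicit (both inputs are the poster's numbers: Tesla's claimed human rate and an optimistic guesstimate for driverless FSD):

```python
# Ratio behind the "1/1000th of the way there" remark above.

human_miles_per_collision = 2_000_000  # Tesla's claimed human rate (>12 mph collisions)
fsd_guess_miles = 2_000                # the optimistic guesstimate above

print(fsd_guess_miles / human_miles_per_collision)  # 0.001
```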
> You're probably right on AP miles - I found a site that did the calcs and estimates...

Well, VRUs (pedestrians, cyclists, etc.) make up more than a quarter of traffic deaths, and the collision rate is higher in cities, so I'm not so sure it's reasonable to assume that.
Since there is no L4 or L5 Tesla yet, we don't have any numbers to estimate FSD without a driver. All we have are L2 numbers which have a human driver supervising. I can't find any deaths while on FSD Beta, but I can't guarantee that number. Since FSD Beta is limited to city streets where speeds are much lower than freeways/highways, it's reasonable to assume there wouldn't be that many deaths over 100 million miles of use.
The same can be said for Waymo and Cruise, we're likely to see very few deaths per 100 million miles due, in part, to the slower speeds in a collision, and the fact that current L4 and L5 vehicles on public roads are limited to city streets without a driver, and don't operate on freeways/highways.
> Well, VRUs (pedestrians, cyclists, etc.) make up more than a quarter of deaths and the collision rate is higher in cities...

Has your Tesla on FSD Beta ever veered into oncoming traffic? Mine never has. The worst it's done is attempt to move on a red left-turn arrow on this recent version. Or are we taking a few data points and extrapolating them to the entire fleet?
Obviously you can estimate how FSD would perform without a safety driver (what do you think Waymo and Cruise did? They didn't just remove the safety driver to find out!). An FSD Beta user can only get a very rough estimate since you don't have the simulation tools that Tesla has to run the counterfactual of what would have happened had you not disengaged. And of course that analysis is highly dependent on predictions of what other human drivers will do (especially with FSD Beta and its propensity to steer into oncoming traffic. How often will the other drivers successfully avoid you?). Obviously it's nowhere close to human averages but it's fun as a thought experiment to estimate. I think it's a severe collision every 200-2000 miles.
> Has your Tesla on FSD Beta ever veered into oncoming traffic? Mine never has.

If there were oncoming cars, that is veering into oncoming traffic!
> Driverless rides in Cruise:
> Without Driver, But With Coyotes: First Ride in a Cruise Robotaxi Through San Francisco
> "Tuesday night was the night. I rode in one of Cruise's fleet of driverless robotaxis in San Francisco for the first time. Thanks to an Austrian friend who already had access to the Cruise app…" (thelastdriverlicenseholder.com)

At the 5:50 mark in the first video, it appears the Cruise continues to slow down after the light turns green, as if there were a delay processing the light change. I wonder if that was for comfort, to reduce g-forces for the passenger.
> At the 5:50 mark in the first video, it appears the Cruise continues to slow down after the light turns green...

I bet it's programmed not to go too quickly through lights that just turned green, in order to avoid red-light runners.
According to the report, filed by Cruise Vice President of Global Markets Todd Brugger, a Toyota Prius entered an intersection after traveling straight via a lane designated for turning. The Cruise vehicle was attempting to make a left-hand turn across several lanes of traffic and had stopped to allow the car to turn.
The Prius was traveling about 40 mph in a 20 mph speed zone when it struck the Cruise vehicle, according to the filing. The Cruise vehicle was in “autonomous mode” at the time of the crash. It’s unclear if a safety driver, employee or other passenger was in the car.
> NHTSA will investigate the crash last month involving the Cruise AV and a Prius:
> U.S. safety regulators to probe crash involving self-driving car from GM-backed Cruise
> "Federal vehicle safety regulators will investigate a crash last month in which a vehicle struck a self-driving car from General Motors-backed Cruise." (www.cnbc.com)

The DMV report says "occupants of both vehicles received medical treatment for allegedly minor injuries". It shows one driver and one passenger were injured; presumably the passenger was in the Cruise car.
> NHTSA will investigate the crash last month involving the Cruise AV and a Prius...

I posted about this here: California Autonomous Vehicle Collision Reports
I think the Prius was legally at fault because it was in the wrong lane and speeding. It's not clear to me why the Cruise attempted an unprotected left (UPL) in front of a speeding car, though, or why it stopped so late in the turn. It seems like it would have been better at that point to proceed out of the intersection. Hopefully NHTSA will release a report.
What doesn't make sense to me is that it sounds like the Prius would have had the right of way if it were turning right, but the Cruise vehicle turned in front of it. Perhaps it was predicting that the Prius would slow down to make a right turn? Also, once it detected the possibility of a collision, the Cruise vehicle stopped? It sounds like without that inhuman behavior there wouldn't have been a collision.
Looking forward to seeing the report (in a year or two probably, haha).