rxlawdude
> Waymo can do a 10 mile loop 10,000+ times without a collision.
> I'm also not completely convinced that they're greater than human level.

They've done 10,000 loops?
> They've done 10,000 loops?

I wouldn't be surprised, but I was talking statistically.
> Trying to run red light at 5:30
> Wild swerves at 21:30
> Confirmation required for left turn at 22:30

Nothing has more degrees of freedom than reality. You could see on the display it was having trouble with this light; it was alternating between green and red.
> FSDBeta 9.1 - 2021.4.18.13 - First Impressions Memorial Park Drive for Comparison to Previous v9
> Check it out!

Thanks for your videos! I've been enjoying watching them for a few weeks now and appreciate your commentary and the time/effort you've put into them. The repeated tests you've done give a really good benchmark for tracking progress.
Tesla told the DMV their goal is 1-2 million miles per driver interaction so it only has to get 50,000x better.
That's an average of one disengagement per 40 minutes of driving. Personally, I don't think that's a good disengagement rate for FSD, although the figure is missing a lot of context: we don't know the type of disengagement, the type of driving, the number of miles, etc.
If only Tesla would release the disengagement data, then we could maybe quantify the progress better.
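For scale, a back-of-envelope sketch of the gap (the 30 mph average speed is my assumption; the 40-minute figure and 1-2M-mile target come from the posts above):

```python
# Rough math only: none of these figures are official Tesla data.
# ASSUMPTION: ~30 mph average speed for the observed driving.

def miles_per_disengagement(minutes_per_disengagement: float, avg_mph: float) -> float:
    """Convert a time-based disengagement rate into a mileage-based one."""
    return minutes_per_disengagement * avg_mph / 60.0

def improvement_needed(current_miles: float, target_miles: float) -> float:
    """Factor by which miles-per-disengagement must grow to hit the target."""
    return target_miles / current_miles

current = miles_per_disengagement(40, 30)        # 20.0 miles per disengagement
factor = improvement_needed(current, 1_000_000)  # 50000.0 at the 1M-mile end
```

At the 2M-mile end of the target the factor doubles to 100,000x; the point is just that the gap is orders of magnitude, not a precise measurement.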
> Nothing has more degrees of freedom than reality. You could see on the display it was having trouble with this light and it was alternating between green and red.

As a fellow human I can figure it out pretty easily, but I've never seen a light like that outside of these videos. Are lights even standardized across the country? There has to be a finite list somewhere of every traffic light combination, right?
> As a fellow human I can figure it out pretty easily, but I've never seen a light like that outside of these videos. Are lights even standardized across the country? There has to be a finite list somewhere of every traffic light combination, right?

It is standardized. There are only two lights there: a green arrow and a solid red. FSD wants to go straight, so it should obey the solid red. If it gets confused by multiple lights, it should default to stopping or require driver intervention.
So far it's one step forward, no steps backward.
> but it’s starting to seem like 9.1 will perform as well or better on Waymo-equivalent routes in Chandler

Yes, those were some amazing left turns.
> FSDBeta 9.1 - 2021.4.18.13 - Unprotected Left Turns with Drone View - Looping behavior

I'd be interested in seeing it try that turn without any traffic at all; it seemed to have slightly different behavior when it didn't ever enter the "saw cross traffic, waiting for it to pass" state. Also at night, assuming the intersection is well lit and the artificial light provides better visibility and contrast than daylight does.
> I wonder if in the end, this scenario is solved by better feedback from the path planning back into the navigation. It could have aborted the left, done a right, then taken a U-turn or three lefts and a right (which is something I do myself sometimes on similar roads). It looks like, as of now, FSD doesn't have the capability of blocklisting maneuvers from the navigation that it discovers it can't complete (at least for the duration of the subsequent re-route). I think that would be a powerful capability.

It's tried that before but got into an endless loop of right-hand turns.
> It's tried that before but got into an endless loop of right hand turns

Right, but if the FSD system aborts the left turn and forces the navigation to re-route, and if it could remove that left turn from consideration during the re-route, presumably the car wouldn't get stuck, since it would be trying a different path every time.
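The "remove that left turn from consideration" idea can be sketched as a graph search that re-plans with a set of banned maneuvers. Everything here (the toy road network, the node names, the BFS planner) is hypothetical and only illustrates the data flow, not anything FSD actually does:

```python
# Toy sketch of blocklisting a failed maneuver on re-route.
from collections import deque

# Hypothetical directed road graph: node -> reachable neighbors.
GRAPH = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["E"],
    "E": ["D"],
    "D": [],
}

def route(start, goal, banned_edges=frozenset()):
    """BFS shortest path that skips edges the car has discovered it can't take."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in GRAPH[node]:
            if (node, nxt) in banned_edges or nxt in seen:
                continue
            seen.add(nxt)
            queue.append(path + [nxt])
    return None  # no route avoids the banned maneuvers

# First plan uses the maneuver at A->B (say, an unprotected left). After
# aborting it, the car re-routes with that edge blocklisted and finds a detour.
plan1 = route("A", "D")                             # ['A', 'B', 'D']
plan2 = route("A", "D", banned_edges={("A", "B")})  # ['A', 'C', 'E', 'D']
```

The key design point is that the banned set only needs to live for the duration of the re-route, matching the "blocklist for the subsequent re-route" suggestion above.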
> Tesla told the DMV their goal is 1-2 million miles per driver interaction so it only has to get 50,000x better.

Obviously 1 million miles per actual disengagement is impossible, and I don't think 1 million miles per necessary disengagement is possible either; it's also not necessary to achieve human-level safety.
> The average accident rate for humans is approx. 1 accident per 500,000 miles. ...

Closer to 40K miles per accident.
> FSDBeta 9.1 - 2021.4.18.13 - First Impressions Memorial Park Drive for Comparison to Previous v9
> Check it out!

Thanks for the video Chuck. It looks like the unprotected left-hand turn didn't get any changes, or at least none that show noticeable improvements. Hopefully 9.2 brings these fixes. The drone is great for watching that left-hand traffic coming.
> Closer to 40K miles per accident.

That number must count curb rash as a collision...
> The average accident rate for humans is approx. 1 accident per 500,000 miles. If you multiply that by 100-200% as Elon mentioned in the recent Earnings Call, you get 1 per 1-2M miles, the same metric that Tesla gave to the CA DMV. So if interventions only refer to interventions that prevented an accident, then it fits perfectly with Tesla's goal of being 100-200% safer than the average human. So I think interventions don't refer to all interventions but only to those that prevented an accident. I assume Tesla collects all disengagements from the fleet. Presumably, they are only counting the disengagements that prevented an accident. So when that safety disengagement rate hits 1-2M miles, Tesla will consider FSD Beta to be safer than the average human.

I think 500k is the police-reported collision rate, so it would leave out a lot of minor collisions.
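Checking the arithmetic in that quote (these are the thread's figures, not official statistics; note that whether "200% safer" means 2x or 3x the miles per accident is ambiguous):

```python
# Figures from this thread: ~1 police-reported accident per 500k human miles,
# and a reported Tesla/DMV target of 1 interaction per 1-2M miles.
HUMAN_MILES_PER_ACCIDENT = 500_000

def safety_multiple(target_miles_per_event: int) -> float:
    """How many times the average human's miles-per-accident a target represents."""
    return target_miles_per_event / HUMAN_MILES_PER_ACCIDENT

low = safety_multiple(1_000_000)   # 2.0 -> twice the human miles per event
high = safety_multiple(2_000_000)  # 4.0 -> four times
```

So the 1-2M-mile target is 2-4x the 500k human baseline, which lines up with the "100-200% safer" framing if "100% safer" is read as doubling the miles per accident.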
> Right, but if the FSD system aborts the left turn and forces the navigation to re-route, if it could remove that left turn from consideration during the re-route, presumably the car wouldn't get stuck, since it would be trying a different path every time

I thought of that too in previous videos but did not mention it. I think a major problem (which has been brought up before) is that Tesla's nav does not allow waypoints; this is a common complaint that has been raised endlessly. If the nav allowed waypoints, all FSD has to do is set up a waypoint that avoids that left turn, and it should be fine.
> That number must count curb rash as a collision...

In Europe, if the active hood on the Model S deploys (in the case of a pedestrian or cyclist collision), that would very likely trigger the crash module as well. This doesn't apply to other regions that don't have that feature, but I wonder if the sensor that deploys the hood is still active even when the car doesn't have the mechanism (such that it can still detect a pedestrian/cyclist collision).
> I think 500k is the police reported collision rate so it would leave out a lot of minor collisions.
Of course not every intervention to prevent a potential collision actually would have prevented a collision. Most of the time the other driver will take evasive action.
I do like Tesla's current 1-per-2-million-mile human-driven number because it has a clean definition: active restraint deployed (which they claim correlates to any crash over 12 mph). Unfortunately it does leave out collisions with pedestrians and cyclists...
> That number must count curb rash as a collision...

40K miles per accident is a number Google/Waymo reported many years ago.