
Autonomous Car Progress

Tesla can report stats for FSD-with-nag. The nag means they are not required to report, but that is likely the data set they will be collecting in 2020 to internally validate, and then externally justify, the safety of attention-free FSD software.

Agreed. And Tesla's fleet size will be an advantage. Unlike other companies, which report disengagement data from a small fleet of test cars and must drive them for a while to get enough miles, Tesla can report disengagement data from hundreds of thousands of cars using FSD-with-nag and get enough miles much faster to validate FSD.
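To put rough numbers on why fleet size matters, here is a minimal back-of-envelope sketch in Python. Every figure in it (the human crash rate, fleet sizes, miles per day) is an assumption for illustration, and the zero-crash bound is just the statistical rule of three, not anyone's actual validation methodology:

```python
import math

# Assumed human baseline: one police-reported crash per 500k miles (illustrative).
HUMAN_CRASHES_PER_MILE = 1 / 500_000
CONFIDENCE = 0.95

# If the system were no better than the baseline, the chance of seeing zero
# crashes in M miles is exp(-rate * M) (Poisson). Observing zero crashes over
# this many miles rejects the baseline rate at 95% confidence:
miles_needed = -math.log(1 - CONFIDENCE) / HUMAN_CRASHES_PER_MILE
print(f"~{miles_needed:,.0f} zero-crash miles needed")  # ~1.5 million

# Time for each fleet to log that many miles, at an assumed 40 miles/car/day:
for cars, label in [(100, "test fleet"), (500_000, "customer fleet")]:
    print(f"{label}: ~{miles_needed / (cars * 40):,.1f} days")
```

Even with these toy numbers, the 100-car test fleet needs about a year of event-free driving, while the customer fleet logs the same miles in a couple of hours.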
 
More miles, more locations, more everything (including the ability to iterate through edge cases and update all cars). Tesla: we have more cars on the 405 than others have in their entire fleets.

They can even geofence the level based on fleet results and adjust as things improve. This region is a 3, this region is a 4, Hawthorne, CA to McGregor, Texas to Cape Canaveral, FL is a 5 with Semi endorsement :).
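As a toy illustration of that idea (the region names, levels, and promotion rule below are all invented, not anything Tesla has described), a per-region gate could be as simple as a lookup table updated from fleet statistics:

```python
# Hypothetical per-region capability gate; everything here is invented
# for illustration, not any real Tesla scheme.
region_level = {
    "CA-Hawthorne": 5,
    "TX-McGregor": 5,
    "FL-CapeCanaveral": 5,
    "CA-SanFrancisco": 3,
}

def allowed_level(region: str) -> int:
    # Default to plain driver assist (Level 2) anywhere the fleet data
    # hasn't justified a higher rating yet.
    return region_level.get(region, 2)

def maybe_promote(region: str, miles: float, critical_events: int) -> None:
    # Toy rule: bump a region one level after a million event-free miles.
    if miles >= 1_000_000 and critical_events == 0:
        region_level[region] = min(5, allowed_level(region) + 1)
```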
 
Uber continues self-driving vehicle testing in SF in defiance of DMV – TechCrunch
Musk asks for Tesla employees to test out new full self-driving mode

If owners needed to have a permit, no one would have been able to operate Autopilot, at least not for certain functions like NoA.



That will vary from region to region. Certain states/countries may want to accept the technology sooner to be ahead of the game, while others may take a more conservative approach and wait to see how it turns out elsewhere. Regardless, "disengagement" will not even be the metric when true autonomous vehicles are released.
You’re right that it is difficult to determine the difference between “driver assist” and testing of full self-driving. Other autonomous vehicle companies go to extreme lengths to make sure their cars are tested safely (Cruise actually has two testers per vehicle). They do all this, and they still have accidents, so it seems unlikely that Tesla will find a way to satisfy the DMV while having their system tested by customers.
Disengagement and accident rates are the only metrics to test autonomous vehicles, in my opinion. I don’t believe standardized testing is possible; if it were, there would be no need to test on public roads.
 
Regarding reporting disengagements to the CA DMV, the bottom line for me is that Tesla will need to report something before they do robotaxis. Releasing "FSD" features to cars with drivers, if the drivers need to supervise, is one thing. But there is no way that Tesla can skip reporting altogether and just deploy robotaxis directly.
They can and probably will (if and when they actually get to robotaxi) outside CA. Then use those results to deploy in CA ;)
 
You’re right that it is difficult to determine the difference between “driver assist” and testing of full self-driving. Other autonomous vehicle companies go to extreme lengths to make sure their cars are tested safely (Cruise actually has two testers per vehicle). They do all this, and they still have accidents, so it seems unlikely that Tesla will find a way to satisfy the DMV while having their system tested by customers.

Is it though? If the car is handling steering and speed, isn't the car doing the driving? Driver assist is a term applied when the system is not robust or proven enough to be fully self-responsible. The only issue with average drivers as the supervisors is instances when the drivers cause an unneeded disengagement; however, that can happen with safety drivers also.


Disengagement and accident rates are the only metrics to test autonomous vehicles, in my opinion. I don’t believe standardized testing is possible; if it were, there would be no need to test on public roads.

Those are the easily available metrics but, on their own, do not fully cover or describe the range of events. For instance, a non-disengagement that only avoided becoming an accident because of the other car's actions, or violations of normal driving practice that happen to have no consequences.

Accident rates need granularity to see how the system is performing. Is a higher rate of accidents with a lower average severity better or worse?
 
Accident rate is the only metric that should be used to evaluate autonomous car capability. Comparison of that for a fully autonomous car to human-driven cars is pretty simple and straightforward.

Accident rates need granularity to see how the system is performing. Is a higher rate of accidents with a lower average severity better or worse?

Both accident rate and fatality rate should be looked at. Both will need to at least do better than human-driven cars. Elon seems to indicate 2X better would be needed for the initial acceptance. That sounds fair to me.
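A toy calculation (all numbers invented) shows why the raw accident rate alone can't answer the severity question above; weighting each crash class by an assumed severity cost can flip the comparison:

```python
severity_cost = {"minor": 1, "injury": 20, "fatal": 1000}  # assumed weights

# Crashes per million miles by class (invented figures).
human  = {"minor": 4.0, "injury": 0.8, "fatal": 0.012}
system = {"minor": 6.0, "injury": 0.3, "fatal": 0.004}

def weighted_harm(rates):
    return sum(severity_cost[k] * v for k, v in rates.items())

for name, rates in [("human", human), ("system", system)]:
    print(f"{name}: {sum(rates.values()):.2f} crashes/M mi, "
          f"harm score {weighted_harm(rates):.1f}")
# human:  4.81 crashes/M mi, harm score 32.0
# system: 6.30 crashes/M mi, harm score 16.0
# The system crashes more often, yet carries half the severity-weighted harm
# and is 3x better on fatalities, clearing the 2X bar where it matters most.
```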
 
Driver assist is a term applied when the system is not robust or proven enough to be fully self-responsible.
By that definition Waymo, Cruise, et al. are all "driver assist" systems since they are not proven enough to be fully self-responsible.
The only issue with average drivers as the supervisors is instances when the drivers cause an unneeded disengagement; however, that can happen with safety drivers also.
The issue is when the safety driver does not cause a needed disengagement. It is a real problem that all autonomous vehicle companies are struggling with.
How GM trains human drivers to monitor its autonomous cars
Tesla has a self-driving strategy other companies abandoned years ago
Google ditched autopilot driving feature after test user napped behind wheel - Reuters
Those are the easily available metrics but, on their own, do not fully cover or describe the range of events. For instance, a non-disengagement that only avoided becoming an accident because of the other car's actions, or violations of normal driving practice that happen to have no consequences.

Accident rates need granularity to see how the system is performing. Is a higher rate of accidents with a lower average severity better or worse?
It makes sense to include all those metrics when determining how safe the system is. If you have safety drivers then you're going to have disengagements, and you have to figure out a way to factor those into the calculations.
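One hedged way to factor those in (a sketch of the general idea, not an industry-standard formula): treat each safety-critical disengagement as a crash that would have occurred with some assumed probability p, and see how sensitive the estimated rate is to that assumption:

```python
# Counterfactual crash-rate estimate; all counts and probabilities invented.
crashes, critical_disengagements, miles = 2, 40, 1_000_000

for p in (0.1, 0.5, 1.0):  # assumed chance a critical takeover hid a crash
    est = (crashes + p * critical_disengagements) / miles
    print(f"p={p}: ~{est * 1e6:.0f} crashes per million miles")
# p=0.1 -> ~6, p=0.5 -> ~22, p=1.0 -> ~42: the estimate is dominated by an
# assumption nobody can measure directly, which is exactly the hard part.
```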
Accident rate is the only metric that should be used to evaluate autonomous car capability. Comparison of that for a fully autonomous car to human-driven cars is pretty simple and straightforward.
I agree for deployed autonomous vehicles. The tricky part is determining when the system is ready to be deployed. If I'm a passenger in a car I can yell at the driver "look out!" but the only option a safety driver has is to take over control of the vehicle. I could see situations where it would be very difficult to know whether or not intervention is required.
 
By that definition Waymo, Cruise, et al. are all "driver assist" systems since they are not proven enough to be fully self-responsible.
Yeah, bad explaining on my part. For the Tesla system with a butt in the driver's seat, the system is fairly autonomous but doesn't have enough nines of reliability yet, so they call it driver assist.

The issue is when the safety driver does not cause a needed disengagement. It is a real problem that all autonomous vehicle companies are struggling with.

For statistics of safety/reliability, unneeded disengagements screw up the stats and crashes provide valid data points. On the real-world side, crashes are bad and extra disengagements are non-factors (as long as the driver doesn't make things worse).

In the realm of monitoring, I think Tesla is better off than others since the driver is always responsible and should override the system if they are uncomfortable. While some disengagements may be superfluous, they do provide feedback on what passengers don't like.

Other companies going for Level 4/5 taxis have safety drivers caught in a game of chicken with the car: they need to minimize disengagements (giving the car every chance to handle the situation itself) while also preventing crashes.
 
I agree for deployed autonomous vehicles. The tricky part is determining when the system is ready to be deployed. If I'm a passenger in a car I can yell at the driver "look out!" but the only option a safety driver has is to take over control of the vehicle. I could see situations where it would be very difficult to know whether or not intervention is required.

We'll just have to rely on statistics. One can always argue statistics are not perfect, but they still provide the best info when done properly. If FSD-enabled cars produce better safety statistics than non-FSD cars, then you have no reason not to allow it, regardless of how and when it's used.
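For what that comparison could look like in practice, here is a minimal sketch (all counts invented) using the standard normal approximation for a Poisson rate ratio. Note it does nothing about confounders like road type or driver self-selection, which is the usual objection to such comparisons:

```python
import math

# Invented counts: crashes and miles for FSD-enabled vs non-FSD driving.
fsd_crashes, fsd_miles = 120, 300e6
base_crashes, base_miles = 900, 900e6

rate_fsd = fsd_crashes / fsd_miles
rate_base = base_crashes / base_miles

# z-score of the log rate ratio; SE for two Poisson counts is sqrt(1/a + 1/b).
z = math.log(rate_fsd / rate_base) / math.sqrt(1 / fsd_crashes + 1 / base_crashes)

print(f"FSD {rate_fsd * 1e6:.2f} vs non-FSD {rate_base * 1e6:.2f} crashes/M mi")
print(f"z = {z:.2f}")  # about -9.4, far below -1.64, so lower at the 5% level
```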
 
Driver assist is a term applied when the system is not robust or proven enough to be fully self-responsible.

By that definition Waymo, Cruise, et al. are all "driver assist" systems since they are not proven enough to be fully self-responsible.

Yes, that is not the correct definition where SAE Levels are concerned.

Autonomous driving is a design definition. Autonomous car prototypes are autonomous cars, just prototypes. Teslas, at least in any released form, are not autonomous by design: the features released so far are not autonomous prototype features; they are designed (and designated) as ADAS features.
 
So is this an autonomous car prototype or ADAS features? :p
 
I would say as long as there is a driver sitting in the front seat, it is still in between an autonomous car and ADAS. You could say this is an autonomous car in development, of course.

In regulatory reality at least this is not a question of opinion, but of manufacturer definition and legal reporting.

The California DMV report of 2019 will eventually answer whether or not it was an autonomous car prototype.

I expect it was, given that the infamous 2016 video car was also defined as such.
 
In regulatory reality at least this is not a question of opinion, but of manufacturer definition and legal reporting.
Uber defined their system as driver assistance and not subject to legal reporting. The DMV stated that they decide what's autonomous and forced them to comply. I'm really looking forward to seeing how this all plays out. Lots of predictions on this forum including many from myself!
 

True. A company declaration might be challenged or overridden by regulators of course.
 
The restriction that Waymo cannot charge money for the rides seems a bit odd. After all, Waymo is a private company, their self-driving cars are quite good and the cars are required to have safety drivers. What would be the harm in charging money?

A guess: then you are a commercial operation, which throws a bunch of other regulations on you.
Similar to airplanes: fly your friends for free: OK.
Fly your friends and get paid: that's a no-no.
 

The regulators may also fear a backlash or (political) liability if they allow money-making too soon, as that might encourage the wrong kind of behavior.

Interesting to note the state-to-state difference on this one, though, since AZ does allow payments on Waymo One. Does anyone know whether California also has a system for paid autonomous services, or whether it is not yet possible at all?