Welcome to Tesla Motors Club

Autonomous Car Progress

Apple Confirms Acquisition of Self-Driving Vehicle Startup Drive.ai

I'm dismayed to find, via Testing of Autonomous Vehicles and pages like Autonomous Vehicle Disengagement Reports 2018, that the CA DMV did this:

Seriously? Before, they had the reports up, including previous years', for years. Now this? I wonder if they're going to back down on this.

FWIW, the summary of Drive.ai's results is at UPDATE: Disengagement Reports 2018 – Final Results.
Fortunately, someone (possibly including me) did their duty (Hate dead links? Do your duty! Submit URLs to archive.org! - My Nissan Leaf Forum) and submitted old copies of the relevant pages, including at least the Tesla and Waymo reports, to archive.org. One can access them via Testing of Autonomous Vehicles. Not sure if all the other reports were archived.
 
Yes, they reported 0 miles and 0 disengagements every year but one. In the only year in which they reported CA public-road autonomous miles >0, they reported 182 disengagements during 550 miles of autonomous driving, which took place in only (IIRC) 2 months of the year.

I get that. But that is not what I am talking about. I am saying that when Tesla releases "feature complete" FSD to the public, they should release disengagement data for the entire fleet. After all, "feature complete" FSD would be autonomous testing, or at least informative about Tesla's autonomous progress, and we are all essentially "beta testers," right? And given the number of miles the fleet drives every year, it should be a big enough sample to be statistically meaningful.
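For scale: the 550 miles and 182 disengagements quoted above work out to roughly 3 miles per disengagement, and a fleet-sized sample would pin such a rate down very tightly. A minimal sketch, treating disengagements as a Poisson count; the function name, the normal-approximation interval, and the fleet-scale numbers are my own illustration, not anything Tesla or the DMV publishes:

```python
import math

def disengagement_rate_ci(events: int, miles: float, z: float = 1.96):
    """Per-mile disengagement rate with a rough 95% confidence interval.

    Treats disengagements as a Poisson count over the miles driven, so the
    standard error of the count is sqrt(events). Normal approximation only.
    """
    rate = events / miles
    se = math.sqrt(events) / miles
    return rate, max(0.0, rate - z * se), rate + z * se

# The Drive.ai figures quoted above: 182 disengagements in 550 autonomous miles.
rate, lo, hi = disengagement_rate_ci(182, 550)
print(f"{1 / rate:.1f} miles per disengagement")  # 3.0 miles per disengagement

# A hypothetical fleet-scale sample at a similar rate: the interval shrinks
# dramatically as the miles grow, which is the "statistically meaningful" point.
_, fleet_lo, fleet_hi = disengagement_rate_ci(330_000, 1_000_000)
```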
 
With Tesla, we really don't know whether Tesla has driven any autonomous miles beyond the little that has been seen in public, or whether they are merely ADAS testing all the time, which is a different ball game because the requirements are different.
I'd say any miles with AP on are (supervised) autonomous miles, and Tesla had racked up over a billion cumulative miles as of last year.
Now, one could say that is a geo-fenced data set, and they'd have a point (at least geo-biased). Likewise, lumping all SW versions together is bad statistics.
Interesting data would be: type of driving and % of AP use per trip. Unfortunately, you practically need a trip-by-trip data set to pull out useful figures. How do you report 100 miles on the highway at 2 am versus fifty 2-mile end-to-end commutes through a major city at rush hour?
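One way to make that apples-to-oranges problem concrete is to report per-mile rates stratified by trip type instead of one pooled number. A sketch with invented trip records; the trip types and counts mirror the examples in the post, and none of it is real data:

```python
from collections import defaultdict

# Hypothetical per-trip records: (road_type, ap_miles, disengagements).
trips = [("highway_2am", 100.0, 1)]            # one 100-mile 2 am highway run
trips += [("city_rush_hour", 2.0, 3)] * 50     # fifty 2-mile rush-hour commutes

def stratified_rates(trip_records):
    """Disengagements per mile, reported separately for each trip type."""
    miles = defaultdict(float)
    events = defaultdict(int)
    for road_type, ap_miles, disengagements in trip_records:
        miles[road_type] += ap_miles
        events[road_type] += disengagements
    return {t: events[t] / miles[t] for t in miles}

rates = stratified_rates(trips)
# {"highway_2am": 0.01, "city_rush_hour": 1.5} -- pooling these into one
# number would hide a 150x difference between the two driving conditions.
```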
 
  • Like
Reactions: CarlK
Yes, they reported 0 miles and 0 disengagements every year but one. In the only year in which they reported CA public-road autonomous miles >0, they reported 182 disengagements during 550 miles of autonomous driving, which took place in only (IIRC) 2 months of the year.

You can't tell we're talking about Tesla Autopilot data? As for the DMV disengagement reports, companies only need to report test miles administered by company employees, which, like I said, mostly depends on how the testing was performed and can be easily gamed.

I'd say any miles with AP on are (supervised) autonomous miles, and Tesla had racked up over a billion cumulative miles as of last year.
Now, one could say that is a geo-fenced data set, and they'd have a point (at least geo-biased). Likewise, lumping all SW versions together is bad statistics.
Interesting data would be: type of driving and % of AP use per trip. Unfortunately, you practically need a trip-by-trip data set to pull out useful figures. How do you report 100 miles on the highway at 2 am versus fifty 2-mile end-to-end commutes through a major city at rush hour?

Yes, it's not so much geo-fenced as geo-biased. And even then, the bias is set by each driver's own needs rather than skewed toward producing better-looking data. That's the main difference between Tesla and the rest.

Agreed that it's hard to compare the data with non-Autopilot miles driven on exactly the same roads and at the same times. You could probably compare accident rates between AP and pre-AP cars, although I don't know if those data are available. This will change, though. When FSD is released, it will produce the only unbiased statistics, although that could also work against Tesla if the public does not understand how those numbers were produced.
 
Speaking of disengagements on AP, the #1 cause of AP disengagements for me is intersections. Most of my daily driving is on 2-4 lane roads just outside the city. AP of course handles the parts that are just cruising in my lane really well. But every time I hit an intersection, if the light turns red and there is no lead car, or if I need to make a turn, I have to disengage AP. Just the other day I had a drive across town to see a friend that involved a 4-lane road, some highway, another 4-lane road, and then residential one-lane roads. AP/NOA worked for 95% of the trip. My only disengagements were a couple of intersections, taking the on-ramp, and of course navigating the residential roads. Everything was really smooth. NOA did especially well on the highway portion. That's why I am especially looking forward to NOA on city streets. If it handles intersections well enough, I should be able to use AP for much more of my daily driving with far fewer disengagements. Of course, I know the driver still needs to supervise. But being able to stay on NOA from start to finish, seamlessly from local roads to highway and back to local roads, will be really nice.
 
@diplomat33

But that’s the thing. Can we really call that a "disengagement", say, in the California reporting sense?

Currently shipping Autopilot software has not been programmed/trained to handle intersections at all. A Level 4 prototype car (operating within its ODD) certainly is programmed/trained to handle intersections.

Tesla basically leaves its ODD at an intersection.

It would be very interesting to see Tesla’s disengagement data within its ODD (for a prototype with intersections included).
 
  • Like
Reactions: cwerdna
@diplomat33

But that’s the thing. Can we really call that a "disengagement", say, in the California reporting sense?

Currently shipping Autopilot software has not been programmed/trained to handle intersections at all. A Level 4 prototype car (operating within its ODD) certainly is programmed/trained to handle intersections.

Tesla basically leaves its ODD at an intersection.

It would be very interesting to see Tesla’s disengagement data within its ODD (for a prototype with intersections included).

You are correct. It does not meet the California DMV definition of a disengagement since the car is leaving its ODD.

I am looking at things differently: from the perspective that I want AP's ODD to include all of my daily driving so that I can use AP more. It will be a more seamless experience when AP stays inside its ODD for my entire daily drive. In other words, I am interested in AP expanding its ODD to include intersections because AP's ODD needs to be bigger, covering city streets, highways, and intersections, in order to approach FSD. Once AP's ODD is big enough to encompass city streets, intersections, and highways, which I guess is another way to define "feature complete", then I feel like we can really start looking at AP as an FSD prototype.

And yes, that is why I said Tesla should wait until they have intersections done before releasing AP disengagement data. Once we have an AP with an ODD that includes city streets, highways, and intersections, then I feel like we can start talking about FSD disengagements.
 
You can't tell we're talking about Tesla Autopilot data? As for the DMV disengagement reports, companies only need to report test miles administered by company employees, which, like I said, mostly depends on how the testing was performed and can be easily gamed.
I can tell, but as we pointed out, Tesla Autopilot as deployed to customers is missing MANY critical features vs. what Waymo's vehicles can do. And in their last letter (at the Wayback Machine), Tesla claimed: "For Reporting Year 2018, Tesla did not test any vehicles on public roads in California in autonomous mode or operate any autonomous vehicles, as defined by California law..."

I haven't looked into all the requirements, but from Application Requirements for Autonomous Vehicle Tester Program - Testing with a Driver, it's a lot more than just employees vs. non-employees (e.g. vehicle disposal; an annual application fee of $3,600 that covers 10 vehicles and 20 drivers, with another $50 needed for each additional 10 vehicles and 20 drivers; mandatory collision and disengagement reporting; etc.).
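Taking that fee schedule at face value ($3,600 covering 10 vehicles and 20 drivers, plus $50 for each additional block of 10 vehicles / 20 drivers), the annual fee would scale roughly like this. This is just arithmetic on the numbers quoted in the post, not an official DMV calculator:

```python
import math

def annual_fee(vehicles: int, drivers: int) -> int:
    """Annual AV tester application fee per the schedule quoted above.

    Assumes $3,600 covers the first 10 vehicles / 20 drivers, and each
    additional block of 10 vehicles and 20 drivers adds $50.
    """
    extra_blocks = max(
        math.ceil(max(vehicles - 10, 0) / 10),
        math.ceil(max(drivers - 20, 0) / 20),
    )
    return 3600 + 50 * extra_blocks

print(annual_fee(10, 20))  # 3600
print(annual_fee(25, 40))  # 3700 (two extra blocks, driven by the vehicle count)
```

The point of the arithmetic: the fee is trivial for a large program, so the real burden of the permit is the mandatory collision and disengagement reporting, not the money.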
 
And in their last letter (at the Wayback Machine), Tesla claimed: "For Reporting Year 2018, Tesla did not test any vehicles on public roads in California in autonomous mode or operate any autonomous vehicles, as defined by California law..." I haven't looked into all the requirements, but from Application Requirements for Autonomous Vehicle Tester Program - Testing with a Driver, it's a lot more than just employees vs. non-employees (e.g. vehicle disposal; an annual application fee of $3,600 that covers 10 vehicles and 20 drivers, with another $50 needed for each additional 10 vehicles and 20 drivers; mandatory collision and disengagement reporting; etc.).
I have. As long as the nag system is active, nothing is reportable. I think the Investor Autonomy Day drives should produce disengagement reports (unless there was a different nag / driver-attention system in use).

I can tell, but as we pointed out Tesla autopilot as deployed to customers is missing MANY critical features vs. what Waymo's vehicles can do.

Enabled fleet features are not indicative of internal development features.
Similarly, features demonstrated in a limited environment are not indicative of a solution that works state-, country-, or world-wide.
 
  • Like
Reactions: CarlK
I can tell, but as we pointed out, Tesla Autopilot as deployed to customers is missing MANY critical features vs. what Waymo's vehicles can do. And in their last letter (at the Wayback Machine), Tesla claimed: "For Reporting Year 2018, Tesla did not test any vehicles on public roads in California in autonomous mode or operate any autonomous vehicles, as defined by California law..."

I haven't looked into all the requirements, but from Application Requirements for Autonomous Vehicle Tester Program - Testing with a Driver, it's a lot more than just employees vs. non-employees (e.g. vehicle disposal; an annual application fee of $3,600 that covers 10 vehicles and 20 drivers, with another $50 needed for each additional 10 vehicles and 20 drivers; mandatory collision and disengagement reporting; etc.).

Tesla does test FSD in every car in shadow mode. It was also giving employee owners, and perhaps others, HW/SW to test out before release; even Elon was doing that. Those do not need to be reported. My original post, though, was just to point out that these disengagement numbers are pretty meaningless: they are uncontrolled and easily gamed. With Tesla's cars, whether AP or FSD when released, there is nothing to fudge or hide when the numbers come from real cars in the field.
 
I can tell, but as we pointed out, Tesla Autopilot as deployed to customers is missing MANY critical features vs. what Waymo's vehicles can do. And in their last letter (at the Wayback Machine), Tesla claimed: "For Reporting Year 2018, Tesla did not test any vehicles on public roads in California in autonomous mode or operate any autonomous vehicles, as defined by California law..."

I haven't looked into all the requirements, but from Application Requirements for Autonomous Vehicle Tester Program - Testing with a Driver, it's a lot more than just employees vs. non-employees (e.g. vehicle disposal; an annual application fee of $3,600 that covers 10 vehicles and 20 drivers, with another $50 needed for each additional 10 vehicles and 20 drivers; mandatory collision and disengagement reporting; etc.).
I don't think that Tesla has found a very legal and very cool loophole to get around CA testing rules. Uber tried a similar strategy of claiming that their system was merely a driver assistance system. I guess we'll see what happens.
“In their minds, they really thought they weren’t autonomous,” Jessica Gonzalez, assistant deputy director of public affairs at the DMV, told The Verge. “But we decide what’s autonomous. And under our regulations, it was.”
 
Cruise self-driving cars navigating around double parked cars in San Francisco.

Thank you for that. I hadn't bothered browsing their page at Cruise until now.

The video was good. Its description was:
"Cruise
Published on Jan 25, 2019
Our driverless cars constantly encounter challenging situations on the streets in San Francisco. The driving seen in this video is 100% autonomous. This video is sped up approximately 2.5x. Do you have what it takes to solve challenging problems like this? https://getcruise.com/careers"
 
  • Helpful
Reactions: hiroshiy
I don't think that Tesla has found a very legal and very cool loophole to get around CA testing rules. Uber tried a similar strategy of claiming that their system was merely a driver assistance system. I guess we'll see what happens.

The loophole is that DMV reporting is only required for manufacturer-owned test cars. Tesla is recruiting employee owners, including Elon himself, and others to use their own cars to test the new SW/HW. Either way, that DMV reporting system is pretty silly, with no purpose other than being an easy way to relieve bureaucracies of their responsibilities.
 
The loophole is that DMV reporting is only required for manufacturer-owned test cars. Tesla is recruiting employee owners, including Elon himself, and others to use their own cars to test the new SW/HW. Either way, that DMV reporting system is pretty silly, with no purpose other than being an easy way to relieve bureaucracies of their responsibilities.
There's way more to the rules than just the reporting requirements. I don't think the DMV will interpret them the way you suggest, and they write the regulations (it's not a law).
Obviously this doesn't really matter all that much, since by the end of next year the software will no longer be subject to testing rules; it will be a deployed Level 5 system.
 
  • Disagree
Reactions: CarlK
Regarding reporting disengagements to the CA DMV, the bottom line for me is that Tesla will need to report something before they do robotaxis. Releasing "FSD" features to cars with drivers, where the drivers need to supervise, is one thing. But there is no way Tesla can skip reporting altogether and just deploy robotaxis directly.
 
Regarding reporting disengagements to the CA DMV, the bottom line for me is that Tesla will need to report something before they do robotaxis. Releasing "FSD" features to cars with drivers, where the drivers need to supervise, is one thing. But there is no way Tesla can skip reporting altogether and just deploy robotaxis directly.
What about all the requirements for test drivers (training, background checks, etc.)?
I don’t think Tesla will release FSD on city streets in California until it’s approved as a level 3-5 system. I guess we’ll see though!
 
  • Like
Reactions: KArnold
There's way more to the rules than just the reporting requirements. I don't think the DMV will interpret them the way you suggest, and they write the regulations (it's not a law).
Obviously this doesn't really matter all that much, since by the end of next year the software will no longer be subject to testing rules; it will be a deployed Level 5 system.

Uber continues self-driving vehicle testing in SF in defiance of DMV – TechCrunch
Musk asks for Tesla employees to test out new full self-driving mode

If owners needed a permit, no one would have been able to operate Autopilot, at least not for certain functions like NoA.

Regarding reporting disengagements to the CA DMV, the bottom line for me is that Tesla will need to report something before they do robotaxis. Releasing "FSD" features to cars with drivers, where the drivers need to supervise, is one thing. But there is no way Tesla can skip reporting altogether and just deploy robotaxis directly.

That will vary from region to region. Some states/countries may want to accept the technology sooner to be ahead of the game, while others may take a more conservative wait-and-see approach to how it turns out in other places. Regardless, "disengagement" will not even be the metric when true autonomous vehicles are released.
 
What about all the requirements for test drivers (training, background checks, etc.)?
I don’t think Tesla will release FSD on city streets in California until it’s approved as a level 3-5 system. I guess we’ll see though!

Well, it seems that Tesla has found a loophole by claiming that they are just testing a driver assist and are therefore not beholden to the reporting requirement. My point is that may fly as long as Tesla is just releasing Autopilot features to drivers who need to supervise, but it won't work if Tesla actually wants to deploy robotaxis. At some point, Tesla will need to report something in order to get permission to deploy robotaxis.
 
Well, it seems that Tesla has found a loophole by claiming that they are just testing a driver assist and are therefore not beholden to the reporting requirement. My point is that may fly as long as Tesla is just releasing Autopilot features to drivers who need to supervise, but it won't work if Tesla actually wants to deploy robotaxis. At some point, Tesla will need to report something in order to get permission to deploy robotaxis.

Tesla can report stats for FSD-with-nag. The nag means they are not required to, but that is likely the data set they will be collecting in 2020 to internally validate, and then externally justify, the safety of the FSD / attention-free software.
 
  • Like
Reactions: diplomat33