
When does the CA DMV autonomous driving report come out?

They had a little person in the frunk driving the demo rides.
There's no way they would violate the autonomous testing regulations, which say nothing about nags. Cruise and Waymo also have nags, BTW.
Can't you simply require nags, claim the driver is responsible (L2), and test your FSD just as if it were EAP? As long as it is beta EAP, where the driver is using it like a driver's aid, why would a disengagement be reportable?
 
The car getting rear-ended could just be the other car not stopping in time. It does not necessarily mean the Cruise AV did anything wrong.
The fact that the test driver disengaged before impact makes me suspect that the car did do something wrong. I'm not talking about legal liability. I could go around randomly slamming on my brakes and probably get a few people to rear-end me and not be legally at fault.
I suspect this is one of the more difficult problems they are working on: making the car drive in a way that is predictable to other road users.
 
Can't you simply require nags, claim the driver is responsible (L2), and test your FSD just as if it were EAP? As long as it is beta EAP, where the driver is using it like a driver's aid, why would a disengagement be reportable?
It depends on the "design intent" of the system.
SAE J3016 said:
The level of a driving automation system feature corresponds to the feature’s production design intent. This applies regardless of whether the vehicle on which it is equipped is a production vehicle already deployed in commerce, or a test vehicle that has yet to be deployed. As such, it is incorrect to classify a level 4 design-intended ADS feature equipped on a test vehicle as level 2 simply because on-road testing requires a test driver to supervise the feature while engaged, and to intervene if necessary to maintain safe operation.
The claim at Autonomy Day was that the new software was a robotaxi prototype, so I think they're violating the rules.
The DMV has the final say on what constitutes autonomous vehicle testing. In 2016, Uber tried claiming their vehicles were L2; the DMV disagreed and forced them to comply with the autonomous testing rules.
 
The fact that the test driver disengaged before impact makes me suspect that the car did do something wrong. I'm not talking about legal liability. I could go around randomly slamming on my brakes and probably get a few people to rear-end me and not be legally at fault.
I suspect this is one of the more difficult problems they are working on: making the car drive in a way that is predictable to other road users.

Possibly. Certainly, Cruise still has some issues to work out. I just think their disengagement report is very encouraging. Cruise seems to have very good autonomous driving.

It depends on the "design intent" of the system.

The claim at Autonomy Day was that the new software was a robotaxi prototype, so I think they're violating the rules.
The DMV has the final say on what constitutes autonomous vehicle testing. In 2016, Uber tried claiming their vehicles were L2; the DMV disagreed and forced them to comply with the autonomous testing rules.

I will be curious to see how Tesla handles reporting disengagements after they release "FSD feature complete". I would imagine that Tesla would need to treat every car in CA with FSD as a test car and report all disengagements. I am just not sure how Tesla could release "FSD feature complete" to the entire fleet yet still try to claim it is L2 so as to avoid reporting. I think the CA DMV would take issue with Tesla just like they did with Uber.

Once Tesla releases "FSD feature complete", I hope that they start reporting autonomous miles from either a fleet of Tesla owned test cars or from the actual fleet owned by the customers who have FSD. We need disengagements from a large number of autonomous miles in order to get a realistic and fair comparison of Tesla's FSD. Reporting 12 miles is pathetic!!

I am curious why Tesla reported the autonomous miles from the FSD demo but did not report autonomous miles from any other test car. I guess they felt the autonomy demo had to be reported or they would be in violation, but they somehow treat the other test cars as L2, so those miles don't count as autonomous? Weird.
 
I am curious why Tesla reported the autonomous miles from the FSD demo but did not report autonomous miles from any other test car.
There was video evidence. :D
It does seem like a better strategy would have been to report zero miles like they usually do. I have no idea what's going to happen, but I still believe they will never release city NoA on current vehicles or as a Level 2 system.
 
I have no idea what's going to happen, but I still believe they will never release city NoA on current vehicles or as a Level 2 system.

I am not sure I understand. Are you saying that Tesla won't release City NOA on the current vehicles but won't release it as L2 either? So you think Tesla will wait until they can release it as true FSD on future hardware?
 
I am not sure I understand. Are you saying that Tesla won't release City NOA on the current vehicles but won't release it as L2 either? So you think Tesla will wait until they can release it as true FSD on future hardware?
Yes. I do not think "automatic driving on city streets" can ever be safe as a Level 2 system unless you have extremely strict driver monitoring. If you look at your phone while testing for Waymo or Cruise, you will be fired. And I don't think the existing hardware is L3-5 capable.
 
The fact is that these numbers do show how far behind Tesla is. Only 12 miles is pathetic.

These numbers don't show anything.

Tesla effectively didn't report their numbers, just as I predicted.

Sure, we can interpret that as a rule violation, but it's not like Tesla is all that adherent to rules. It's pretty easy to predict what Tesla will do based on Elon's opinion of the rule.

They didn't report the numbers (in any meaningful way) because Elon likely thinks it's stupid. Just like everyone else does.
 
They didn't report the numbers (in any meaningful way) because Elon likely thinks it's stupid. Just like everyone else does.
Everyone? You can count me out of everyone. You think the folks at the DMV behind this program think it's "stupid"? I think somewhere between 10% and 60% of the people who have at least looked at the numbers would say it's not the be-all-end-all stat, but it's not totally worthless either, especially if additional context is provided (e.g. about the routes, conditions, and environments the testing was done under).

Disengagement Report 2019 had a summary and called out Baidu.

As I've said before, Tesla's all about hype. If their system were doing so well, they should have no problem demonstrating a high number of autonomous miles driven on CA public roads with a low disengagement rate (up there with the top 5 players) and improvement year over year. Instead, they turned in years of big fat zeros, ~550 miles one year with a horrific bunch of runs and somehow one clean run, more zero-mile years, and now 12.2 miles. :rolleyes: If they were really doing well, they should be testing in the streets of San Francisco with more miles covered, a better disengagement rate, and a lower accident rate than Cruise Automation.
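For anyone who hasn't actually opened one of these reports, the headline comparison is just autonomous miles divided by reported disengagements. Here's a minimal Python sketch of that arithmetic, using the mileage figures tossed around in this thread (Cruise roughly 800K miles, Tesla 12.2 miles); the disengagement counts below are made-up placeholders for illustration, not the actual reported numbers:

```python
# Rough sketch of the "miles per disengagement" stat derived from the
# CA DMV reports. Mileage figures echo this thread; the disengagement
# counts are illustrative placeholders, NOT the real reported values.
reports = {
    "Cruise": {"autonomous_miles": 800_000, "disengagements": 70},
    "Tesla":  {"autonomous_miles": 12.2,    "disengagements": 0},
}

for company, r in reports.items():
    miles, count = r["autonomous_miles"], r["disengagements"]
    if count == 0:
        # A tiny mileage total with zero disengagements is statistically
        # meaningless; you can't estimate a rate from it.
        print(f"{company}: {miles:,} autonomous mi, 0 disengagements (sample too small)")
    else:
        print(f"{company}: {miles:,.0f} autonomous mi, "
              f"~{miles / count:,.0f} mi per disengagement")
```

The point of the toy numbers is that the rate only means something once the mileage is large; 12.2 miles can't support any rate at all.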

It shouldn't be difficult at all for them to collect and track the info that the CA DMV requires. If they can't handle it, then what does that say about their progress and readiness for customer deployment, especially for the mythical robotaxis in the timeframe that Elon's talked about?

At least it seems like many (most?) companies are complying, but if Tesla is just skirting the law or willfully violating it, do you think that's going to look good in the eyes of CA and other state regulators? The CA DMV could pull their permit. I'm not a lawyer, but perhaps they could even sue Tesla and others not in compliance.
 
They didn't report the numbers (in any meaningful way) because Elon likely thinks it's stupid. Just like everyone else does.

If everyone thinks it's so stupid, why did they comply and report all their miles? Cruise's CEO criticized the disengagement report as not being the best metric for measuring FSD progress, but Cruise still reported 800K miles!!! Heck, some companies have horrible disengagement rates, but they still reported them.
 
I also wish Tesla fanboys would stop dismissing Waymo and Cruise.

It has been quite a year of growth for you. :D

Tesla's biggest advantage is their fleet gathering data with over 2 billion miles logged.

Just collecting data (and we know they don't actually collect data for all miles driven) isn't helpful. Collecting unique data is what's necessary, and having data about the context of disengagements and errors is critical. That's something Tesla doesn't have except from a very few specially outfitted vehicles. Most AP disengagements don't get reported to Tesla, and most AP miles are driven on extremely popular routes over and over again, which adds minimal value to begin with.

VW worked with Mobileye to produce mapping data for all roads in the EU and US. They found that massive areas of the US were so poorly traveled by any recent VW vehicle that no reliable data could be collected. VW put over 300k vehicles on US roads for multiple years in a row, meaning they outnumbered Tesla's entire annual production, and they still couldn't collect enough data. They have to hire paid, trained drivers to collect from all of those areas, which is something like a full third of the continental US.
 
It has been quite a year of growth for you. :D

Yep. As I explained in another thread, I've tried to educate myself on autonomous driving and to see what Waymo, Cruise, and others are doing. It is obvious that Tesla is way behind on FSD. I am now a big fan of Cruise and Waymo.

I still love Tesla cars. I love my Model 3. When it comes to battery tech, style, range, comfort, and acceleration, Tesla is great. They definitely make great EVs. But when it comes to the AP/FSD side of things, Tesla still has a lot of work to do.
 
I still love Tesla cars. I love my Model 3. When it comes to battery tech, style, range, comfort, and acceleration, Tesla is great. They definitely make great EVs. But when it comes to the AP/FSD side of things, Tesla still has a lot of work to do.

This is always my biggest point. Tesla makes cars people like, and they can sell more cars than they can produce. That alone should be good enough, and as they make breakthroughs in the AP system, they should roll them out as safety features with OTA updates. Nobody else is doing that, it's a huge competitive advantage, and when, someday in the far-off future, we have self-driving cars, Tesla will be right there among all the other players.
 
This is always my biggest point. Tesla makes cars people like, and they can sell more cars than they can produce. That alone should be good enough, and as they make breakthroughs in the AP system, they should roll them out as safety features with OTA updates. Nobody else is doing that, it's a huge competitive advantage, and when, someday in the far-off future, we have self-driving cars, Tesla will be right there among all the other players.
All true, but I'd much prefer it if Tesla didn't keep selling people features that don't exist and are likely not going to be available during the expected life of the car.

No, I don't think robotaxis are going to be available this year, or next, or this decade.
 
The DMV autonomous vehicle regulations seem sensible to me. Do you have a specific complaint?

Nope. The autonomous vehicle program may very well be where the competent people at the DMV work, but I haven't really looked closely enough to have any feelings on that matter.

I just can't pass up an opportunity to talk smack about the department that makes people send in a physical form and an actual check to pay for *stickers* to put on a car for HOV access. Not to mention how much of a turd-show it is to deal with anything related to licensing.
 
Everyone? You can count me out of everyone.

Ha, after I wrote that I thought I should go back and clarify what I meant by everyone.

What I meant was that a lot of the players in the self-driving arena have issues with the disengagement reporting: it's not a good measuring stick for how far along they are. It also makes Cruise seem worse than Waymo, when the reality is that Cruise might be further along in difficult driving situations because they test their system in hell itself.
 
I would love to see videos of all their disengagements and their simulation results of what would have happened had the disengagement not occurred. As it stands, I'll believe it when they do a million miles without a test driver. And of course I agree that they're way ahead, but it's hard to say how close they are to having a viable product.

It's not a million miles, but Cruise posted another video of their car driving 1 hour in autonomous mode on city streets with no disengagements. It certainly gives a good idea of the kinds of scenarios that the car can handle: