
Autonomy Investor Day - April 22 at 2pm ET

I’m pretty sure all autonomous vehicle testing on public roads must be reported to the DMV.

Not production cars used by customers, I would think. Otherwise we will all need to report to the DMV when Tesla turns on FSD sometime this year or next. Either way, Tesla reported zero miles in CA to the DMV last year.

I don't think Elon's words about the fleet being trained by billions of miles driving is wrong at all.

Look at it this way: Training the network is not about feeding some Matrix-style computer with thousands of terabytes of plain, well-marked roads. That data is useless. Even if you had the ability and capacity to actually feed everything into the network, the results wouldn't be good, because basically any uncommon scenario would be blurred out by all that common data.

What you are interested in is the edge scenarios: every time the driver did something the system didn't predict, every time it sees something it doesn't recognize, and generally every time you disengaged the system. Edge scenarios happen frequently in the beginning, but with every iteration of the NN they become rarer. At some point you may have just one abnormal situation every 1000 km, but to get that abnormal situation you still have to drive the other 999 km first. Those 1000 km still count as fleet experience improving/training the system; you just send back only the useful data needed for the next iteration. Not the flawless data, or data you won't need until a future iteration.
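The idea can be sketched in a few lines of Python. This is purely illustrative (the `Mile` and `clips_to_upload` names are invented, not Tesla's actual pipeline): the fleet "experiences" every mile driven, but only the rare flagged miles get uploaded for the next NN iteration.

```python
# Toy sketch of trigger-based fleet data collection: every mile counts as
# fleet experience, but only "interesting" miles are sent back for training.
from dataclasses import dataclass

@dataclass
class Mile:
    disengaged: bool            # did the driver take over?
    unrecognized_object: bool   # did perception fail to classify something?

def clips_to_upload(drive_log):
    """Return only the rare miles worth sending back for training."""
    return [m for m in drive_log if m.disengaged or m.unrecognized_object]

# 1000 km of driving with a single disengagement: all 1000 count as fleet
# experience, but only 1 clip is uploaded for the next NN iteration.
log = [Mile(False, False) for _ in range(999)] + [Mile(True, False)]
print(len(log), len(clips_to_upload(log)))   # → 1000 1
```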

It's like a carpenter with 10 years of work experience. If he hangs drywall for 30 days, he's not learning anything new until a difficult problem he needs to handle shows up. He finds a solution, and next time maybe it takes 60 days for the next problem to appear, because he has more experience. In the end, despite the low learning rate (transferred data), he still has 10 years of carpentry experience. What you'd do, of course, is give the carpenter a more challenging project, which is why Tesla also gradually increases the difficulty of the rare events it asks the fleet for detailed info about.

It's absolutely correct, but people just interpret it wrong. That's how it usually goes. People who are ignorant are in no position to judge others.

In Karpathy's talks yesterday and last year, he said they have the normal cases pretty much down and do not need anything more for them. It's only the edge cases for which they still need more data to train the NN. Some of the mechanisms he mentioned are:
-- Tesla will send requests to cars for the edge-case data it needs. Cars will automatically send data to Tesla whenever they see those cases and ignore the rest.
-- Whenever there is a driver intervention, the car will treat it as a special case and send it to Tesla for engineers to look at. Engineers will feed whatever is needed back into the neural net to improve it.
-- The neural net in Tesla cars has the ability to observe minor movements (body language) of surrounding cars and pedestrians to predict their intentions. These predictions always run in shadow mode, even when AP is not turned on, and no action is taken on them. However, the computer will compare the eventual true movement of cars and pedestrians to what it predicted and send data to the server for further training of the NN. This is the first time I've heard of anyone doing this, and it is an extremely powerful tool. Tesla can do this better than anyone because of all those Tesla cars driving in the real world.
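The shadow-mode comparison in the last point can be sketched roughly like this. It's a hypothetical toy (the function name and threshold are made up, not Tesla's implementation): predict where the pedestrian will go, observe what actually happened, and flag only large mismatches for upload.

```python
# Toy sketch of "shadow mode": the NN predicts a pedestrian's future
# position, the car later observes the true position, and only surprising
# mismatches are flagged to be sent back for further training.
def shadow_mode_step(predicted_pos, actual_pos, threshold=0.5):
    """Compare prediction to observed ground truth; flag for upload if wrong."""
    error = ((predicted_pos[0] - actual_pos[0]) ** 2 +
             (predicted_pos[1] - actual_pos[1]) ** 2) ** 0.5
    return error > threshold   # True → send this case back for training

print(shadow_mode_step((1.0, 2.0), (1.1, 2.1)))  # good prediction → False
print(shadow_mode_step((1.0, 2.0), (3.0, 0.0)))  # surprise → True
```

No action is taken in the car either way; the only output is the upload flag, which matches the description above of predictions running silently even with AP off.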

So even though Tesla does not need all the data from the billions of miles driven, the more miles the cars drive, the more of that needed data of interest they will provide. Maybe another few billion or tens of billions of miles later, Tesla will have acquired enough data on all the needed edge cases. Other companies running only a few million test-car miles just don't have the ability to match this.
 
It added that all of Uber’s “self-driving” cars have a driver sitting in the passenger seat to take over if needed.

Assuming the article is correct (usually a driver sits in the driver's seat), this is very different from Tesla Autopilot. If the driver is not in the correct seat, the car is by definition self-driving, as he cannot take over like a normal driver. They should have a permit.
Yeah, that's incorrect. They definitely had a person in the driver's seat. Everyone testing in California has a person in the driver's seat. Uber claimed that their system was a Level 2 system and not capable of self-driving, and the DMV disagreed.
Tesla should be leaning on California legislators to green-light laws to permit end user use of FSD features in cars to bypass the current regulatory scheme which is aimed at testing and not end-use scenarios (presumably because FSD is always a fleeting reality).
They should get clarification from the DMV, but the reason for the rules is to make sure the cars are tested safely. Uber's pedestrian collision in Arizona and the near-collisions during their un-permitted testing in California make me skeptical that self-driving can be tested safely by untrained and unmonitored drivers.
 
Not production cars used by customers, I would think. Otherwise we will all need to report to the DMV when Tesla turns on FSD sometime this year or next. Either way, Tesla reported zero miles in CA to the DMV last year.
All autonomous vehicle testing requires a permit in California. Tesla may try to claim that they have a Level 2 system (there are a couple of threads where this is argued ad nauseam). There are 49 other states to test in anyway.
 
California Halts Self-Driving Cars in San Francisco

The issue is that regulatory agencies don't play fast and loose with semantics. Tesla's FSD software, if it exists now, is not intended to require active monitoring; it was demonstrated hands-free, unlike AP.
I think those are the key words. ;) It doesn't, right now.
Even if you add in red lights and intersections, if it nags you every 30 seconds or lacks active driver monitoring, it might trigger regulatory oversight if it's intended to be FSD.
At some point it might, but the question was why we are not seeing disengagement reports right now. I think Tesla simply isn't at the point yet where what they have would be classified as autonomous driving.
 
Tesla AP/EAP do not need to be reported, due to the nag system, which requires continuous driver input. Uber had no nag or vigilance system (as shown by the fatality). FSD with nags is just super-enhanced EAP; the same rules apply.

The article was from 2016. If they haven't objected to Tesla in 2+ years, it's pretty safe to say Tesla is not doing anything wrong.
 
That's correct. They're presently, and for the next few years I'm sure, a rather serious driver assistance/ADAS company. And a really excellent EV company. But not an AV company, no matter what Elon said here.

I think I have to disagree with that last part. Elon is clearly serious about Tesla being a serious AV company. Just look at Elon's Master Plan to go all in on Model 3s being robotaxis by the end of 2020. And look at the huge investment and work put into FSD, from developing the FSD computer in-house to doing machine learning with massive amounts of fleet data. A company devoting that much energy, with its own hardware and its own software that is already capable of some self-driving, is a serious AV company. To dismiss Tesla as not a serious AV company just because Tesla is taking a different approach is a mistake. If Autonomy Investor Day revealed anything, it is that the other AV companies would do well not to underestimate Tesla.
 
I think I have to disagree with that last part. Elon is clearly serious about Tesla being a serious AV company. Just look at Elon's Master Plan to go all in on Model 3s being robotaxis by the end of 2020. And look at the huge investment and work put into FSD, from developing the FSD computer in-house to doing machine learning with massive amounts of fleet data. A company devoting that much energy, with its own hardware and its own software that is already capable of some self-driving, is a serious AV company. To dismiss Tesla as not a serious AV company just because Tesla is taking a different approach is a mistake. If Autonomy Investor Day revealed anything, it is that the other AV companies would do well not to underestimate Tesla.

They need to deliver on feature complete first.
 
What Elon says versus what's actually going on is night and day. He will say whatever he feels makes him look superior.

But Tesla actually uses HD lane maps, exactly what he says they don't do. In fact the HD map development has advanced, and it's now at version 3.

Using NOA in an area with insufficient maps leads to disastrous results.

But like I said, the demo was a complete non-event and a PR spin job.

Exactly. Why not just show a person entering a random destination and letting the car drive.
 
I don't think Elon's words about the fleet being trained by billions of miles driving is wrong at all.

I respectfully disagree. I feel Elon goes into his snake-oil salesman role every time he pitches that ”every mile driven the fleet learns” mantra. The contrast couldn't have been clearer on the 22nd, when he was next to Andrej Karpathy, who was giving clear, plausible answers on what the ”shadow mode” triggers and data gathering actually did, and Elon tried to interject with this every-mile thing. As with most of that day, Elon looked much less believable than his fellow presenters. And it is indeed not technically accurate, as we've known for some time.

He’s an engineer, we always say. On this, he is not showing engineer credentials, but something else.

What really happens is that, for some miles driven in some cars, a trigger fires and collects some data that is sent to Tesla. Tesla then uses this small amount of data for validation and improvement that eventually leads to new software updates and NNs being sent to the fleet. ”Every mile driven,” the fleet out there does not learn a thing, nor does it even collect much data beyond some general statistics.

Why the need to tell a lie? The deployment, validation, and data-gathering benefit is impressive as it is. But the fleet does not ”learn” anything ”every mile driven.” You could see how this misunderstanding crept even into the audience when they asked about NNs being trained in the cars...

I say this with great appreciation for the improving data gathering triggers Karpathy provided new insights into. As I’ve said the trigger part of the presentation was the highlight of the day for me. I think they are making strides in this area. Just give us Karpathy giving it like it is and I’m a happy camper on this one.
 
Just give us Karpathy giving it like it is and I’m a happy camper on this one.

Yes, this. I trust Karpathy because he’s just super excited to be working on this cool stuff. I’m cautiously optimistic, and that’s in spite of Elon’s rambling.

I have one simple worry though. My company has a sister company that’s in the AI field (though as far as I know, nothing to do with autonomous driving). They produce impressive results, constantly hanging with the big names and even beating them on recognizable tests/benchmarks. Despite this, they never seem to turn those impressive results into impressive products for their customers.

So as happy as I was listening to Karpathy talk passionately about all this, it really means very little beyond the academic until an update hits the general public. I sincerely hope they can actually follow through on even half of their promises this time.
 
I respectfully disagree. I feel Elon goes into his snake-oil salesman role every time he pitches that ”every mile driven the fleet learns” mantra. The contrast couldn't have been clearer on the 22nd, when he was next to Andrej Karpathy, who was giving clear, plausible answers on what the ”shadow mode” triggers and data gathering actually did, and Elon tried to interject with this every-mile thing. As with most of that day, Elon looked much less believable than his fellow presenters. And it is indeed not technically accurate, as we've known for some time.

He’s an engineer, we always say. On this, he is not showing engineer credentials, but something else.

What really happens is that, for some miles driven in some cars, a trigger fires and collects some data that is sent to Tesla. Tesla then uses this small amount of data for validation and improvement that eventually leads to new software updates and NNs being sent to the fleet. ”Every mile driven,” the fleet out there does not learn a thing, nor does it even collect much data beyond some general statistics.

Why the need to tell a lie? The deployment, validation, and data-gathering benefit is impressive as it is. But the fleet does not ”learn” anything ”every mile driven.” You could see how this misunderstanding crept even into the audience when they asked about NNs being trained in the cars...

I say this with great appreciation for the improving data gathering triggers Karpathy provided new insights into. As I’ve said the trigger part of the presentation was the highlight of the day for me. I think they are making strides in this area. Just give us Karpathy giving it like it is and I’m a happy camper on this one.

To me the question is how widely distributed the triggers are. If the triggers go to the whole fleet, then you could argue that every mile driven is being used to search for the desired events, even though very few cars may actually detect them. This would make Elon's and Andrej's comments consistent, and would still show the power of a large fleet to quickly identify edge cases. The same goes for reporting when a driver overrides Autopilot: if it happens for every car, then all the Autopilot miles are being used to search for exceptions; otherwise Elon was overstating.

I agree that Elon did not present himself as well as Andrej, but it is still possible for Elon to have been correct and for all the fleet miles to be searched by Tesla for useful training data even if very little data is actually being reported to Tesla. I don’t know if they are doing that now or will in the future, but as long as they get the data they need to train the system, then it doesn’t really matter if every mile from every car is being searched. However, if they are able to get what they need from a subset of the fleet, then there would be no need for Elon to overstate the situation. Either way it would appear to be a powerful technique that should allow them to progress rapidly.
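A back-of-the-envelope calculation shows why fleet size matters here. All the numbers below are illustrative, not Tesla's actual figures: even a trigger that fires only once per 10,000 miles yields over a thousand hits per day across a large consumer fleet, versus a handful for a small dedicated test fleet.

```python
# Toy calculation: expected rare-event captures per day for a fleet,
# assuming the search trigger is deployed to every car.
def expected_daily_captures(cars, miles_per_car_per_day, event_rate_per_mile):
    return cars * miles_per_car_per_day * event_rate_per_mile

rate = 1 / 10_000  # one rare event per 10,000 miles (illustrative)
fleet = expected_daily_captures(400_000, 30, rate)   # large consumer fleet
small_fleet = expected_daily_captures(100, 200, rate)  # small test fleet
print(fleet, small_fleet)   # → 1200.0 2.0
```

Under these assumptions, the large fleet surfaces in a single day roughly what the small test fleet would need almost two years to find, which is the "power of the fleet" argument in quantitative form.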
 
To me the question is how widely distributed the triggers are. If the triggers go to the whole fleet, then you could argue that every mile driven is being used to search for the desired events, even though very few cars may actually detect them. This would make Elon's and Andrej's comments consistent, and would still show the power of a large fleet to quickly identify edge cases. The same goes for reporting when a driver overrides Autopilot: if it happens for every car, then all the Autopilot miles are being used to search for exceptions; otherwise Elon was overstating.

I agree that Elon did not present himself as well as Andrej, but it is still possible for Elon to have been correct and for all the fleet miles to be searched by Tesla for useful training data even if very little data is actually being reported to Tesla. I don’t know if they are doing that now or will in the future, but as long as they get the data they need to train the system, then it doesn’t really matter if every mile from every car is being searched. However, if they are able to get what they need from a subset of the fleet, then there would be no need for Elon to overstate the situation. Either way it would appear to be a powerful technique that should allow them to progress rapidly.

It doesn't go to the whole fleet, though, according to @verygreen.
Remember that green discovered and wrote about this whole trigger thing years ago.
Also, the Autonomy Day event just proved that shadow mode as described by Elon doesn't exist.
 
I think I have to disagree with that last part. Elon is clearly serious about Tesla being a serious AV company. Just look at Elon's Master Plan to go all in on Model 3s being robotaxis by the end of 2020. And look at the huge investment and work put into FSD, from developing the FSD computer in-house to doing machine learning with massive amounts of fleet data. A company devoting that much energy, with its own hardware and its own software that is already capable of some self-driving, is a serious AV company. To dismiss Tesla as not a serious AV company just because Tesla is taking a different approach is a mistake. If Autonomy Investor Day revealed anything, it is that the other AV companies would do well not to underestimate Tesla.

No, this is 100% a ploy to sell cars. Elon is simply channeling Trump: say outrageous things, get hundreds of millions of views. People don't even read articles, they read only the title. Tens of thousands will rush to buy cars AS A RESULT.

Everyone I run into already thinks Teslas are self-driving cars that people can sleep in. I wonder why.
Do you know how many cars Tesla will sell this quarter as a result of this event?
How many more people will believe Teslas are self-driving?
 
No, this is 100% a ploy to sell cars. Elon is simply channeling Trump: say outrageous things, get hundreds of millions of views. People don't even read articles, they read only the title. Tens of thousands will rush to buy cars AS A RESULT.

Everyone I run into already thinks Teslas are self-driving cars that people can sleep in. I wonder why.
Do you know how many cars Tesla will sell this quarter as a result of this event?
How many more people will believe Teslas are self-driving?

Well, Tesla is an automaker. Their business is to sell cars. So yeah, they are going to try to sell cars. But I don't believe "FSD" is a ploy or a scam. Tesla's self-driving effort is very legit. The cars have the hardware for self-driving, and the dev team has made great progress towards self-driving.

And as I mentioned before, if people think Tesla cars are full self-driving now, to the point where they can sleep in them, I blame the media for the most part. They are the ones who put out clickbait articles about how Tesla cars are Full Self-Driving now. Musk talks about the cars having the hardware NOW, but he's been consistent that being able to sleep in your Tesla while it drives you is still in the future.
 