Welcome to Tesla Motors Club

Autonomous Car Progress

More data is not the solution to every problem, and especially not for safety-critical applications. To get ML to "work" you indeed need an insane amount (petabytes) of carefully curated, labelled examples. You need the right events, across the whole operational design domain (ODD) and in all weather, and even that may not be enough.

Curation has been the bottleneck for self-driving all along. More compute will not solve it, and ML alone is unlikely to provide any functional guarantees anytime soon.

Furthermore, the models cannot be easily validated across a large ODD, and they are prone to dangerous regressions. There is no way to get a logical "proof" of a model's safety or performance. Just look at Tesla's release history: it's nothing like a straight line of increasing reliability.

There is plenty of research going on that might solve a few problems, but with respect to more data, the arrows in the latest research point in the wrong direction: to get linear progress you likely need an exponential amount of training examples, which, again, are a bottleneck to curate.
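To make that scaling intuition concrete, here's a purely illustrative sketch: the power-law form (error falling as a power of dataset size) is common in the scaling-law literature, but the constant and exponent below are invented, not taken from any real driving model.

```python
# Illustrative power-law scaling: error = c * N**(-alpha).
# The constant c and exponent alpha are made-up values for illustration only.
def required_examples(target_error, c=1.0, alpha=0.1):
    """Invert error = c * N**(-alpha) to get the dataset size N needed."""
    return (c / target_error) ** (1.0 / alpha)

# With alpha = 0.1, each halving of error multiplies the required data
# by 2**(1/alpha) = 1024x -- linear gains, exponential data.
for err in [0.1, 0.05, 0.025]:
    print(f"error {err:>6}: ~{required_examples(err):.3g} examples")
```

Under these toy numbers, going from 10% to 5% error takes roughly a thousand times more data, which is the "exponential examples for linear progress" problem in miniature.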
 
(1 - 1/1000000) * 100 = 99.9999
Six sigma quality, which I think of as aspirational, is 99.99966 % of the normal probability distribution, or 3.4 defects per million. I believe that six sigma is much, much better than the average driver.
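Those numbers can be checked directly. A quick sketch, noting that the 1.5-sigma shift is the standard Six Sigma convention rather than anything from this thread's data:

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# One failure per million events:
print((1 - 1 / 1_000_000) * 100)       # ~99.9999 %, not 99.99999

# Six sigma with the conventional 1.5-sigma shift: P(Z < 4.5)
six_sigma_yield = normal_cdf(6.0 - 1.5)
print(six_sigma_yield * 100)           # ~99.99966 %
print((1 - six_sigma_yield) * 1e6)     # ~3.4 defects per million
```

So one-in-a-million corresponds to 99.9999 %, while the six-sigma convention lands at 99.99966 % and the familiar 3.4-per-million figure.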

There is data suggesting about one insurance claim per 500,000 miles driven. It's not clear how many accidents go unreported, though the number is surely positive. FSDS may already be better than the average driver. Tesla could have the data; we will see if they publish it. It could come from the data collected by Tesla Insurance.
 
FSDS may already be better than the average driver.
Maybe, but it's also being supervised entirely by current Tesla owners - and primarily men. In general, women don't seem to care to use it much. Imagine throwing FSD at the general population. I have my doubts that the combination would be better than the average driver.

I'm pleased that the free trial hasn't resulted in a spate of incidents and/or accidents.
 
and primarily men. In general, women don't seem to care to use it much.

Do you have any data about the demographics of FSD users? Or is this just based on the overall male skew of the Tesla owning demographic? Last source I've seen on the topic says about 27% of Tesla owners are female. Without additional data, I don't think it's fair to assume that the share of women using FSD is less than the share of women that own a Tesla in the first place.
 
VW plans to start test drives with passengers of their ID Buzz robotaxi (powered by Mobileye Drive) later this year in Hamburg, Germany.

Recently, Volkswagen Commercial Vehicles chief executive Carsten Intra said: “We want to offer test drives for customers in Hamburg this year — under real conditions.” As we’ve long discussed, this will be through Volkswagen Group subsidiary MOIA.

 
MOIA has been operating a public ride-pooling service in Hamburg since 2018, albeit with human drivers. They use fixed passenger pick-up points.

Thanks for the info. It looks like they are hoping the ID Buzz with Mobileye Drive will allow them to do the same service without a human driver, if they can validate the tech to be driverless. And I can see why they chose the ID Buzz. It is a large vehicle, ideally suited for ride-pooling.
 
Wow.

As automakers strive towards more advanced driver-assist systems, one province in Canada is applying the brakes on the tech. British Columbia recently updated its Motor Vehicle Act, prohibiting the use of vehicles with Level 3 systems. This isn't just a clampdown on using such systems. The law makes it illegal to merely drive any Level 3-equipped car, whether you use it or not.

If you're caught, fines range from $368 to $2,000 in Canadian currency, or even six months in jail.
 
Here's the government link describing the situation with Level 3 cars:


The Motor Vehicle Act prohibits a person from driving, or permitting the driving of, a Level 3, 4 or 5 automated vehicle. This means that highly automated self-driving vehicles cannot yet be driven on public roads in B.C., nor can highly automated self-driving features be used, unless enabled through a pilot project under the Motor Vehicle Act or by regulation in the future.

This prevents manufacturers who have a tendency to skirt regulations, or to test their software on public roads using untrained test drivers, from continuing the practice in that province.

It will be interesting to see how the government takes Elon's buffoonery in maintaining that V12 is already an L3 system. I believe BC has the second-highest concentration of Teslas in Canada. It will also be interesting to see if this regulation spreads across the country.
 
It will be interesting to see how the government takes Elon's buffoonery in maintaining that V12 is already an L3 system.
Where does Elon claim V12 is already L3? Link? I must have missed it. If he claimed that, he would already be subject to permit requirements in California, as well as registering test drivers as AV test drivers.
 
Where does Elon claim V12 is already L3? Link? I must have missed it. If he claimed that, he would already be subject to permit requirements in California, as well as registering test drivers as AV test drivers.
The drive in 2019 was L3. Since then Tesla has been very careful to avoid complying with California autonomous vehicle testing rules.

He has talked about the performance of V12 unsupervised which suggests that they are doing autonomous testing. I don’t think they have any interest in L3 since that’s useless for robotaxis.
 
The drive in 2019 was L3. Since then Tesla has been very careful to avoid complying with California autonomous vehicle testing rules.

He has talked about the performance of V12 unsupervised which suggests that they are doing autonomous testing. I don’t think they have any interest in L3 since that’s useless for robotaxis.
By V12 unsupervised, my take is he means the performance if the driver doesn't take over. Basically, the complaint from some is that the current stats are biased because the driver takes over; but what would the stat be if the driver didn't? I imagine they model that by comparing the NN's predicted actions against reality, or by testing in simulation. That's very different from saying V12 is already L3.
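One hypothetical way such counterfactual modeling could work, sketched purely for illustration (the function, threshold, and data below are invented; nothing here reflects Tesla's actual pipeline): log the model's proposed action alongside the human's correction at each takeover, and count how often they diverge sharply.

```python
# Hypothetical sketch of a counterfactual check: compare the model's
# proposed control commands at each driver takeover against what the
# human actually did. All names and numbers are invented for illustration.

def would_be_failures(events, divergence_threshold=0.5):
    """Count takeovers where the model's plan diverged sharply from the
    human's correction -- a crude proxy for 'would have failed'."""
    failures = 0
    for model_action, human_action in events:
        if abs(model_action - human_action) > divergence_threshold:
            failures += 1
    return failures

# Toy data: (model's proposed steering, human's actual steering) per takeover.
takeovers = [(0.1, 0.15), (0.9, 0.1), (-0.3, -0.25), (0.0, 0.8)]
print(would_be_failures(takeovers))  # 2 of the 4 takeovers look like real saves
```

The idea is that takeovers where the model's plan nearly matched the human's action were probably unnecessary interventions, while large divergences hint at genuine would-be failures.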
 
By V12 unsupervised, my take is he means the performance if the driver doesn't take over. Basically, the complaint from some is that the current stats are biased because the driver takes over; but what would the stat be if the driver didn't? I imagine they model that by comparing the NN's predicted actions against reality, or by testing in simulation. That's very different from saying V12 is already L3.
That’s literally what autonomous vehicle testing with a safety driver is.
 

Alex makes an important point: Wayve did not just strap ChatGPT onto their autonomous driving stack and ask it to narrate what it sees. Wayve is training a single end-to-end AI model on vision, language, and action jointly. That is an important distinction.

I can see the application for solving the black-box problem in e2e. By training the model on vision, language, and action together, it is able to explain what it is "thinking" and doing, so we can see both the what and the why behind the end-to-end system.
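In the abstract, such joint training might look something like the sketch below. This is a toy illustration only: the head names, the loss weight, and the numbers are invented, not Wayve's architecture. The key idea is that one shared backbone receives gradients from both the driving-action error and the narration (language) error.

```python
# Toy illustration of a joint vision-language-action objective.
# All names, weights, and numbers are invented for illustration.

def joint_loss(action_loss, language_loss, lam=0.25):
    """Single objective: the shared backbone gets gradients from both
    the driving-action error and the narration (language) error."""
    return action_loss + lam * language_loss

# If the model narrates badly (it 'sees' a cyclist but plans as if it
# doesn't), the language term pushes the shared representation to agree
# with the narration, rather than narration being a bolted-on afterthought.
print(joint_loss(action_loss=0.75, language_loss=2.0))  # 0.75 + 0.25*2.0 = 1.25
```

That coupling is what separates a genuinely joint model from a narration layer strapped onto an existing driving stack.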
 

That's interesting. Plus it is great to see London on a sunny day!

If something like that were available for FSDS (and in audio, so I could keep my eyes on the road), I'd probably use FSDS more often, because it would show whether I was merely disagreeing with the driving style or the car genuinely wasn't "understanding" potential dangers I could see it ignoring (hopefully just until the last second).

Obviously, when all cars are AVs, they will be able to drive closer together, at higher speeds, and brake later (and manufacturers will determine the best balance between regen and tire and brake-pad wear). But by then we won't need the decision descriptions displayed (they may remain in the black box in case of an accident), because the system will have earned our trust, and we won't want to spend the drive looking at the screen. Instead, we'll be happily looking out the window (or at our phones!)