
Does anyone actually know how valuable large scale "fleet learning" really is??

There would be, it seems, two possibilities:

1 - Large scale fleet learning of the kind Tesla is doing is the only way to iron out the corner cases and build a robust self driving system.

2 - Large scale fleet learning is not necessary to build a robustly safe system.

If #1 is the truth - then Tesla still has no competition and its lead is only growing. If #2 is the truth, then fleet learning's value is somewhat hyped up, and competing automakers can jump into the market over the next few years - going from zero autonomous cars to selling them off the shelf.

If in fact #1 is the truth, as Musk claims, and the other automakers know it - then why in the world are we going into the third year of Tesla being the only automaker with a large scale fleet test/learning project of autonomous driving?

The other makers could safely implement the exact same thing with zero liability risk if they just ran their fleets in a pure shadow mode - but they are not doing so. Why not? Instead we are seeing small scale deployments of test bed cars bristling with many more sensors than Tesla actually ships in the real world - what seem to amount to never-ending big dollar science projects.

In terms of real world autonomous driving Tesla is competing with - itself.

To play devil's advocate for a moment, let's remind ourselves that these other automakers also employ many PhD artificial-intelligence researchers. Let us presume these researchers are not stupid and not living under a rock - and are thus aware of Tesla's fleet-scale approach. Let's also assume these engineers could design and start shipping a fleet-wide learning system in actual vehicles that are for sale - today.

So why aren't they doing it? Is it in fact, not necessary? If it isn't necessary then what, if anything, does that imply about Tesla's firm public claims that fleet learning is the only viable way to build out a robust neural network that is safe in the real world?
 
What if other automakers are running in shadow mode? How would we know?

But I suspect they aren't running in shadow mode, because they aren't thinking like software companies. They think in terms of COGS, so they want to charge for new hardware as soon as it goes into production. Even if they could see their way to deploying HAV-capable hardware quietly, they aren't set up to gather and analyze the massive amounts of data that this would generate.
 


That right there- "Even if they could see their way to deploying HAV-capable hardware quietly, they aren't set up to gather and analyze the massive amounts of data that this would generate."

At the end of the day they are still car companies playing catch-up.
 
Think about Tesla's software as a teenage driver.

It takes time for a teenager to learn how to drive, often starting in a classroom and then transitioning to ride-alongs with another driver in the car (possibly with a second set of controls).

And even after getting a license, it usually takes several years of experience to become proficient at driving in all conditions.

The AP 2.0 software development will follow a similar path, except that every car being produced now will have the AP 2.0 capability, and Tesla will use all of the new owners as "driving instructors" - first providing examples of how cars should be driven (in all conditions) and later testing to verify AP 2.0 is driving at least as safely as a human driver.

What Tesla is doing is very clever - effectively "crowdsourcing" AP 2.0 development - by getting a large number of cars on the road for training and testing the AP software.

Unless other manufacturers take a similar path - deploying their self-driving hardware in cars years before the self-driving feature is ready - they'll be at a significant disadvantage, because they'll have to use test vehicles or a smaller fleet of cars to do the learning and testing.

A case might even be made that Tesla should be giving away the AP 2.0 hardware, because of the value they are getting from using the cars for AP learning & testing.

Wait... They are doing that - every car has the AP 2.0 hardware, and if the owner doesn't activate it now, it can be activated later - which is a benefit Tesla is providing in exchange for using those cars as AP 2.0 trainers...
 
This thread is filled with Tesla fanboyism.

Many other large car companies have autonomous divisions working on their autonomous cars. They just don't go flaunting it, like Tesla does. They also don't release a product that's 90% complete, like Tesla does.

As of now, it seems that Tesla is ahead of the game leading to L4/L5, but who actually gets there first? Time will tell.



As for the whole fleet learning, I feel it was spun into a lot more than it's capable of. But I hope I'm wrong.
 
Can you provide some examples?

Sure. See
Semi-Autonomous Cars Compared! Tesla Model S vs. BMW 750i, Infiniti Q50S, and Mercedes-Benz S65 AMG - Feature

Or, for the TL;DR:
[Attached comparison chart: Lane Control.png]
 
You're right, my comment was more about software releases in a general sense*. But when it comes to Level 2 autonomous driving, Tesla is ahead of the game - I don't think anyone doubts that (and yes, I've seen that article before).


*When it comes to the "less important" stuff like media, nav, etc., Tesla often releases things when they're good enough but not complete, whereas the other manufacturers release more polished items.
 

Yes and no. (Well, yes on the music).

For example, compare the Tesla Nav with my GMC IntelliLink. So yes, the GMC has waypoints and a bunch of other stuff.

However, the GMC has the worst voice input system in the world, bar none. I had MUCH better systems in the early 1990s. It takes me about 15 minutes to enter an address - and once you enter it you have to switch to the main console to finish it with 3 steps, which requires you to take your eyes off the road. And even if you have a passenger (which the car knows, BTW), that passenger can't enter an address - even though it would be MUCH safer for a passenger to just use the keyboard than to have the driver distracted with 10 minutes of voice control.

The Tesla voice recognition works >95% of the time on the first attempt, and for the rest, my wife can enter it using the keyboard. So yeah, it technically can't do everything the GMC can do, but the parts that really matter work. The GMC is just a long feature list of things that don't work properly.

I'd call that a 40% implementation on the GMC part, even though it's technically more "complete".
 
So who here believes an "average" driver training AP is a scary thought? For that matter, who here believes they are an average driver?

I'm surprised by some of the comments RE the radar on 8+ now seeing 2 vehicles ahead. Really, two? Is that the twice as safe as an average driver that has been stated as the target goal?

I highly doubt anyone here fixates on one or two cars in front of them. To me that's not a safe defensive driving posture. While it might be a life saver in some cases, would you consider that a better driver than you? I don't. To me it's not enough to compete with the situational awareness that I'm sure all here employ without hesitation.

To me, if I'm not scanning everything as far as I can see down the road, I'm not doing a good job behind the wheel. There are times I'm slowing down and on the brake before several cars in front of me are, and I'm sure others here fall into that same defensive style too.

In 50 years of driving I've seen a lot of crazy things on the roads - things that are going to take more than I fear is being taken into consideration by all who are working to improve safety. Sure hope I'm wrong.

V2V communications in addition to AP might be an answer, but legacy vehicles will take a long time to cycle out, I'm sure.

Still want an S with AP2 as I feel it's going to be the best stopgap we'll see for a while, but I would not trust it with my life and that of my friends and family to the extent I've read on various threads here. Sleep between points A and B? Could be a long sleep. :(
 
I'm dubious that shadow mode is sufficient to test self driving software. It seems to me that many miles with an expert driver ready to take over are necessary to test it.
Hitting the brakes to avoid hitting an object is a case where I understand using shadow mode to test self driving. If the driver hits the brakes and shadow mode did not, that info along with all the sensor data can be sent to the mothership for examination and improvement.
However, consider changing lanes to pass a slow car. There are many possible times to do this. The fact that self driving software in shadow mode and the real driver choose to do it at different times is not an error worthy of sending to the mothership. I don't see a way to test automatic lane changes without making that aspect of the self driving software active and having the driver ready to take over if the lane change would be dangerous. This is just one example. I'm sure there are many others.
Can anyone think of a way to use shadow mode to test whether the self driving software properly knows when it is safe to change lanes?
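
For what it's worth, here's a rough sketch of how the braking comparison described above might work in shadow mode. Everything in it is hypothetical - the frame fields, the threshold, and the upload function are invented for illustration, not anything Tesla has described. The idea is simply: log only the frames where the human braked hard but the shadow policy would not have.

```python
# Hypothetical sketch of shadow-mode disagreement logging for braking events.
# All names and thresholds here are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float
    sensor_snapshot: dict        # camera/radar data captured at this instant
    driver_brake: float          # human brake pedal position, 0.0 - 1.0
    shadow_brake: float          # brake command the shadow policy *would* apply

HARD_BRAKE = 0.5                 # assumed threshold for a "real" braking event

def find_disagreements(frames):
    """Frames where the human braked hard but the shadow policy did not.

    Note what this deliberately ignores: routine timing differences (such as
    when to start a lane change) are not flagged, because two different but
    safe choices are not errors.
    """
    return [f for f in frames
            if f.driver_brake >= HARD_BRAKE and f.shadow_brake < HARD_BRAKE]

def upload_for_review(frames):
    # Stand-in for sending the sensor snapshots back for offline analysis.
    for f in frames:
        print(f"upload frame @ {f.timestamp}s: "
              f"driver={f.driver_brake:.2f}, shadow={f.shadow_brake:.2f}")

if __name__ == "__main__":
    log = [
        Frame(0.0, {}, driver_brake=0.0, shadow_brake=0.0),  # cruising, agreement
        Frame(1.0, {}, driver_brake=0.8, shadow_brake=0.1),  # human saw something the policy missed
        Frame(2.0, {}, driver_brake=0.6, shadow_brake=0.7),  # both braked, agreement
    ]
    upload_for_review(find_disagreements(log))
```

The lane-change question is exactly where this kind of filter breaks down: a timing disagreement produces a flag that isn't actually an error, so for that case something like an active feature with a driver ready to take over seems hard to avoid.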
 
Many other large car companies have autonomous divisions working on their autonomous cars. They just don't go flaunting it, like Tesla does.


Auto Makers Struggle With High-Tech Dashboard Screens


They cannot even nail down the infotainment systems in their vehicles - are you sure their autonomous systems are anywhere near as advanced? I cannot answer that.
 
From a machine learning perspective, the key is data. The more data you have, the better the algorithm will learn, and more data also helps the system handle cases the software hasn't seen before.

I don't know why the existing automakers are not doing it, other than they are not worried about Tesla. In the grand scheme of things, even though Tesla has the hardware for level 5, they don't have the software to support it. That will take time.
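
To put a rough number on "handling cases the software hasn't seen": if some corner case shows up once every p⁻¹ miles, the chance a fleet has observed it at least once after N miles is 1 - (1 - p)^N. A quick back-of-the-envelope sketch - the rate here is made up purely for illustration:

```python
# Back-of-the-envelope: probability a fleet has observed a rare corner case
# at least once. The occurrence rate is an assumption for illustration only.
corner_case_rate = 1e-6   # assumed: one occurrence per million miles driven

for miles in (100_000, 1_000_000, 10_000_000, 1_000_000_000):
    p_seen = 1 - (1 - corner_case_rate) ** miles
    print(f"{miles:>13,} miles -> P(seen at least once) = {p_seen:.3f}")
```

A small test fleet sits near the top of that table; a consumer fleet sits near the bottom, which is the whole argument for fleet-scale data collection.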
 
A report from RAND concluded that hundreds of millions, or even billions, of miles of driving data may be required to adequately verify that self-driving software is safe - and that this will never be possible with small fleets of test cars.

If Tesla has 100,000 AP 2.0 cars on the road (which could be possible by late next year when Model 3 goes into production), each driving on average 50 miles per day, that's 5 million miles of data - every day. As Tesla continues to improve the self-driving software, they can roll it out to their cars, have the software shadow drivers' actions, and quickly build up data verifying how well the software is doing compared to the human drivers.

Even test fleets driving 24x7 will never be large enough to get anywhere close to enough real-world testing to validate the software.
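
The arithmetic behind that, using the numbers above (100,000 cars at 50 miles per day) and, for comparison, a made-up 100-car test fleet running around the clock:

```python
# Rough comparison of time needed to accumulate 1 billion real-world miles.
# Fleet sizes and daily mileage are either the post's assumptions or invented
# for the test-fleet case; they are not measured figures.
TARGET_MILES = 1_000_000_000

def days_to_target(cars, miles_per_car_per_day):
    miles_per_day = cars * miles_per_car_per_day
    return TARGET_MILES / miles_per_day, miles_per_day

fleets = [
    ("Consumer fleet (numbers above)", 100_000, 50),
    ("24x7 test fleet (hypothetical)", 100, 600),   # ~25 mph average, all day
]

for label, cars, daily in fleets:
    days, per_day = days_to_target(cars, daily)
    print(f"{label}: {per_day:,.0f} miles/day -> "
          f"{days:,.0f} days (~{days / 365:.1f} years) to reach 1B miles")
```

On those assumptions the consumer fleet hits a billion miles in roughly 200 days, while the round-the-clock test fleet would need decades.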
 