Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

How well does Autopilot work compared to the competition?

There's also the very recent auto-parking test:


Based on this video, it seems that ProPilot has superior functionality in this realm. It's a shame that Tesla does not match it.
Haven't watched the video yet, but unfortunately, AFAIK (for whatever reason), the auto-parking feature is NOT available in the US versions of the '18 Leaf. It's available in Japan and obviously, some other countries.

Not sure if this will change in the future...
 
TeslaBjørn tested Nissan ProPilot on Tenerife and it seems pretty solid and on par with AP. Upgradable in service centers according to the video.
The OTA updates at Tesla have until now only compensated for an unfinished system imo. FSD - we shall see, maybe 3-6 months!?

Thanks for the video! He turns on ProPilot Assist at ~12:10 and off at ~16:30. At ~17:35, he does mention what he was told about updates, supposedly no OTA updates but there might be some available eventually at a service center.
Tesla is not even on the list; Cruise is quite impressive, behind Google.
Yeah, because Tesla decided to do 0 autonomous vehicle testing on CA public roads for all of 2017 despite such a large concentration of software engineering talent being in Silicon Valley and Tesla being also based there.

For all of 2016, Tesla did 550 miles of such testing on CA public roads and had somewhere past 100 disengagements. I heard a count of 182.
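For context on those 2016 figures, the metric usually drawn from the CA DMV filings is miles per disengagement. A quick back-of-the-envelope calculation using the numbers above (keeping in mind the 182 count is second-hand):

```python
# Figures quoted above from Tesla's 2016 CA DMV disengagement reporting:
# 550 autonomous miles; the 182-disengagement count is hearsay.
autonomous_miles = 550
disengagements = 182

miles_per_disengagement = autonomous_miles / disengagements
print(f"{miles_per_disengagement:.1f} miles per disengagement")  # ~3.0
```

For comparison, the leaders in those same reports were measuring thousands of miles per disengagement, which is why the ratio matters more than the raw mileage.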

For those wondering what I'm referring to, see these:
Autonomous Vehicle Disengagement Reports 2017
Autonomous Vehicle Disengagement Reports 2016
Testing of Autonomous Vehicles
 
I would love to see a genuine independent comparison of systems for 2018.
Genuine? - because there are so many competing big business agendas in the market I would doubt how independent any comparison will be.

My own guess at the current situation is that on well-marked roads many systems are now becoming quite competent, so it is easy to say they are at parity with Tesla. However, where I suspect Tesla have the edge is that their system is more effective than others in more scenarios (poorly marked roads, difficult lighting, etc.), which increases its utility enormously.

A comparative test of the systems' ability to avoid cars parked half in the lane might be a good test too.

I have also seen adverts here in the UK from manufacturers claiming their vehicles will stop for pedestrians.
This frankly terrifies me, as I simply do not believe any systems are mature enough to do this reliably; buyers who believe this functionality is reliable will hopefully never have to depend on it.
 
Yeah, because Tesla decided to do 0 autonomous vehicle testing on CA public roads for all of 2017 despite such a large concentration of software engineering talent being in Silicon Valley and Tesla being also based there.

For all of 2016, Tesla did 550 miles of such testing on CA public roads and had somewhere past 100 disengagements. I heard a count of 182.

For those wondering what I'm referring to, see these:
Autonomous Vehicle Disengagement Reports 2017
Autonomous Vehicle Disengagement Reports 2016
Testing of Autonomous Vehicles

You misunderstand Tesla's strategy, I think.

Tesla are reportedly running the AP software in "shadow mode" on a (presumably) good proportion of their AP2 fleet.

This means they do not need to run autonomously and declare disengagements, as they can analyse the data uploaded from a huge number of cars in real-world situations and process it at HQ. A further benefit of this is that nobody outside Tesla really knows how far advanced they are, so their development progress is not publicly announced to their competitors.
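Tesla has never documented how "shadow mode" works internally, so purely as an illustration of the idea described here (the software proposes an action in the background, the car only ever executes the human's input, and disagreements are logged for later analysis), a minimal sketch might look like this; every name and threshold below is a hypothetical stand-in:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    sensors: dict        # camera/radar/CAN readings for one timestep
    driver_steer: float  # what the human driver actually did

def policy_steer(sensors):
    """Stand-in for the background (shadow) driving policy."""
    return 0.0  # hypothetical: always proposes driving straight

def shadow_step(frame, threshold=0.1):
    """Run the policy in shadow: never actuate, only log disagreements."""
    proposed = policy_steer(frame.sensors)
    diverged = abs(proposed - frame.driver_steer) > threshold
    # In the fleet, a divergence like this is the kind of event that
    # would justify uploading a snapshot for analysis at HQ.
    return {"proposed": proposed, "actual": frame.driver_steer, "diverged": diverged}

log = [shadow_step(Frame({}, steer)) for steer in (0.0, 0.05, 0.4)]
print(sum(r["diverged"] for r in log))  # 1 of the 3 frames diverged
```

The key property is that the policy's output never reaches the controls, so divergences can be collected without any autonomous driving (or disengagement reporting) taking place.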
 
You misunderstand Tesla's strategy, I think.

Tesla are reportedly running the AP software in "shadow mode" on a (presumably) good proportion of their AP2 fleet.

This means they do not need to run autonomously and declare disengagements, as they can analyse the data uploaded from a huge number of cars in real-world situations and process it at HQ. A further benefit of this is that nobody outside Tesla really knows how far advanced they are, so their development progress is not publicly announced to their competitors.
No, I don't misunderstand it at all.

I'm asserting that the above strategy, while interesting, and while it helps fill in gaps and provides a fire hose of info, is probably insufficient. Observing non-AP driving and AP use (with EXTREMELY limited functionality and MANY limitations), without actually having to actuate controls on a real vehicle from that data, and without knowing how other vehicles and people sharing the road would actually react to such actuations, is very different from real-world testing.

Perhaps you should look at the Waymo safety report pointed to by Autonomous Car Progress. There's a ton of info in there but perhaps start with the example test scenarios on pages 36 to 39 and ask yourself "how many of these scenarios can AP2 handle now?" and "how much improvement or additional supported scenarios have we seen in AP2 over time?"

Sure, Tesla can conduct some real world testing outside CA, but how much do you really think is done outside CA given where so much of the software engineering talent is and where Tesla's HQ is located? It's a lot of extra overhead and hassle for someone in CA to have to work w/someone remotely or have someone else in another state try the latest build that's supposed to address an/some issues.

And, if Tesla's autonomous efforts are really that good so far, why the tiny # of autonomous miles driven in CA w/the horrible disengagement rate in 2016? They were all timed w/the publicity stunt video that CA DMV Report Sheds New Light On Misleading Tesla Autonomous Drive Video - DailyKanban points out. And, if they were that far ahead, wouldn't they be willing to give reporters rides (ala Nissan and GM Cruise Automation)? Waymo's been doing stuff like Waymo makes history testing on public roads with no one at the wheel. Why the need to hide? They should be able to show via disengagement reports some large # of miles and good ratio of disengagements vs. miles driven.

Tesla and Elon are all about hype. Shouldn't they be able to produce some hype that equals or rivals Nissan, Cruise Automation or Waymo on this?

You've seen Why testing self-driving cars in SF is challenging but necessary from the earlier thread I've pointed to, right? How well does AP2 handle this?

As I said, it's one thing to observe. It's another thing for software to actually actuate controls on a real vehicle in the real world w/others sharing the road.
 
Bottom line is that, in spite of claims to the contrary, none of us know where Tesla has got to with AP development beyond what is in the publicly released software today, nor the detail of their strategy or its effectiveness.

But enhancements are coming and when they do we will be in a position to assess them; and further they will apply to the existing fleet according to hardware specification.

The fact that Tesla choose not to run autonomous in CA and report performance publicly is all but irrelevant imo; they simply chose a different strategy is all.
 
Bottom line is that, in spite of claims to the contrary, none of us know where Tesla has got to with AP development beyond what is in the publicly released software today, nor the detail of their strategy or its effectiveness.

But enhancements are coming and when they do we will be in a position to assess them; and further they will apply to the existing fleet according to hardware specification.

The fact that Tesla choose not to run autonomous in CA and report performance publicly is all but irrelevant imo; they simply chose a different strategy is all.

There is no "shadow mode" AP testing; at least if there is, it's not in the public builds that you or I are using on the roads. The only thing that is "shadow" about Tesla is that your car uploads small clips to the "mothership" even if you opted out in the privacy policy.
 
There is no "shadow mode" AP testing; at least if there is, it's not in the public builds that you or I are using on the roads. The only thing that is "shadow" about Tesla is that your car uploads small clips to the "mothership" even if you opted out in the privacy policy.
So, can someone humor me on how the cars decide to upload snippets? Presumably this doesn't happen randomly. I suppose only when the car detects an unexpected event, the clip / context is uploaded.

So how do you define something as unexpected, unless you have an expectation? Who is creating the expectation? Calling it "shadow mode" could range from reasonable to charitable. But it's hard to say nothing of that sort exists.

I see @verygreen disputes the existence of the mode, but the question here is not one of existence but of capability. And to that, none of us has a clue.
 
So, can someone humor me on how the cars decide to upload snippets? Presumably this doesn't happen randomly. I suppose only when the car detects an unexpected event, the clip / context is uploaded.
Well, it DOES happen randomly. But not just that. Basically, when Tesla needs more data, they upload "triggers" that match some criteria to a subset of cars. Typically those criteria range from "randomly, with probability of x%, capture something" to "if we are about to hit something and it's time to trigger AEB, capture a snapshot". Triggers select what they want captured, from full video from the front cams and 1 fps video from all cams, down to "just a radar snapshot" or "just a CAN bus snapshot". Triggers also have limits on how many snapshots they can create, ranging from 1 to a bigger number, often 10 to 20 for random events and 1 to 2 for nonrandom ones.

Those triggers go around and often have a very limited duration, like 24 hours. The total number of such campaigns at present is probably still under 100 (I saw campaign #64 in November) and I have seen 4 of them (that should give you an idea of how many cars they hit with this at once).

The cars that don't receive a trigger don't take any snapshots other than the unconditional ones. Early on, the unconditional snapshots were rich; up until sometime in June, every FCS would generate a front-cam video snapshot, but then they became very sparse. Now the unconditional ones pretty much include: component failure (something did not turn on or something crashed; cams not included unless it led to a crash), the "rob-silent"/"rob-active" snapshots to detect false radar positives, and the "car is crashing" snapshot when the car is detected to be in a crash.

So somebody on reddit unearthed an actual definition from Elon of "shadow mode" that pretty much says "we'll make a snapshot when a condition is met", and THIS does exist, but at a much smaller scale than implied, because not every car gets the conditions, the number of triggers is limited, and the time a condition stays in place is limited. If the interesting condition happens without an active trigger, it's not reported.

But the problem is that people read so much more into that definition, so before denying the existence of "shadow mode" I typically ask people what they think "shadow mode" is, and oftentimes what they think it is does not actually exist.
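None of this is a published API, but purely to make the mechanics above concrete (a condition, a capture type, a snapshot cap, and a roughly 24-hour lifetime), a trigger campaign could be modeled like this; all field and condition names are my own invention:

```python
import random
import time
from dataclasses import dataclass

@dataclass
class Trigger:
    """Hypothetical model of a data-collection "trigger" campaign; the
    field names are invented, only the behavior (condition, capture type,
    snapshot cap, short lifetime) mirrors the description above."""
    condition: str       # e.g. "random" or "aeb_imminent"
    probability: float   # only used by random triggers
    capture: str         # "front_video", "all_cams_1fps", "radar", "canbus"
    max_snapshots: int   # often 10-20 for random triggers, 1-2 for event ones
    expires_at: float    # campaigns reportedly last on the order of 24 hours
    taken: int = 0

    def should_fire(self, event, now, rng=random.random):
        """Decide whether this trigger captures a snapshot right now."""
        if now > self.expires_at or self.taken >= self.max_snapshots:
            return False  # campaign expired or snapshot budget used up
        if self.condition == "random":
            return event is None and rng() < self.probability
        return event == self.condition

now = time.time()
trig = Trigger("aeb_imminent", 0.0, "front_video", 2, now + 24 * 3600)
if trig.should_fire("aeb_imminent", now):
    trig.taken += 1
print(trig.taken)  # 1: the AEB event matched an active trigger
```

A car without any active trigger simply never calls `should_fire` for campaign captures, which matches the point that an interesting condition with no active trigger goes unreported.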
 
There is no "shadow mode" AP testing; at least if there is, it's not in the public builds that you or I are using on the roads. The only thing that is "shadow" about Tesla is that your car uploads small clips to the "mothership" even if you opted out in the privacy policy.
https://www.dmv.ca.gov/portal/wcm/connect/f965670d-6c03-46a9-9109-0c187adebbf2/Tesla.pdf?MOD=AJPERES has a description of what Tesla claims is going on with "shadow mode" testing along with their other efforts in addition to explaining why they have 0 miles of autonomous testing on CA public roads.

Google searches for elon musk "shadow mode" will turn up various claims of what's going on.

As usual, as someone else pointed out (Tesla charged me $70 to screw on lug nuts), Tesla's communication style tends to be one "where he (Elon) says something and lets people's imaginations run wild."
 
https://www.dmv.ca.gov/portal/wcm/connect/f965670d-6c03-46a9-9109-0c187adebbf2/Tesla.pdf?MOD=AJPERES has a description of what Tesla claims is going on with "shadow mode" testing along with their other efforts in addition to explaining why they have 0 miles of autonomous testing on CA public roads.
I highly doubt they had billions of miles driven while the triggers were active as claimed in their unexpectedly verbose report.

Also I imagine people realize that not everything could be tested in such a mode, but mostly only passive things like "brake for traffic"/AEB; things that would result in a disengagement.

Active features like "overtake"/"change lane by itself" would be a lot harder to test in that mode.
 
Bottom line is that, in spite of claims to the contrary, none of us know where Tesla has got to with AP development beyond what is in the publicly released software today, nor the detail of their strategy or its effectiveness.

But enhancements are coming and when they do we will be in a position to assess them; and further they will apply to the existing fleet according to hardware specification.

The fact that Tesla choose not to run autonomous in CA and report performance publicly is all but irrelevant imo; they simply chose a different strategy is all.
While I agree with your first two paragraphs, I disagree with the last one.

Others are also doing simulations, besides doing real world testing. Search these for simul, for example.
Inside the secret city where Waymo is testing its driverless cars
Waymo built a fake city in California to test self-driving cars

Perhaps Tesla will be in for a rude awakening once they try deploying what they think is ready to hard places for autonomous vehicles to drive in (and annoying for real drivers) like the city of SF and other cities w/lots of traffic, pedestrians and bicycles. See the GM Cruise medium.com piece I pointed to.

What would be even worse and harder are places like major cities in India or China, or other places where there's lots of traffic, pedestrians and other non-car traffic sharing the road and where lanes and traffic rules are loosely followed.
 
@verygreen's posts suggest what I suspected all along: that Tesla are running the NN back at HQ and variously uploading real-world images from the fleet to test performance against the areas they are focusing on. What I also suspect is that a certain very select group of cars/owners/beta testers are indeed running the software live in their cars in "shadow mode", but that this is not the case across the entire fleet, as the magnitude of data generated would be impossible to process.

So really the approach could be:
- test at HQ in simulation mode
- upload selected test cases from real-world vehicles/environments periodically (Tesla have vastly more cars available to generate test cases than all the others put together)
- run the software in a select number of cars in the real world in "shadow mode" and measure performance vs the simulator
- rinse and repeat

... and then incrementally release surprisingly mature software features?

suggestions/suspicions ... if only we knew

Waymo have set such a high standard (and we should all be thankful for this) that others have to be incredibly careful not to be obviously below par, else they will be slaughtered.

now if only they could fix the damn car trying to jink off the road at every mild crest ...
 
I can’t speak for HW2/2.5 cars, but my HW1 car is nearly perfect as to driving. I do not remember the last time it did not stay in the center of the lane. When overtaking, I put on the signal and the car will wait for clearance before moving into the lane. I usually take control in construction areas, but the other day I forgot and realized it had negotiated a difficult area perfectly. I am of the opinion that the radar was locked on the car it was following. It may not be smart, but it sure does dumb well.
 
I do not remember the last time it did not stay in the center of the lane

I was in narrow lanes today (with HW1/AP1) in a construction area. Original lines blacked-out but still "obvious" (although replacement lines good). Sadly no dashcam, which would have been handy ...

Whilst something was alongside me it stayed in lane (which, here in the UK on a construction site, is not much wider than an MS :eek:), but once there was nothing to the side of me it straddled the line a bit, even though the dashboard showed the car perfectly central between the blue-line lane markers. A bit later on it decided it was not actually in lane and moved over such that it was (can't remember if I had caught up with other traffic, which would perhaps have given it a clue ...). This happened more than once.

The cars in front (in my lane and other lane) showed on the dashboard as overlapping - i.e. as though the vehicle in other lane was intruding into my lane - whereas this was not actually the case (based on road-lines rather than "reasonable lane width assumption"!)

Either way, AP1 handled it pretty much perfectly (except for anyone wanting to undertake me, as it wasn't leaving enough room for that when there was no traffic in the other lane).

But it did jump on the brakes at a number of points when it considered the gap too narrow or something like that. Only on one occasion did it give me red-hands-on-wheel, but I kept my foot hovering over the throttle pedal, ready to feather it to avoid surprising the following vehicle!

All-in-all I continue to find AP1 very impressive ... although no idea if it is on par with the offerings from other Marques (let alone how AP2 does/will compare with them)
 
No, I don't misunderstand it at all.

I'm asserting that the above strategy, while interesting, and while it helps fill in gaps and provides a fire hose of info, is probably insufficient. Observing non-AP driving and AP use (with EXTREMELY limited functionality and MANY limitations), without actually having to actuate controls on a real vehicle from that data, and without knowing how other vehicles and people sharing the road would actually react to such actuations, is very different from real-world testing.

Perhaps you should look at the Waymo safety report pointed to by Autonomous Car Progress. There's a ton of info in there but perhaps start with the example test scenarios on pages 36 to 39 and ask yourself "how many of these scenarios can AP2 handle now?" and "how much improvement or additional supported scenarios have we seen in AP2 over time?"

Sure, Tesla can conduct some real world testing outside CA, but how much do you really think is done outside CA given where so much of the software engineering talent is and where Tesla's HQ is located? It's a lot of extra overhead and hassle for someone in CA to have to work w/someone remotely or have someone else in another state try the latest build that's supposed to address an/some issues.

And, if Tesla's autonomous efforts are really that good so far, why the tiny # of autonomous miles driven in CA w/the horrible disengagement rate in 2016? They were all timed w/the publicity stunt video that CA DMV Report Sheds New Light On Misleading Tesla Autonomous Drive Video - DailyKanban points out. And, if they were that far ahead, wouldn't they be willing to give reporters rides (ala Nissan and GM Cruise Automation)? Waymo's been doing stuff like Waymo makes history testing on public roads with no one at the wheel. Why the need to hide? They should be able to show via disengagement reports some large # of miles and good ratio of disengagements vs. miles driven.

Tesla and Elon are all about hype. Shouldn't they be able to produce some hype that equals or rivals Nissan, Cruise Automation or Waymo on this?

You've seen Why testing self-driving cars in SF is challenging but necessary from the earlier thread I've pointed to, right? How well does AP2 handle this?

As I said, it's one thing to observe. It's another thing for software to actually actuate controls on a real vehicle in the real world w/others sharing the road.
Neither you nor I really know where Tesla is up to, but in Tesla’s defence they are unique in having hundreds of thousands of cars already available to gather data.
This unique ability could give them ways of developing autonomy unavailable to other players. For example, they could keep tweaking algorithms until the cars would have done what the drivers actually did. This could all be done in silico.
 
Neither you nor I really know where Tesla is up to, but in Tesla’s defence they are unique in having hundreds of thousands of cars already available to gather data.
This unique ability could give them ways of developing autonomy unavailable to other players. For example, they could keep tweaking algorithms until the cars would have done what the drivers actually did. This could all be done in silico.
To expand on this, Tesla can maintain a permanent record of everything each sensor and camera recorded, as well as a permanent record of all driver input.
Even if all Tesla vehicles were grounded tomorrow, Tesla could have millions of trips already recorded and available for analysis.
Sufficiently good AI (if Tesla had it) would be able to analyse these trips and produce software that would have taken the same actions as the driver in every scenario of every trip.
Even though this is conjecture, it does mean that we shouldn’t write off Tesla's lack of road testing in CA as lack of effort (although it could be).
Always nice to be uncertain...
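The "tweak it until it would have done what the drivers did" idea is, in spirit, offline imitation learning: replay logged trips and score each candidate policy by how often it reproduces the recorded driver action. A toy sketch, with entirely made-up data and policies:

```python
def agreement(policy, trips, tol=0.1):
    """Fraction of logged timesteps where a candidate policy's output
    matches what the human driver actually did (within tol)."""
    total = matched = 0
    for trip in trips:  # each trip is a list of (sensor_reading, driver_action)
        for sensors, driver_action in trip:
            total += 1
            if abs(policy(sensors) - driver_action) <= tol:
                matched += 1
    return matched / total if total else 0.0

# Made-up logged data: the "sensor reading" is just a lane offset, and the
# driver steered proportionally to correct it.
trips = [[(0.2, -0.2), (-0.1, 0.1), (0.0, 0.0)]]

naive = lambda s: 0.0   # candidate 1: never steers
tuned = lambda s: -s    # candidate 2: proportional correction

print(round(agreement(naive, trips), 2), agreement(tuned, trips))  # 0.67 1.0
```

Because the whole loop runs against recorded data, a new candidate can be scored without a single real-world mile, which is exactly the "in silico" point made above.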