Autonomous Car Progress

I agree, it's just in direct conflict with "Tesla emphasized that not all these owners have purchased or activated this feature". Tesla's incomplete, conflicting information gets us again!

The really interesting thing is this doc mainly covers just the traffic control detection/response. They need all of this just to know where a traffic light is, one of the most obvious things humans put on roads, and they clearly were only detecting this with vision, so this isn't changing anytime soon.

We also get this!

Yeah, it does not instill confidence. It detects all kinds of arrow signs for open/closed lanes as traffic lights too; I had to turn the darn thing off.
What it lacks is context (what is this really, based on its relation to other things?). There's no "thought process" reflecting on what a thing really is. The "AI" is dumb as a brick really, just pure object recognition. They also cheat with speed signs, having a tag in the map data to prime it for detection.
The LED speed signs are also not detected.

[attached screenshot]
 
I agree, Tesla is probably referring to actual FSD sales. I will say I am a bit surprised at the numbers. 88k FSD in the US seems lower than I would have thought. I guess I assumed more owners had purchased FSD.


That was May 2020 FWIW.

At the start of 2020, cumulative US sales of HW2.x/HW3 cars were ballpark 450k vehicles, so 88k+ doesn't seem crazy given that most folks have guessed historical take rates in the 20-30% range.
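A quick back-of-the-envelope check of that take rate (the ~450k fleet size is just the ballpark estimate above, not an official figure):

```python
# Rough FSD take-rate check. The fleet size is a ballpark estimate from this
# thread, not an official Tesla number.
us_hw2_hw3_fleet = 450_000   # approx. HW2.x/HW3 cars sold in the US by start of 2020
us_fsd_purchases = 88_000    # FSD count reported for the US

take_rate = us_fsd_purchases / us_hw2_hw3_fleet
print(f"Implied take rate: {take_rate:.1%}")  # ~19.6%, right around the low end of 20-30%
```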
 
Apple's been in the background for years. I read a short article that they had some high-level turnover (Apple Loses Multiple Top Managers From Self-Driving Car Division - Slashdot). That's not surprising given what's happening in the autonomous industry.

They've got the $ to go out and buy anyone, outside of Tesla.

According to the CA DMV Report that Apple filed for 2020, they did 18,805 autonomous miles and had 130 safety disengagements. That's a safety disengagement rate of 1 per 144 miles. Suffice it to say, that's not a very good autonomous driving system. Yeah, I think Apple probably would be better off just buying an AV company.
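That rate is just total autonomous miles divided by reported safety disengagements; a minimal sketch of the arithmetic:

```python
# Miles per safety disengagement, using the figures from Apple's 2020 CA DMV report quoted above.
autonomous_miles = 18_805
safety_disengagements = 130

miles_per_disengagement = autonomous_miles / safety_disengagements
print(f"{miles_per_disengagement:.1f} miles per disengagement")  # ~144.7, i.e. roughly 1 per 144-145 miles
```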
 
... HD map skeptics will probably always be able to say that Tesla's HD maps are less detailed than those used by other AV companies and therefore superior.
Google defines their HD maps as centimeter-accurate 3D maps; 2D maps don't count as HD maps per Google's definition. That said, I find the entire argument about HD maps silly. Some people believe that the term "HD map" implies a map that is difficult to generate, but Google has said creating their HD maps is not a big deal.
 
According to the CA DMV Report that Apple filed for 2020, they did 18,805 autonomous miles and had 130 safety disengagements. That's a safety disengagement rate of 1 per 144 miles. Suffice it to say, that's not a very good autonomous driving system. Yeah, I think Apple probably would be better off just buying an AV company.
You'd need to know why it's disengaging to make that judgment call.
 
You'd need to know why it's disengaging to make that judgment call.

I do know why it disengaged. It is in the DMV report. Here is the summary:

Disengagement Cause | Number of Disengagements
Hardware diagnostic caused software kick out | 25
Hardware diagnostic detected hardware health issue | 3
Incorrect map encoding lead to undesirable motion plan | 1
Incorrect perception lead to undesirable motion plan | 1
Incorrect perception of traffic signal lead to undesirable motion plan | 4
Incorrect prediction lead to undesirable motion plan | 10
Incorrect prediction lead to undesirable motion plan violating keep clear zone | 2
Incorrect prediction lead to undesirable motion plan | 8
Incorrect prediction of parked vehicle caused undesirable motion plan | 1
Incorrect prediction of vehicle caused undesirable motion plan | 2
Motion control health check caused software kick out | 46
Motion plan exceeded speed limit | 1
Motion planning unable to produce valid trajectory | 1
Reduced visibility of a vehicle due to occlusions resulted in an undesirable motion plan | 1
Safety driver discomfort due to selected motion plan | 1
Safety driver performed improper robotic mode engagement | 1
Sensor data mismatch caused software kick out | 15
Sensor/Perception discrepancy resulted in incorrect predictions for motion planning | 2
System issue interrupted driving algorithm | 2
Undesirable motion plan violating keep clear zone | 1
Undesirable motion plan violating traffic signal | 2
Total | 130
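If you group those causes into broader buckets, most of the disengagements look like hardware/software health kick-outs rather than perception or planning failures. A rough tally (the bucket names and groupings are my own, not categories from the DMV report):

```python
# Rough grouping of the 130 disengagements in the table above.
# Counts are copied from the table; the buckets are my own grouping,
# not categories from Apple's DMV report.
buckets = {
    # software kick-outs, hardware health, motion control health checks,
    # sensor data mismatch, system issues
    "hardware / software health": 25 + 3 + 46 + 15 + 2,
    # incorrect perception, traffic signal misperception, occlusion,
    # sensor/perception discrepancy
    "perception": 1 + 4 + 1 + 2,
    # incorrect predictions of other vehicles, parked vehicles, keep-clear zones
    "prediction": 10 + 2 + 8 + 1 + 2,
    # map encoding, speed limit, invalid trajectory, keep-clear / traffic-signal violations
    "planning / maps": 1 + 1 + 1 + 1 + 2,
    # driver discomfort, improper engagement
    "safety driver": 1 + 1,
}

total = sum(buckets.values())
assert total == 130  # matches the report's stated total
for name, count in sorted(buckets.items(), key=lambda kv: -kv[1]):
    print(f"{name:28s} {count:3d}  ({count / total:.0%})")
```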
 
Turns out that Tesla does use HD maps. At least this controversy can finally be put to bed.
They partner with TomTom, but it's not clear if they buy HD maps from TomTom (HD map | TomTom), though TomTom claims their HD maps are used in 3 million vehicles, so it seems plausible.

Other interesting new information is that 88k Teslas have FSD (24k in CA).
Tesla did use HD maps in the past (the type where you can tell exactly where a lane/turn is with high accuracy, typically centimeter accuracy, which is the typical definition). It's unclear how much they still use that now.

That paraphrasing of the phone conversation doesn't make clear whether this is the same type of map. It only mentions marking traffic signals and signs in the map, which from other discussions we already know Tesla has been doing continually (it's not clear, however, how accurate a map they are using for lanes). @diplomat33 makes a good point: this is probably more like an "MD" map, which has more markings than your typical navigation system map, but is not necessarily detailed enough to count as an "HD" map as most autonomy companies define it.
 
this is probably more like an "MD" map, which has more markings than your typical navigation system map, but is not necessarily detailed enough to count as an "HD" map as most autonomy companies define it.
This is something that Karpathy specifically touched on in the 2020 talks (both the February and June ones):
at the 6:30 mark he says "We do not build HD maps"
at the 10:05 mark he says "we do have maps of course that we build but they're not high-definition maps"
The same thing was said in the CVPR 2020 talk he gave in June 2020.
 
This is something that Karpathy specifically touched on in the 2020 talks (both the February and June ones):
at the 6:30 mark he says "We do not build HD maps"
at the 10:05 mark he says "we do have maps of course that we build but they're not high-definition maps"
The same thing was said in the CVPR 2020 talk he gave in June 2020.
That's interesting. So basically the claim is Tesla doesn't build them (as they did in the past). Doesn't necessarily mean Tesla doesn't use them.
 
So basically the claim is Tesla doesn't build them (as they did in the past). Doesn't necessarily mean Tesla doesn't use them.
Context is everything.
Listen to the ~4 minutes from the 6-minute mark.
He doesn't say they "don't build them but may use them"; he says they are not necessary, and that when Tesla Vision sees an intersection, it is vision that does the heavy lifting and is "seeing it for the first time" (which would be the opposite of HD maps).
 
This is something that Karpathy specifically touched on in the 2020 talks (both the February and June ones):
at the 6:30 mark he says "We do not build HD maps"
at the 10:05 mark he says "we do have maps of course that we build but they're not high-definition maps"
The same thing was said in the CVPR 2020 talk he gave in June 2020.
That's interesting. So basically the claim is Tesla doesn't build them (as they did in the past). Doesn't necessarily mean Tesla doesn't use them.

Both sides could be right. Tesla does not build HD maps, but they might use MD maps.
 
I've been reading a lot about AV safety lately, trying to educate myself on what the standards should be for measuring AV safety. I might do a "primer" thread on standards of AV safety if there is enough interest. So let me know, either in this thread or in a private message, if you would be interested in that.

Anyway, I came across this interesting article, entitled "What counts as a valid measurement in autonomous vehicle development?", by Ziv Binyamini, CEO and Co-Founder of Foretellix:

In the article, he says that disengagement rates are not an effective metric for measuring AV safety. Instead, he proposes an approach called Coverage Driven Verification (CDV):

"An alternative approach for both creating and tracking effective metrics of AV safety is the Coverage Driven Verification (CDV) approach. CDV originated in the semiconductor industry, where it has been widely used for many years. In recent years, we adapted the approach for safety-verification of AVs. It provides a methodical way to specify scenarios and mixed scenarios required for achieving an objective metric for the completeness of safety requirements.

In utilizing our approach, first, high-level requirements including the operational design domain (ODD) are defined. A comprehensive list of risk dimensions is written based on the requirements. Using the risk dimensions, a comprehensive verification plan is created. The plan includes quantifiable, measurable goals down to the parameter level including both coverage metrics and KPIs. The next step is to define the high-level abstract scenarios and their parameters using a scenario description language. The combinations of the parameters in an abstract scenario create a plethora of concrete scenarios to be tested.

The concrete scenarios are run across different testing platforms, including simulators, hardware-in-the-loop, test tracks and street driving, and produce meaningful metrics which are aggregated into a comprehensive coverage and KPI report. Analysis of the report drives the verification effort to hone in on non-covered areas, while intense constraints-based random test generation is used to complete the coverage holes and to explore areas beyond the known scenarios to assure the highest level of safety."

Article source: What counts as a valid measurement in autonomous vehicle development? | Automotive Testing Technology International

Just thought I would share. I find the idea of CDV to be intriguing.
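To make the workflow a bit more concrete, here is a toy sketch of the CDV loop as I understand it. Everything in it (the scenario, parameters, coverage bins, KPI threshold) is invented for illustration; it is not Foretellix's scenario language or any real tool:

```python
# Toy sketch of coverage-driven verification (CDV) for one abstract AV scenario.
# All scenario parameters, bins, and thresholds here are made up for illustration.
import itertools
import random

# 1. Abstract scenario: a "cut-in", parameterized along its risk dimensions.
SPEED_MPH = [25, 45, 65]      # ego speed bins
CUT_IN_GAP_M = [5, 15, 30]    # gap at which the other car cuts in
ROAD = ["dry", "wet"]

# 2. Combinations of the parameters define the concrete scenarios to cover.
plan = list(itertools.product(SPEED_MPH, CUT_IN_GAP_M, ROAD))

def run_scenario(speed, gap, road):
    """Stand-in for executing one concrete scenario on a simulator / HIL rig / test track."""
    # Fake KPI: pretend high speed, small gaps, and wet roads reduce min time-to-collision.
    risk = speed / 65 + 5 / gap + (0.3 if road == "wet" else 0.0)
    return max(0.1, 3.0 - risk + random.uniform(-0.2, 0.2))  # min TTC in seconds

# 3. Execute the plan, recording which bins were exercised and the KPI for each.
results = {params: run_scenario(*params) for params in plan}

# 4. Aggregate into a coverage/KPI report; failing bins are the "holes" that the next
#    round of constrained-random test generation should target.
failing = {p: ttc for p, ttc in results.items() if ttc < 1.0}
print(f"Exercised {len(results)}/{len(plan)} bins; {len(failing)} below the 1.0 s TTC goal")
for (speed, gap, road), ttc in sorted(failing.items(), key=lambda kv: kv[1]):
    print(f"  speed={speed} mph, gap={gap} m, road={road}: min TTC {ttc:.2f} s")
```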
 
CDV sounds a lot like how you do real, controlled, internal testing in any safety critical system before you even start real world testing.

The thing is, if you did all of that sufficiently, your real world disengagement rate would be low. So I think both are important.

In aviation, we don't say "well, we did a bunch of tests, so it doesn't matter that airliners crash every 10,000 flights". You do the tests to make that statistically unlikely, then you prove it with real world exposure, but if the real world is unacceptable, the system is still unacceptable.

In our current case, what the real world disengagement rate is telling us is that the testing (CDV) was woefully inadequate for any system that is suggested to be L3 or higher.
 
CDV sounds a lot like how you do real, controlled, internal testing in any safety critical system before you even start real world testing.

The thing is, if you did all of that sufficiently, your real world disengagement rate would be low. So I think both are important.

In aviation, we don't say "well, we did a bunch of tests, so it doesn't matter that airliners crash every 10,000 flights". You do the tests to make that statistically unlikely, then you prove it with real world exposure, but if the real world is unacceptable, the system is still unacceptable.

CDV includes real-world testing. It says testing would include "simulators, hardware-in-the-loop, test tracks and street driving". Street driving would be real-world testing IMO.

My understanding is that CDV is not just doing a bunch of internal testing and then, if the results look good, going straight to large-scale deployment. At least, that is not how I read it.

In our current case, what the real world disengagement rate is telling us is that the testing (CDV) was woefully inadequate for any system that is suggested to be L3 or higher.

I don't think we really know who is using CDV in their current AV development, do we? Not unless the AV company mentions CDV in their safety reports somewhere. So I don't know if we can assume that real-world disengagement rates prove CDV to be inadequate.

Also, disengagement rates can be misleading if they are not properly segmented and normalized. So the problem could be with the disengagement rate metric, not with CDV.
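To make the segmenting/normalizing point concrete, here is a small hypothetical example (all numbers invented): two fleets with identical overall miles-per-disengagement can look very different once you split their miles by driving environment.

```python
# Hypothetical example of why raw disengagement rates need segmenting.
# All numbers below are made up for illustration.
fleets = {
    # environment: (miles, disengagements)
    "Fleet A": {"suburban": (90_000, 30), "dense urban": (10_000, 70)},
    "Fleet B": {"suburban": (10_000, 20), "dense urban": (90_000, 80)},
}

for name, segments in fleets.items():
    total_miles = sum(m for m, _ in segments.values())
    total_dis = sum(d for _, d in segments.values())
    # Both fleets come out to 1,000 miles per disengagement overall...
    print(f"{name}: overall {total_miles / total_dis:.0f} miles/disengagement")
    # ...but the per-environment numbers tell a very different story.
    for env, (miles, dis) in segments.items():
        print(f"  {env:12s} {miles / dis:7.0f} miles/disengagement")
```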
 
Wouldn't it be very difficult to keep maps accurate all the time for every route the cars operate on?
That's why everyone is working on automating map creation. I'm pretty sure I drive better with the mental maps I have of common routes. It makes sense that machines would too.
For example, Mobileye uses vehicles equipped with their hardware to generate maps: Mobileye REM™ - Road Experience Management
 