Welcome to Tesla Motors Club

FIRMWARE UPDATE! AP2 Local road driving...and holy crap

I have a question. On 17.7.2, does the car implement what it is learning while you are driving it? Or does the car improve only after it gets an update from Tesla?

In other words, if I drive the same road many times before I get an update, will the car handle that road better?

The car could get better any time it downloads a new "tile" from Tesla for the area you are driving in. Of course, that all happens in the background, so you have no way of knowing when you get a new version of an AP "tile". (Or at least that is how we understood AP1 to work.)
 
This line of thinking is naive and downright laughable.

Basically what you are saying is: I got a 1.1 GPA, but don't pay attention to it, because I'm also going to school at another university taking the same courses and major, and I promise you I'm doing way better.

The obvious objection is, if you are taking the same major somewhere else, then that intelligence and knowledge would translate to this university. But since your GPA here is a 1.1, you couldn't possibly be sporting a 3.5 somewhere else.

If Tesla FSD were so good, you wouldn't need multiple takes to make a video; you would do it in one attempt. Nor would you have any disengagements. But Tesla was having so many problems that they had to delay the announcement because they couldn't get one continuous shot. They settled for a cut-and-paste edit, and only a month later were they able to get a successful end-to-end drive.
Look. It's fine to disagree, but why insult someone?
 
No, it won't. Learning is central and then distributed.
You are right that learning is central and then distributed, but as @MP3Mike points out, it does not necessarily have to wait until an update to be distributed. The high-res maps supposedly are distributed as map tiles when you enter a certain area. That allows a car to get "better" in a certain area even though it has not updated to a new firmware version yet.
 
You are right that learning is central and then distributed, but as @MP3Mike points out, ...

Aren't the maps just from Google? This seems to suggest that there's some element of "Teslafication" to the map data.
 
Aren't the maps just from Google? This seems to suggest that there's some element of "Teslafication" to the map data.

There are three sets of maps in use in a Tesla:
* Google maps displayed on the 17" screen
* Navigon maps used for the navigation system and displayed on the IC (instrument cluster)
* Tesla AP map "tiles" used by the AP system to help it drive the roads.
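Tesla has never documented its AP tile format, but the "tile" concept itself is standard web-mapping practice: divide the world into a grid of cells per zoom level and download cells on demand as the car enters them. Here is a minimal sketch of that idea in Python; the Web-Mercator tile math is the standard "slippy map" scheme, while the fetch callback and cache policy are invented for illustration and are not Tesla's actual design:

```python
import math

def latlon_to_tile(lat_deg: float, lon_deg: float, zoom: int) -> tuple:
    """Standard 'slippy map' tiling: Web-Mercator tile indices at a zoom level."""
    lat = math.radians(lat_deg)
    n = 2 ** zoom                       # n x n tiles cover the world at this zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return x, y

class TileCache:
    """Download-on-entry cache: fetch a tile the first time the car enters its area."""
    def __init__(self, fetch):
        self.fetch = fetch              # callable (zoom, x, y) -> tile payload
        self.tiles = {}                 # (zoom, x, y) -> cached payload
    def tile_for(self, lat, lon, zoom=14):
        key = (zoom, *latlon_to_tile(lat, lon, zoom))
        if key not in self.tiles:       # a real system would fetch in the background
            self.tiles[key] = self.fetch(*key)
        return self.tiles[key]
```

Two nearby GPS fixes map to the same tile key, so the second lookup is served from the cache; that is what would let a car "get better" in an area it has already visited without a full firmware update.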
 
I think Tesla underestimated the value Mobileye brought to the table. It seems they are in over their heads a bit.

Hopefully hiring one of the greatest software development wizards away from Apple (Chris Lattner, Tesla's new VP of Autopilot Software) will put them back on track in due time.


That is right. I think they bit off way more than they could chew. I think Elon was right that it would be working by the end of December. December 2017.
 
Was the FSD video using the new Tesla Vision system or was that car still using MobileEye?

They were using the full Nvidia software suite. Now they are trying to duplicate Mobileye using neural networks, and it isn't that easy. I think they are going to have to start using more cameras before they can duplicate the accuracy of the Mobileye camera.

Mobileye was not using neural nets. It was using computer vision.
 
They were using the full Nvidia software suite, ...

I don't believe this is true. Do you have a reference? This would leave Tesla hopelessly tied to yet another vendor, whereas they have already announced that they are designing their own SoC (which is essential, IMHO). I believe they are using the raw platform and doing all their own training and a good bit of the fundamental ML. NVIDIA is developing a complete auto suite to sell to automakers, doing their own image recognition training and algorithms at all levels for Level 3, 4, 5 automated driving. Tesla would never allow them to share the crown jewels: the training data from the Tesla fleet. And without that data included in the training, there is no Tesla advantage.

I could be convinced that Tesla is using some basic off-the-shelf neural nets (e.g., car/object recognition) from NVIDIA in the CURRENT AP2 software, but that software is likely a throw-away stopgap solution just to get mock-AP1 feature parity and to get past the Mobileye debacle. Once Tesla trains its own nets (using more cameras), I think they will at least TRY to go it alone.
 
I believe Mobileye also uses some human markup of scenes, to help train their systems. That could be a good "moat" against competition. It's time-consuming and labor-intensive to duplicate, which discourages others from trying to duplicate it. They could, but it'll be tempting to try to replace it with more software — and that may not be easy either. I've heard the claim that deep learning already beats humans at some tasks, but that's specific to a benchmark. In real life I suspect that humans using a well-designed quorum-based approach can still be more accurate.
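The "quorum" idea mentioned above is just label aggregation: ask several human annotators to mark the same scene and keep only answers that reach a majority threshold. A toy sketch (the labels, threshold, and function name are invented for illustration, not anything Mobileye has published):

```python
from collections import Counter

def quorum_label(votes, min_agreement=0.6):
    """Majority vote over human annotations.

    Returns the winning label only if its share of the votes reaches
    the quorum threshold; otherwise returns None (scene needs review).
    """
    if not votes:
        return None
    label, count = Counter(votes).most_common(1)[0]
    return label if count / len(votes) >= min_agreement else None
```

With three annotators agreeing two-to-one, `quorum_label(["car", "car", "truck"])` returns `"car"`, while a 50/50 split falls below the 60% quorum and is flagged for review. This is why the approach is labor-intensive: ambiguous scenes consume multiple annotators each.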

Post-Mobileye, my understanding is that Tesla is using various NVIDIA products for different purposes:
  • FSD demo video(s) used Drive PX2 in the car, with multiple cameras, trained using DriveWorks. This would have been the fastest path to an FSD demo.
  • HW2 cars come with Drive PX2 and multiple cameras. But AP only seems to use the front cameras, possibly just one of them. I suspect it's trained using Tesla's own software (Tesla Vision) instead of NVIDIA's DriveWorks, because Tesla wants to own that know-how.
Either or both of these could have been trained on the NVIDIA DGX-1, or on commodity hardware. Either or both could use NVIDIA DIGITS, or other software. Tesla has probably evaluated different options and uses the more cost-effective approach.

Now I understand Tesla has released AP for HW2 at up to 55 mph. Has anyone else demonstrated speeds above 55 mph with PX2 hardware? The Tesla FSD demo videos seemed to top out around 25-35 mph, and the NVIDIA demo at CES seemed pretty slow too. What about Volvo? Aren't they using PX2 in limited trials in and around Gothenburg? How fast can it go?
 
I believe Mobileye also uses some human markup of scenes, to help train their systems. ...

I agree (all speculation, obviously). The FSD demo was almost certainly a quick one-off prototype and probably just used generic NVIDIA stuff. You can tell by how nearly identical all the demo videos are! I doubt the FSD product will use much of NVIDIA's software (on the PX2). The current AP2 EAP is probably a very simplistic realization of Tesla's classic AP approach, using one camera and (mostly) radar (and Tesla's radar signal processing and training are superb). I believe Tesla wants their own FSD IP trained from their vast fleet data (and that's why I fronted them $4,000 for FSD ;) ).

I don't think that either the FSD "demo" or the current AP has ANY direct relationship to Tesla's ultimate FSD solution. I'll be impressed when we see an FSD demo that is an obvious departure from the generic NVIDIA feature overlays and the same Bay Area neighborhood Sunday-morning drive. I HOPE that Tesla has not had to return to square one (perhaps even more than once), but I won't be surprised if it later leaks out that this was the case. Over the long term, I have a lot of faith in Tesla's approach, but the challenges are really hard.
 
I think really what we're seeing is essentially Autopilot 1.5 - basically because they needed to get something/anything to appease us HW2 customers in the interim.

FSD requires a very different approach from the lane-and-car-following approach that AP1 uses... I would imagine that Tesla are probably ahead of the curve with their FSD solution, but that it's reliant on all sorts of other things that aren't ready to go yet: HD maps, driving policy, localisation, etc.

To get us something to play with in the short term, it seems as if they siphoned off some FSD engineers in late 2016 and said "basically, you have to rebuild AP1"... there may be some overlap here and there but, essentially, it's an entirely different approach. Trying to build that additional ADAS system without infringing multiple patents is probably very hard.
 