Welcome to Tesla Motors Club

Tesla autopilot HW3

The thing that drives me crazy in the current EAP is when a vehicle crosses in front of you like 100 yards away, i.e., it turns left across your lane but far enough away that you don't even need to react, yet EAP jams the brakes anyway. When will EAP/FSD be able to predict that a vehicle moving right-to-left across your lane won't be there when you arrive?

When it uses enough time-displaced data to know that the obstacle ahead is crossing traffic, not traffic that has turned in front of you and stopped (or traffic crossing into your lane).

It also needs some level of confidence that the crossing vehicle will clear your lane, rather than starting the left turn and then getting stopped by something in its path (a trade-off between caution and expediency).
 
The thing that drives me crazy in the current EAP is when a vehicle crosses in front of you like 100 yards away, i.e., it turns left across your lane but far enough away that you don't even need to react, yet EAP jams the brakes anyway. When will EAP/FSD be able to predict that a vehicle moving right-to-left across your lane won't be there when you arrive?

I suspect that AP3-FSD will fix this. Remember that EAP was designed for divided highway so it was never really designed for cross traffic. So this is a scenario that Tesla probably never felt like EAP needed to fix. But with the increased processing power of AP3 and the necessity for city self-driving to be able to handle cross traffic, this issue will need to be addressed.
 
I found this video of someone driving a Model S with AP3. It's just enhanced autopilot, not any new full self-driving features. But I will refrain from any editorializing. I will let you all watch it and decide.

I didn't get a chance to watch the entire video, but do they ever prove that it's running HW3?

I do see the part where they claim it's HW3/AP3, and that it's known as HW4 internally at Tesla. So it seems like they know what they're talking about, but then they say it's running 2019.8.4.

But, I don't see any indication that 2019.8.4 actually gets installed on AP3.

On TeslaFi I don't see any AP3 vehicle on firmware other than 2019.5.17.
 
Cool video, but waaaaay too long for me to watch it all. TL;DW haha. What's the upshot? It seems like EAP works pretty well in the RHD version.

Yeah, EAP does work really well in RHD mode. He drives pretty narrow, non-divided local roads and the car sticks to the center of the lane really well, even with the sun glaring in his face and the lane lines sometimes fading a bit.
 
Imo this shows why Karpathy was right. We humans try to engineer a good solution, but the real world is too complex. It is often better to just gather data and let gradient descent do the programming for you.
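The "let gradient descent do the programming" idea can be sketched in a few lines. This is a hypothetical toy example, nothing to do with Tesla's actual stack: instead of hand-writing a rule, we hand the optimizer example data and let it fit the rule itself.

```python
# Toy illustration of "data + gradient descent instead of hand-coded rules".
# We never write the rule y = 3x; gradient descent recovers it from examples.

def fit_slope(xs, ys, lr=0.01, steps=1000):
    """Fit y = w * x by minimizing mean squared error with gradient descent."""
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        # gradient of mean((w*x - y)^2) with respect to w
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        w -= lr * grad
    return w

# Examples generated by the hidden rule y = 3x:
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]
w = fit_slope(xs, ys)   # converges to roughly 3.0
```

Same principle as a perception network, just scaled down from millions of parameters to one.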

But I'm pretty sure the kangaroo-murdering systems are based on deep learning. But if you don't have kangaroos in your data set, it won't learn about them. Nobody is seriously trying to solve camera perception for vehicles without deep learning (definitely not unique to Tesla), and yet there are still substantial challenges. It's not a silver bullet.
 
But I'm pretty sure the kangaroo-murdering systems are based on deep learning. But if you don't have kangaroos in your data set, it won't learn about them. Nobody is seriously trying to solve camera perception for vehicles without deep learning (definitely not unique to Tesla), and yet there are still substantial challenges. It's not a silver bullet.

And once kangaroos are in the data set, what about wallabies? Will they just be interpreted as really far kangaroos?
 
I think the kangaroo issue is that they don't touch the ground when moving. So any system that relies on visible pavement contact or vertical positioning for depth estimation will think a mid-air roo is further away than it really is.
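To put numbers on that: under a simple pinhole-camera model (all figures here are assumed for illustration), a ground-contact depth estimator places a point at distance Z from the pixel row of its lowest visible point. An airborne roo's feet project higher in the image than a grounded object at the same distance, so the estimate comes out too far.

```python
# Hypothetical pinhole-camera sketch of the ground-contact assumption.
# A point on the ground at distance Z, seen by a camera at height H,
# appears y = f * H / Z pixels below the horizon (f = focal length in px).

def ground_plane_depth(y_px, f_px, cam_height):
    """Distance implied by assuming the lowest pixel touches the ground."""
    return f_px * cam_height / y_px

f_px = 1000.0   # assumed focal length in pixels
H = 1.4         # assumed camera height in metres
Z_true = 20.0   # actual distance to the roo
h_jump = 0.9    # roo's feet are 0.9 m off the ground mid-jump

# Pixel row of the airborne feet (a point at height h_jump, distance Z_true):
y_feet = f_px * (H - h_jump) / Z_true        # 25 px below the horizon

Z_est = ground_plane_depth(y_feet, f_px, H)  # 56 m
# Z_est = Z_true * H / (H - h_jump): the mid-air roo at 20 m is placed
# at 56 m, which is exactly the overestimate described above.
```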

Why would anybody ever try to use unreliable depth cues in a monocular vision system, without any backup sensing modality, for a safety-critical autonomy system?

Oh wait...
 
Why would anybody ever try to use unreliable depth cues in a monocular vision system, without any backup sensing modality, for a safety-critical autonomy system?

Oh wait...

I do every day...

(I also do not live in Australia)
Honestly, my lack of true depth perception makes the Tesla center display a pain to work with (what little I've experienced of it). Same issue when trying to push a button while my wife is holding her iPad... almost touching... almost touching... how far is it??? oh, there it is....
 
I found this video of someone driving a Model S with AP3. It's just enhanced autopilot, not any new full self-driving features. But I will refrain from any editorializing. I will let you all watch it and decide.


This is 100% NOT an AP3 car. It's just AP2.5 hardware. There are no cars in the UK with APH4 (AP3) at this point, because the transport ships carrying the first builds of these cars will not arrive prior to May.

Interesting watch anyway. But for sure not with HW3. The guy in the video got it all wrong ;)
 
Doubtful. The Model 3 is the hardest to upgrade and takes the most time, by a considerable margin. When you factor in the labor costs, a Model S or X upgrade probably costs half to two-thirds as much as a Model 3 upgrade.

Aha! Dgatwood, you know where this new circuit board is going in the various models, a question that's had me terribly curious. In S and X, is access for the swap gained through the touchscreen or through the rear of the glove box?
 
Aha! Dgatwood, you know where this new circuit board is going in the various models, a question that's had me terribly curious. In S and X, is access for the swap gained through the touchscreen or through the rear of the glove box?

It's in the same module as the MCU for the Model 3. Here it is in the parts catalog. Video from a tear-down.

In the Model S & X, it's in the dash above the glove box, based on Daerik's video.

Edit: NM, not sure where the hell it is in the Model 3.

Edit2: NM. Found it in the parts catalog. Oddly, the link brings you to the Model S version, but if you search for MCU under Model 3, or follow the links through Infotainment, it'll bring you to the same part. According to this Electrek article, it's an integrated module.
 
Mid-air kangaroos are shaped differently than when they're touching the ground. I'd hope NN training could distinguish between the two.

Sure, it can tell the roo is mid-jump, but how does it tell whether it's a big roo jumping far away or a small roo closer?
With a 25+ foot horizontal leap, a 6-foot vertical, and a 35 MPH top speed, there may not be many frames to use.
Same goes for deer jumping in front of (or into the side of) one's car....
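The big-roo-far vs. small-roo-near ambiguity falls straight out of the pinhole projection (hypothetical numbers, purely for illustration): apparent size in pixels depends only on the ratio of real size to distance, so two very different roos can look identical.

```python
# Hypothetical pinhole projection: an object of real height H at distance Z
# spans h_px = f * H / Z pixels in the image (f = focal length in pixels).

def pixel_height(f_px, real_height_m, distance_m):
    """Apparent height in pixels of an object seen by a pinhole camera."""
    return f_px * real_height_m / distance_m

f_px = 1000.0                                 # assumed focal length
big_far    = pixel_height(f_px, 1.6, 40.0)    # 1.6 m roo at 40 m -> 40 px
small_near = pixel_height(f_px, 0.8, 20.0)    # 0.8 m roo at 20 m -> 40 px
# Identical image height: size alone can't disambiguate distance,
# which is exactly the big-roo-far vs. small-roo-near problem.
```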
 
Sure, it can tell the roo is mid-jump, but how does it tell whether it's a big roo jumping far away or a small roo closer?
With a 25+ foot horizontal leap, a 6-foot vertical, and a 35 MPH top speed, there may not be many frames to use.
Same goes for deer jumping in front of (or into the side of) one's car....

I suppose if the kangaroo is close enough it would be an issue, but the parallax between the three front-facing cameras (even at different resolutions) may be able to estimate distance. That, and radar.
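The parallax idea works because two cameras a known distance apart see the same point shifted by a disparity that depends only on depth, not on the object's real size. A minimal sketch, with an assumed baseline and focal length (the real camera geometry isn't public here):

```python
# Hypothetical stereo-parallax sketch: two forward cameras a baseline B
# apart see the same point offset by disparity d = f * B / Z pixels,
# so depth is Z = f * B / d, regardless of how big the roo actually is.

def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Depth of a point from the pixel disparity between two cameras."""
    return f_px * baseline_m / disparity_px

f_px = 1000.0   # assumed focal length in pixels
B = 0.1         # assumed baseline between front cameras, metres
d = 5.0         # measured disparity in pixels

Z = depth_from_disparity(f_px, B, d)   # 20 m
# Disparity shrinks as 1/Z, so at long range a sub-pixel matching error
# swamps the estimate; hence it only helps "if the kangaroo is close
# enough", and radar covers the far field.
```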