
Tesla Autopilot HW3

A bug report logs a timestamp, but nothing happens with it automatically. There is no red flashing light that goes off in the engineering bay so that all the developers drop what they're doing and investigate your issue. In fact, it goes into a black hole unless you also call Tesla and convince somebody to take a look -- which I know from experience has become almost impossible now that their support lines are completely slammed. They don't want your bug reports, and they don't want your trouble tickets. If by some miracle you do get somebody to pay attention (probably by going through your service center rather than telephone support), then having hit that bug report button and made a note of the approximate time will allow them to find it in the logs.

But -- and this is the key thing -- they are not going to bother doing this unless the service center believes there is a specific hardware problem with your car. This is not how they fix software bugs, for the most part. Believe me, they have more than enough examples of their software doing stupid stuff; they don't need your examples anymore.

That's the opposite of what Elon told us a few months ago, but I don't have any insight into Tesla's internal process to know either way.

Certainly the bug report is an easy way to document your issue and get it into a collection that someone may eventually look at.
 
 
Any bets when the retrofits begin? I'm going to place my bets on September/October. I hope to be wrong since I want HW3 sooner rather than later :D

Well a "few months" is rather imprecise. "few" means a small number larger than 2. So a few months could be 3-6 maybe? And of course it will take some time to get through all the upgrades. I think your Sept/Oct bet is a good one for when they start the upgrades.
 
FSD price going up after May 1. Another sign that Tesla is moving quickly on the FSD front.
So quickly that they still can't read speed limit signs in 2019, something even cheap ICE cars can do, and which should be loads easier than stop signs and lights. That matters because, depending on which country you're in, the map data for speed limits is outright dangerous.

Plus, Elon's track record isn't great when it comes to Autopilot. Ah well, let's hope for the best.
 
They downsample all the cameras 4:1 (and the cameras are pretty low rez to begin with). That's probably why they can't read the speed limit signs. With HW3 they won't be downsampling the camera data.
That's not how it works. The NN gives them a bounding box for the sign; then they can look into the full-resolution image at those coordinates to extract the actual speed value.
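In rough Python, that two-stage idea looks something like this (a minimal sketch: the 4:1 factor is the downsample mentioned above, and digit_reader stands in for whatever actually reads the digits -- none of these names are from Tesla's stack):

```python
DOWNSAMPLE = 4  # the 4:1 downsample mentioned above (assuming it's uniform per axis)

def bbox_to_full_res(bbox, scale=DOWNSAMPLE):
    """Map a detection box from the downsampled frame back to full-resolution
    pixel coordinates. bbox is (x, y, w, h) in downsampled-frame pixels."""
    x, y, w, h = bbox
    return (x * scale, y * scale, w * scale, h * scale)

def read_sign(full_res_frame, bbox_downsampled, digit_reader):
    """The detection NN ran on the downsampled frame and produced
    bbox_downsampled; crop the matching region of the original frame and
    hand it to a cheap digit reader (NN or classical CV, either works)."""
    x, y, w, h = bbox_to_full_res(bbox_downsampled)
    crop = full_res_frame[y:y + h, x:x + w]  # frame as a numpy-style array
    return digit_reader(crop)
```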
 
The AP2.x computer is at 80% capacity without sign-reading functionality. Sign recognition likely doesn't share much with lane tracking / vehicle recognition, so they use map-based data to save space and cycles.

That's not how it works. The NN gives them a bounding box for the sign; then they can look into the full-resolution image at those coordinates to extract the actual speed value.

A relook would require multiple executions of the NN with different coefficients, wouldn't it?
Side question:
Are the cameras downsampled or cropped?
 
They downsample all the cameras 4:1 (and the cameras are pretty low rez to begin with). That's probably why they can't read the speed limit signs. With HW3 they won't be downsampling the camera data.

The numbers on a speed limit sign are about a foot tall. At 100 feet, that subtends roughly 0.57 degrees. The Tesla main forward camera has a 50-degree field of view. (Is that diagonal, horizontal, or vertical? I'll assume vertical, but it only makes a factor of about 1.5 difference, so if I'm wrong, oh well.) So the digits span around one hundredth of the vertical field of view. At 480 pixels high, that's about 5.5 pixels, which should actually be enough to make out the numbers, barely. At a hundred feet. Using the standard camera. The narrow camera would give you almost eight pixels at a hundred feet. And at less ridiculous distances, it gets even easier.
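For anyone who wants to check that arithmetic, here it is as a quick Python sketch (treating the 50 degrees as the vertical FOV, per the assumption above; the 35-degree figure for the narrow camera is also my assumption):

```python
import math

def digit_pixels(digit_height_ft, distance_ft, fov_deg, sensor_pixels):
    """Pixels spanned vertically by a digit of the given height at the given
    distance, for a camera with the given vertical FOV and pixel count."""
    angle_deg = math.degrees(math.atan(digit_height_ft / distance_ft))
    return sensor_pixels * angle_deg / fov_deg

# 1 ft digits at 100 ft, 50-degree FOV (assumed vertical), 480 vertical pixels:
print(digit_pixels(1, 100, 50, 480))  # ~5.5 pixels (main camera)
print(digit_pixels(1, 100, 35, 480))  # ~7.9 pixels (narrow camera, 35 deg assumed)
```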

These folks did it using a camera that's even wider than Tesla's main camera, at a quarter the resolution of Tesla's cameras, and were reliably reading them at ~70 feet.

So I'd imagine it's more because of a lack of available horsepower in general, rather than because of the downscaling that results from that inadequate horsepower.

A relook would require multiple executions of the NN with different coefficients, wouldn't it?

Realistically, I suspect you would use an entirely different NN to interpret the numbers, if you even use one. (You could also do it procedurally, in all likelihood, given the relative regularity of the digit shapes.)
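Template matching is one plausible procedural route, given how standardized the digits are. A rough sketch with OpenCV -- the per-digit template images are assumed to exist on disk, and none of this is from Tesla's code:

```python
import cv2

# One small grayscale image per digit 0-9, rendered in the font used on
# US speed limit signs (assumption: saved as 0.png ... 9.png).
TEMPLATES = {d: cv2.imread(f"{d}.png", cv2.IMREAD_GRAYSCALE) for d in range(10)}

def read_digit(crop):
    """Classify one grayscale digit crop by normalized cross-correlation
    against the templates -- no NN involved, as suggested above."""
    best_digit, best_score = None, -1.0
    for digit, tmpl in TEMPLATES.items():
        # Resize the crop to the template size so the match result is a
        # single correlation score.
        resized = cv2.resize(crop, (tmpl.shape[1], tmpl.shape[0]))
        score = cv2.matchTemplate(resized, tmpl, cv2.TM_CCOEFF_NORMED)[0, 0]
        if score > best_score:
            best_digit, best_score = digit, score
    return best_digit, best_score
```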
 
AP2.x computer is at 80% capacity without sign reading functionality
That's just the GPU; the CPU is at less than 50%. And the vision NNs have a ton of currently unused outputs, including traffic signs and such.

A relook would require multiple executions of the NN with different coefficients, wouldn't it?
No. You can do it on the CPU; you don't need an NN to read numbers from a rectangle.

Are the cameras downsampled or cropped?
They are downsampled for NN consumption, but the full resolution is available to the CPU for further processing when needed.
 
If Verygreen is right about having access to the full resolution, and I'm sure he is, then I'm pretty disgusted that they haven't gotten speed limit reading working yet.
Even if the hardware has the bandwidth, it's another development task that either pulls engineers away from other work or requires additional hires.
And number recognition is just one aspect. Which sign is the dominant one if multiple are in view? How do you handle time-based school zone limits? What if the sign was tampered with? What if there is no speed marking at all?
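To make those questions concrete: even a toy arbitration policy has to encode an answer to each of them. A purely illustrative Python sketch -- the detection format, the confidence threshold, and the map fallback are all my own assumptions, not anything from Tesla:

```python
# Toy speed-limit arbitration, illustrative only. A detection is a tuple of
# (speed_value, box_height_px, confidence) from a hypothetical sign reader.

def pick_speed_limit(detections, map_speed=None, min_confidence=0.9):
    # Ignore reads we aren't sure about: damaged or tampered-with signs
    # should come back with low confidence rather than a wrong number.
    readable = [d for d in detections if d[2] >= min_confidence]
    if readable:
        # Among several visible signs, take the one with the largest box --
        # a crude proxy for "nearest, and therefore the one that applies".
        # (Time-based school zone limits would also need a clock check here.)
        return max(readable, key=lambda d: d[1])[0]
    # No readable sign at all: fall back to map data, which, as noted
    # earlier in the thread, can itself be stale or outright wrong.
    return map_speed
```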