
Tesla Autopilot HW3

I gave you a disagree simply for posting to a thread without reading its opening post. If you just want to demonstrate your knowledge of this topic without engaging with the other posts, you are free to create your own thread.
Quick question, a bit off-topic, but I'd appreciate it if you could answer: can you go from Munich to Hamburg using freeways without using/encountering a highway interchange?
 

No. There are basically two options, both fully Autobahn. The classic one is the European route E45 through former West Germany (via Würzburg and Kassel).

In the last few years the Autobahn between Munich and Berlin (through the former DDR) has been refurbished, with long stretches of three lanes and no speed limit*. This option (via Leipzig and Magdeburg) is about 15 km longer, which is easily made up by the higher speeds possible.

Counting only junctions where you change from one Autobahn to another, the old route has 2 and the new one 3.

By the way, going from Flensburg to Munich via Hamburg and Leipzig is close to 1,000 km and can be done in under 6 hours in an Audi A8, including one fuel stop and without blatant disregard for the speed limits where they are imposed. A bit more than a year ago I crashed my Audi on that route, luckily not at top speed.

PS: Counting both directions, I have driven between Hamburg and Munich about 100 times.

* Conditions apply.
 
@verygreen the text in the attached images is too blurry. Is a higher-resolution version available? Thanks!
This is intentional. We downsampled the pictures: they still give you a good idea of the shape, but are not detailed enough to expose any valuable IP Tesla might have in there and want to keep secret for now.
 
No. There are basically two options, both fully Autobahn. […]

Nice. Thanks. Was wondering what kind of capability a system would have to have in order to do that trip.


Did you get off work yet?

Busy Saturday, planning a lengthy write up soon.
 
 
This isn't a limitation of the specific radar sensors used, but of how radar works and must be implemented in a car. First, the car uses Doppler radar, so it relies on a difference in speed to "see" objects. Second, all stationary objects effectively "look" the same to the radar sensor: a road sign you pass looks like a giant flat reflection, so the system wants to filter it out, and a stopped truck likewise blends into the surroundings and gets filtered out.

Something to note is that every vehicle using radar-based AEB has exactly the same problem. If someone changes lanes quickly in front of you to avoid a stationary object, you're going to hit that object.
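To make that concrete, here is a minimal sketch in Python of the kind of Doppler-based clutter rejection described above. It is purely illustrative, with invented thresholds and data layout, not the implementation of any production radar stack:

```python
# Illustrative Doppler-based clutter rejection (invented data layout).
# A radar return carries a range and a radial (Doppler) closing speed
# measured relative to the ego vehicle.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float            # distance to the reflection, metres
    closing_speed_mps: float  # speed toward the ego vehicle (+ = approaching)

def reject_stationary(returns, ego_speed_mps, tol_mps=1.0):
    """Keep only returns that appear to be moving in the world frame.

    A stationary object (sign, bridge, stopped truck) closes on the ego
    vehicle at roughly the ego speed, so anything whose closing speed is
    within tol_mps of the ego speed is treated as clutter and dropped.
    That rejection rule is exactly why a stopped vehicle in-lane can be
    filtered away, no matter how good the sensor itself is.
    """
    return [r for r in returns
            if abs(r.closing_speed_mps - ego_speed_mps) > tol_mps]

# At 30 m/s, a stopped truck (closing at ~30 m/s) is discarded,
# while a slower car ahead (closing at ~10 m/s) is kept.
returns = [RadarReturn(80.0, 29.8), RadarReturn(60.0, 10.2)]
print(reject_stationary(returns, ego_speed_mps=30.0))
```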

I agree that's how it works at the moment, but I'm unsure how that realistic view of the radar's capabilities can be reconciled with the official story, in which the sensor does have sufficient resolution to distinguish between stopped and moving objects on the highway, up to and including UFOs parked in your lane in dense fog.

The following parts in particular cause me difficulty, because they suggest a radar with improved resolution should manage what the current one evidently does not:

"we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition."

"The first part of solving that problem is having a more detailed point cloud. Software 8.0 unlocks access to six times as many radar objects with the same hardware with a lot more information per object.

It is hard to tell from a single frame whether an object is moving or stationary or to distinguish spurious reflections. By comparing several contiguous frames against vehicle velocity and expected path, the car can tell if something is real and assess the probability of collision.

The net effect of this, combined with the fact that radar sees through most visual obscuration, is that the car should almost always hit the brakes correctly even if a UFO were to land on the freeway in zero visibility conditions.

Taking this one step further, a Tesla will also be able to bounce the radar signal under a vehicle in front - using the radar pulse signature and photon time of flight to distinguish the signal - and still brake even when trailing a car that is opaque to both vision and radar. The car in front might hit the UFO in dense fog, but the Tesla will not."
 
@jimmy_d Do you think it's 80 TOPS to run the neural network(s), or 40 TOPS to run the neural network(s) with the other 40 TOPS running a redundant instance of the neural network(s)?

Those estimates are so rough they shouldn't be used as input to anything. They could easily be off by several times in either direction. The point of making them was to answer the question of whether the numbers @verygreen found are plausible, and I think they are.

Apparently it's unknown what runs on the duplicate hardware, referred to as side B. The firmware for both TRIP units is identical.
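Purely as illustration of what a redundant second instance could be for, here is a toy lockstep cross-check. Nothing is actually known about how side B is used; the function names and tolerance here are invented:

```python
# Toy lockstep cross-check of two redundant inference paths (invented
# names and tolerance; how side B is actually used is unknown).

import numpy as np

def cross_check(plan_a: np.ndarray, plan_b: np.ndarray, tol: float = 1e-3):
    """Return the agreed plan, or None to signal a fault and fall back."""
    if plan_a.shape != plan_b.shape:
        return None
    return plan_a if float(np.max(np.abs(plan_a - plan_b))) < tol else None

a = np.array([0.0, 0.1, 0.25, 0.4])   # trajectory from side A
b = a.copy()                          # duplicate network on side B
print(cross_check(a, b) is not None)  # True: both sides agree, act on it
b[2] += 0.5                           # simulate a fault on one side
print(cross_check(a, b) is not None)  # False: disagreement, fall back safely
```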
 
The question I have is: how much power will this new chip draw? @verygreen
Who knows; it's not like I actually have the board in hand. I'd guess it won't be any more than what the current HW2.5 draws, which is ~40 watts at idle but quite a bit more when it's doing work (i.e. always when it's in the car and powered on, as opposed to on my bench, where it cannot see the rest of the car and sits idle).
 
Who knows; it's not like I actually have the board in hand. I'd guess it won't be any more than what the current HW2.5 draws […]

I'd actually expect a purpose-built processor to run cooler and more efficiently than the general-purpose GPU that is currently used.
 
Mobileye's Amnon just said that their EyeQ5 (24 TOPS, 10 W) sits on the windshield with no fan and no liquid cooling, and that other competitors couldn't even fathom that.

Yes, that's great performance per watt, but as you mentioned they will use 3x EyeQ5 in the BMW next year, which would presumably put them into L4 territory at 30 W.

If Tesla's HW3 stays around a 40 W envelope and proves extremely capable, they would be in the same ballpark.
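To put rough numbers on that ballpark, using only the figures quoted in this thread (the HW3 TOPS and power numbers are unconfirmed estimates):

```python
# Back-of-the-envelope performance-per-watt comparison using the rough
# figures mentioned in this thread; none of these are confirmed specs.

configs = {
    "EyeQ5 (claimed)":         {"tops": 24,     "watts": 10},
    "3x EyeQ5 (BMW, claimed)": {"tops": 3 * 24, "watts": 3 * 10},
    "HW3 (thread estimate)":   {"tops": 80,     "watts": 40},
}

for name, c in configs.items():
    print(f"{name}: {c['tops']} TOPS / {c['watts']} W "
          f"= {c['tops'] / c['watts']:.1f} TOPS/W")
# EyeQ5 ~2.4 TOPS/W; HW3 at the thread's numbers would be ~2.0 TOPS/W.
```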
 
At 100 MPH, a vehicle travels about 147 feet per second. With a camera operating at 60 fps and a motion-estimation window of a few frames (say 1/15 of a second), the object moves roughly 9.8 feet across that window. If the object is 100-200 feet away, that's not too big a deal. At a closing speed of 300 feet per second, the object covers 20 feet in the same window. There aren't many situations in which two vehicles approach one another at 300 feet per second, so that would be a pretty rare case. But even then, with the object covering only 20 feet, the system could have worked out what it is, where it is, where it's headed, and how fast it's closing.
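The arithmetic generalises easily. A quick sketch of the same calculation, plain unit conversion with nothing Tesla-specific assumed:

```python
# How far does an object move across a motion-estimation window?
# Plain unit conversion; no Tesla-specific assumptions.

MPH_TO_FPS = 5280 / 3600   # miles per hour -> feet per second

def displacement_ft(closing_speed_fps, fps, n_intervals):
    """Feet covered over n_intervals frame intervals at a given frame rate."""
    return closing_speed_fps * n_intervals / fps

v_100mph = 100 * MPH_TO_FPS                               # ~146.7 ft/s
print(displacement_ft(v_100mph, fps=60, n_intervals=4))   # ~9.8 ft (1/15 s window)
print(displacement_ft(300.0,    fps=60, n_intervals=4))   # 20.0 ft
print(displacement_ft(v_100mph, fps=36, n_intervals=2))   # ~8.1 ft at 36 fps
```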

I thought all Tesla cameras capture at 36fps, or is that just how the dashcam feed is delivered?


This isn't a limitation of the specific radar sensors used, but of how radar works and must be implemented in a car. […] If someone changes lanes quickly in front of you to avoid a stationary object, you're going to hit that object.

If, with ample stopping distance available when the cut-out occurs (e.g. 200 m at 80 mph; see the rough numbers sketched below), the radar cannot distinguish a massive stationary object in the planned path from inconsequential roadside clutter across the whole range of AP speeds, then it is effectively worse than useless for redundancy.

How can FSD ever be made to work safely with such a sensor?
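For scale, a rough stopping-distance estimate under assumed values (0.8 g of braking and one second of system latency; neither figure is a Tesla spec):

```python
# Rough stopping-distance estimate at 80 mph under assumed values:
# 0.8 g of braking and 1.0 s of detection/actuation latency.
# Neither number is a Tesla spec; this only shows that 200 m of clear
# road is, physically, plenty of room to stop in.

G = 9.81              # m/s^2
MPH_TO_MPS = 0.44704

def stopping_distance_m(speed_mph, decel_g=0.8, latency_s=1.0):
    v = speed_mph * MPH_TO_MPS
    reaction = v * latency_s                 # distance covered before braking
    braking = v ** 2 / (2 * decel_g * G)     # v^2 / (2a)
    return reaction + braking

print(round(stopping_distance_m(80)))        # ~117 m, well inside 200 m
```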

Are the Tesla Network's paying passengers to be remorselessly fused into the attenuator at 80 mph, like Walter Huang, when the AI/cameras recognise the wrong lines and there is no backup to contradict this 'solution'? That would be a rather costly business model.

Something obviously needs to change, and withstand rigorous hazard testing, to rectify the current ridiculous situation, which has been permitted to persist for far too long: three years since contributing to the first AP fatality, namely Gao Yaning, † 20 January 2016, Handan, Hebei, China:
 