
How does Tesla know 20x more compute than HW 2.5 is enough?

Can you come up with 10 (improved-)Highway-Autopilot-quality features that you would want the car to take care of?

Do we know what frame-rates and resolutions the cameras are running at presently? I believe there was some downscaling of the inputs involved for HW2.5. If HW3 can make full use of the fastest possible frame-rate and best possible resolution, that would be a good use of the 20x processing power.
 
Elon said this, Elon said that.

I don't think he or anyone on his team would know conclusively whether the current hardware (processing power, cameras, radar, sonars) is enough for Level 4, let alone Level 5. This is new territory for everyone, so it is all a guess.
 
I hope you win three prizes!
The definition of FSD is now drastically different from what it was in 2018 and earlier. Our version comes with a pretty clear-cut expectation: I should be able to sleep at the wheel.

Do you have concerns about HW 3 being powerful enough?


I don't think HW3 will be capable of letting me sleep behind the wheel, or better yet, dropping me off at the airport and returning home on its own. I do believe, though, that the current slate of cars (Tesla HW3) will be well out of commission before either of those two things actually comes to fruition.
 
Do we know what frame-rates and resolutions the cameras are running at presently?
From greentheonly's YouTube videos, the non-main cameras seem to get neural network updates only every 4th frame. You can verify this by watching the overlaid bounding boxes or lane lines and stepping frame-by-frame with "." (or "," to go backwards): while the video advances every frame, the overlay does not update as often.

Compare this to the Autonomy Day video where Karpathy starts talking around 1:53:24, showing white lane lines and blue drivable space: watch the white car get passed in the right pillar camera, and the bounding box updates every frame.

So yes, it does seem likely that with the FSD computer, Autopilot will process all cameras at 30 frames per second at full resolution, whereas with HW2.5, non-main cameras are processed at ~8 frames per second and potentially cropped.
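For a rough sense of how much of the 20x that alone would eat, here's a back-of-envelope sketch. The 1280x960 sensor resolution, the ~half-frame crop on non-main cameras, and the exact frame rates are assumptions for illustration, not confirmed figures:

```python
# Back-of-envelope pixel throughput, HW2.5 vs HW3 (all figures assumed):
# 8 cameras at 1280x960; HW2.5 runs the main camera at 30 fps and the other
# 7 at ~8 fps on a cropped (~half-area) frame; HW3 runs all 8 at 30 fps.

FULL_FRAME_PX = 1280 * 960   # assumed sensor resolution
CROP_FACTOR = 0.5            # assumed crop on non-main cameras under HW2.5

hw25_px_per_s = 1 * 30 * FULL_FRAME_PX + 7 * 8 * FULL_FRAME_PX * CROP_FACTOR
hw3_px_per_s = 8 * 30 * FULL_FRAME_PX

print(f"HW2.5: {hw25_px_per_s / 1e6:.0f} Mpx/s")      # ~71 Mpx/s
print(f"HW3:   {hw3_px_per_s / 1e6:.0f} Mpx/s")       # ~295 Mpx/s
print(f"ratio: {hw3_px_per_s / hw25_px_per_s:.1f}x")  # ~4.1x
```

Under those assumptions, running everything at full rate and resolution is only about a 4x increase in pixels per second, which would leave most of the 20x for running bigger networks on each frame.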
 
So yes, it does seem likely that with the FSD computer, Autopilot will process all cameras at 30 frames per second at full resolution, whereas with HW2.5, non-main cameras are processed at ~8 frames per second and potentially cropped.

Heck, a few frames doesn't sound like much, but it could make a big difference in reaction speed.

In a worst-case scenario, if HW2.5 detects a pedestrian 0.133 seconds later than HW3 while traveling at 70 MPH, the HW3 vehicle is afforded about 13 feet of additional space to stop.
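Quick sanity check on that arithmetic (70 MPH and 0.133 s are just the numbers from the post above; 0.133 s is roughly one frame period at ~7.5 fps):

```python
# Extra distance covered during a detection delay at highway speed.

MPH_TO_FT_PER_S = 5280 / 3600   # 1 mph ≈ 1.467 ft/s

def delay_distance_ft(speed_mph: float, delay_s: float) -> float:
    """Feet traveled during delay_s seconds at speed_mph."""
    return speed_mph * MPH_TO_FT_PER_S * delay_s

print(round(delay_distance_ft(70, 0.133), 1))   # 13.7 ft
```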
 
  • Informative
Reactions: APotatoGod
Do we know what frame-rates and resolutions the cameras are running at presently? I believe there was some downscaling of the inputs involved for HW2.5. If HW3 can make full use of the fastest possible frame-rate and best possible resolution, that would be a good use of the 20x processing power.

AP2 software has been using full resolution from all 8 cameras for a while now, according to verygreen.
 
Kind of the same thing going on with the newer V3 Superchargers: only small overall speed benefits for many of the legacy cars, but they might give much more substantial charging-speed increases with newer releases.

Well, not really. Having a 4-way or 5-way power split instead of 2-way matters a great deal even if your charge rate maxes out at 100 kW; after all, that's still 75 kW more than I've seen at busy V2 superchargers on bad days.

The V3 supercharger upgrades are closer to the 8.0 rewrite of the AP system. Everybody benefitted a lot, but folks with newer hardware will eventually benefit a lot more.
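To put some illustrative numbers on the pairing point, here's a minimal sketch assuming the commonly cited figures of a ~150 kW V2 cabinet shared between two paired stalls and up to ~250 kW per unshared V3 stall, with a legacy car capped around 100 kW; the V2 split is approximated as even here, which it isn't exactly in practice:

```python
# Illustrative only: per-car charge power with V2 pairing vs a V3 stall.

def v2_power_kw(car_max_kw: float, cars_on_cabinet: int, cabinet_kw: float = 150.0) -> float:
    """Per-car power when a V2 cabinet is shared between paired stalls (even split assumed)."""
    return min(car_max_kw, cabinet_kw / cars_on_cabinet)

def v3_power_kw(car_max_kw: float, stall_kw: float = 250.0) -> float:
    """Per-car power at a V3 stall (no pairing)."""
    return min(car_max_kw, stall_kw)

legacy_car_kw = 100.0
print(v2_power_kw(legacy_car_kw, cars_on_cabinet=2))   # 75.0 kW when paired and busy
print(v3_power_kw(legacy_car_kw))                      # 100.0 kW, the car's full rate
```

So even a car that can't touch 250 kW still gains from not sharing a cabinet.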
 
  • Like
Reactions: APotatoGod
Compute is never enough as time goes by; eventually its capabilities will be overtaken by what is possible. So the definition of FSD is what's important here.

Is FSD Level 3, where a human still needs to be in the driver's seat, ready to take over, but the car can navigate anywhere you tell it? I personally think this would be possible with HW3 and is what Elon is thinking. Or is FSD more like Level 4, the first level of total system control, where some situations (environmental ones, for example) would not be supported? A likely maybe, in my opinion. Or does it mean Level 5, whereby the driver's seat essentially becomes a passenger seat, the steering wheel and pedals can be removed from the car, and all environmental conditions are supported? I would think HW3 would be pushed to sustain L5, but that's a random internet opinion, so please take it as just that.

exactly this
 
Heck, a few frames doesn't sound like much, but it could make a big difference in reaction speed.

In a worst-case scenario, if HW2.5 detects a pedestrian 0.133 seconds later than HW3 while traveling at 70 MPH, the HW3 vehicle is afforded about 13 feet of additional space to stop.
I drive where the speed limit is 70, and I often drive with traffic, which is not the same speed as the limit. I have made 115-mile trips where my end-to-end average speed is 74 MPH. A slower computer can make a difference; a lot goes on in a second at that speed.
 
They know it's enough the same way they knew v2.0 hardware was enough.

I feel like the GPU solution was enough and offered the advantage of being able to shift dynamically to changes in the FSD algorithm. However, I would also say that in HW 3.0 the only way to provide the computational increase is an ASIC-based platform, not a GPU. Yes, they may call it a GPU, but it's a hardware ASIC specifically designed to process the current algorithm. The danger of an ASIC platform is that there is no adapting to changing algorithms; then again, maybe being an ASIC means it doesn't need to. My personal conspiracy theory is that FSD city driving, too, has already been worked out, but they need to retrofit the fleet so customers aren't in an uproar that they're not upgraded.

I'm new to Tesla, but not to GPUs or AI.
 
It may well be true that HW3 is not good enough. Personally, I reckon it will be, but nobody really knows. I do think the sensors are fine, though: we all manage to drive with just 2 forward-facing head-mounted cameras of very low quality, so we know it can be done :D
 
Nobody knows if it's good enough. They just set a performance target based on the basic maths that current NN processing is built on: matrix calculations. Then they set a price target so Elon could put it in every car. And they are just hoping for the best, hoping that the current way of doing AI is correct and sufficient.
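For anyone wondering what "matrix calculations" means in practice: a neural-network layer is essentially a matrix multiply plus a simple nonlinearity, which is the kind of multiply-accumulate work an NN accelerator is built around. A toy NumPy sketch with arbitrary shapes, purely for illustration:

```python
import numpy as np

# One fully-connected NN layer: a matrix multiply plus a nonlinearity.
# Convolutions reduce to the same multiply-accumulate pattern.

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 512))     # one input activation vector
W = rng.standard_normal((512, 256))   # layer weights
b = np.zeros(256)                     # layer bias

y = np.maximum(x @ W + b, 0.0)        # matmul + ReLU
print(y.shape)                        # (1, 256)
```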
 
  • Like
Reactions: DanCar
Bad analogy, but also: when we turn our head, we still have 2 cameras in whichever direction we look. The Tesla only has 1 camera in every direction but front. If all your life you only had 1 eye, you'd be bad at figuring out distances.

Not true for the last part. Humans really only use stereo (binocular) vision for near vision; at distance they end up using other ways to judge depth (contextual cues like shadows and light, subject size, etc.). There are plenty of people driving around with one functional eye. Sure, it messes with their field of vision, how comfortable they feel, and redundancy if they happen to get something in that eye, but it doesn't affect their ability to perceive depth at distance.
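As an aside on how a single camera (or eye) can judge distance from a known object size, here's a toy pinhole-model sketch; it's purely illustrative and says nothing about how Autopilot actually estimates depth, and the 1.5 m car height and 1000 px focal length are made-up numbers:

```python
# Toy pinhole-camera estimate of distance from a single image using a
# known object size -- one of the monocular cues mentioned above.

def mono_distance_m(real_height_m: float, apparent_height_px: float,
                    focal_length_px: float) -> float:
    """distance = focal_length * real_height / apparent_height (pinhole model)."""
    return focal_length_px * real_height_m / apparent_height_px

# A ~1.5 m tall car that appears 60 px tall through an assumed 1000 px
# focal length works out to about 25 m away.
print(mono_distance_m(1.5, 60, 1000))   # 25.0
```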
 
There are plenty of people driving around with one functional eye. Sure, it messes with their field of vision, how comfortable they feel, and redundancy if they happen to get something in that eye, but it doesn't affect their ability to perceive depth at distance.

Never said you can't figure out distances with one eye. Keyword: bad (i.e., a lot worse than someone with two working eyes).
 
You said “you would be bad at figuring out distances.” I'm telling you it wouldn't even be bad at distance. You don't need binocularity for distance perception.

Well I'm telling you it is bad. When I cover one eye and drive at night, I feel like I'm bad at seeing and judging distance. Many people will agree with me.

I agree with you: you don't need two eyes for distance perception. That wasn't my point.