Welcome to Tesla Motors Club

Nvidia introduces new FSD computer chip

It is because comparisons are done using public benchmarks (with the exception of Tesla, of course). This is how GPUs, CPUs, RAM, etc. from different companies are compared. That way it's clear-cut and no one is lying. Obviously Tesla won't participate in something as fair as that.

Didn't Elon reiterate this year that Level 5 can be done with the current Nvidia GPU-only hardware?
Can't have it both ways: either the GPU sucks or it doesn't.

Now mind you, this is hardware with only 10 TOPS and around 200 watts for the early AP2.0 versions that Elon is saying is sufficient.
There was a huge debate here when AP2 was unveiled, and the majority of Tesla fans said the chip was more than enough. When Nvidia came out and said it wasn't, they claimed Nvidia was lying in order to sell more powerful chips. A few years later, all of a sudden Tesla fans believe the AP2 chip is worthless.

Remember, the chip Elon said this year was enough is three generations old, has only 10 TOPS, and doesn't use any of the next-gen improvements Nvidia has since introduced:
  • It doesn't have Tensor Cores (NN accelerators that speed up large matrix operations and perform mixed-precision matrix multiply-and-accumulate calculations in a single operation).
  • It doesn't utilize the TensorRT deep learning inference optimizer.
  • It uses unbearably slow PCIe rather than the new NVLink 2.0, which was created to eliminate memory bottlenecks (NVLink 2.0 has data transfer rates of up to 300 GB/s).
  • It uses miserably slow RAM instead of HBM2 (200+ GB/s).
  • I could go on and on, as there is so much new tech.
Years ago, Tesla fans said the 10 TOPS, 200-watt chip was all you need for Level 5 FSD.
After three generations of innovation, Tesla fans now say an 800 TOPS, 300-watt chip is absolutely useless and no good for a self-driving car.

Gotta love Tesla fans' immaculate logic.
Your tiresome bashing of 'typical Tesla fans' does not do your arguments any favors. Technology progresses, and each chip will be leapfrogged by the next. So, you are right, Nvidia's next chip will be great, and better than Tesla's current chip. Big deal.

Thanks for the info, but no thanks for the editorializing.
 

It's literally a one-liner, which I have removed. Happy? :p
 
Nvidia's Orin goes up to 2,000 TOPS, which took me like one second to find. Second of all, it's obvious that any chip ever made will be put on a board, and that board can contain multiple chips, especially when the chip is coming from a chip maker.

2x Orin = 400 TOPS, 130W
2x Orin + 2 GPU = 2,000 TOPS, 750W
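As a rough sketch, the perf-per-watt of those two configurations works out like this (using only the TOPS and wattage figures quoted above, which are this post's numbers, not official specs):

```python
# Perf-per-watt of the two configurations quoted in this post.
# Figures are the ones stated above (assumed), not official Nvidia specs.
configs = {
    "2x Orin": (400, 130),           # (TOPS, watts)
    "2x Orin + 2 GPU": (2000, 750),
}
for name, (tops, watts) in configs.items():
    print(f"{name}: {tops / watts:.2f} TOPS/W")
```

Notably, the bigger configuration delivers 5x the TOPS for roughly 5.8x the power, so efficiency drops a bit at the high end.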

Sounds great! Let's see who is first to market with their next-gen chips. I bet AMD can't do any better *neener neener*. Samsung, where's your product at? Let's make this a battle royale, get all the players involved, and really push the curve on how fast we can get to the next level.

It's 2020... let the battle begin (and the consumer win)!
 
750W is nothing for a SDC. Absolutely meaningless. This is the typical Tesla fan's line of thinking and rationalization.

Meaningless? It's a hard number that we can perform math on. That's hardly meaningless. 750W is:
  • About ten MacBook Pro laptops compiling something with Xcode continuously.
  • A 28-core, $50,000 Mac Pro running at full tilt.
  • A loss of more than three miles of driving range for every hour of driving.
Assuming an average driving speed of 30 MPH (probably optimistic for city driving) and the theoretical estimate of 241 Wh per mile for the car itself, that's 7,230 Wh per hour for the car and 750 for the computer. So 9.4% of your power could easily be burned just running the computer. I'm pretty sure if you told someone that the self-driving computer would mean getting almost 10% less range, most folks wouldn't seriously consider turning it on.

That's a crazy amount of power even for a computer running on household current. In a car, using a tenth of your power on computing is absolute madness.

Yes, if you're talking about a taxi fleet, it probably doesn't matter. But for personal cars, numbers that big matter a great deal.
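The arithmetic above can be sketched in a few lines (a back-of-the-envelope calculation using the post's own assumed figures: 30 MPH average speed, 241 Wh/mi consumption, a 750 W computer):

```python
# Back-of-the-envelope range impact of a 750 W self-driving computer,
# using the figures from the post (assumptions, not measurements).
COMPUTER_W = 750       # computer draw in watts
SPEED_MPH = 30         # assumed average driving speed
WH_PER_MILE = 241      # theoretical consumption of the car itself

drive_wh_per_hour = SPEED_MPH * WH_PER_MILE          # 7230 Wh/h to move the car
total_wh_per_hour = drive_wh_per_hour + COMPUTER_W   # 7980 Wh/h total
compute_share = COMPUTER_W / total_wh_per_hour       # fraction burned on compute
miles_lost_per_hour = COMPUTER_W / WH_PER_MILE       # range lost per driving hour

print(f"{compute_share:.1%} of power on compute, "
      f"{miles_lost_per_hour:.1f} mi of range lost per hour")
```

That reproduces the 9.4% figure and the "more than three miles per hour of driving" claim.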
 
It’s funny that people are worried about 15 cents an hour. I can just picture the customer.
How much will it cost me for this car to drive me to work?
About 1 kWh per hour, so about 15 cents.
That’s ridiculous, I’ll drive myself.
 

Nobody is worrying about cents per hour. I'm talking about reducing the car's maximum range by 30+ miles.
 
Yes, but as the maximum range of EVs increases, this loss of 30+ miles will have less of an impact. The Model S already has a range of 370 miles, and as battery tech improves, 400+ miles of range is likely in the not-too-distant future. Just put a slightly bigger battery in your EV to compensate. Plus, future computer chips will be more energy efficient and use less power. So I don't think the range loss will be a big deal.
 
There are many Tesla owners who use Sentry mode, which decreases range by 20 miles per day.
Your calculations are based on an average speed of 30 mph, which would be 10 hours of continuous driving. Very few people do that.
Also, self-driving cars will obey the speed limit, which will more than make up for the 10% loss :p
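For scale, the two overheads mentioned in this thread can be put side by side (a sketch using the thread's rough figures: ~20 mi/day lost to Sentry mode, a 750 W computer, 241 Wh/mi):

```python
# How many hours of driving with a 750 W computer equal one day of
# Sentry mode? Figures are the thread's rough numbers, not measurements.
WH_PER_MILE = 241               # assumed car consumption
SENTRY_LOSS_MI_PER_DAY = 20     # range lost per day while parked

computer_loss_mi_per_hour = 750 / WH_PER_MILE   # ~3.1 mi per driving hour
hours_to_match = SENTRY_LOSS_MI_PER_DAY / computer_loss_mi_per_hour

print(f"~{hours_to_match:.1f} h of driving equals one day of Sentry mode")
```

By these numbers it takes over six hours of continuous driving before the computer costs as much range as a single parked day of Sentry mode.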
 
Looks like we have an Nvidia challenger, although one without the development suite that comes with Nvidia DRIVE.

The platform is built on modular multi-core CPUs, energy-efficient AI and computer vision engines and a GPU, the company said. It’s also thermally efficient and can offer 30 tera operations per second (TOPS) for the lower level active safety systems, up to more than 700 TOPS at 130W for autonomous driving. This means the platform can operate at these various levels without requiring additional liquid-cooled systems, which helps lower the cost and boost reliability, Duggal said, adding it can be particularly useful in electric vehicles.

Pretty impressive hardware.

Qualcomm unveils its Snapdragon Ride platform for all levels of automated driving – TechCrunch
 
Seeing Mobileye and Qualcomm with their plug-and-play stacks for autonomous driving, it makes me wonder if autonomous driving is about to go mainstream. By that I mean, automakers will be able to put the right sensors and hardware on their cars, get the software off the shelf from, say, Qualcomm or Mobileye, put it all together, calibrate it, and be done. I am sure it is a bit more complicated than that, but maybe not really. It just seems like a lot of the aspects of autonomous driving, like perception and even much of driving policy, have already been done. There is no need for a company to develop autonomous driving from scratch.

@Bladerskb am I talking BS or am I on to something?