Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

AP2.0 video

Several things:

Tesla continues to have an ironic sense of humor with song choices (a song about depression??).

The cameras apparently are still grayscale, not color.

There were at least two examples of not quite correct turns where the car essentially turned into the wrong lane, or partially into the wrong lane.

It is interesting that they are processing all of this on a single Titan GPU - not what I was expecting. I was expecting them to use Nvidia's Drive PX platform or something with even more horsepower than a single Titan. Would be interesting to get more specs. Teardown time!

It was interesting that the announcement got delayed so that they could actually complete this demo. Not just compile video, but actually do the demo (according to Elon's tweet). Man, that's cutting it really close, Elon. You announce a product unveiling before you've even run the demo once????
 
Awesome video, I can't wait to try it, very exciting!

But wow, that soundtrack... Interesting choice. I would have expected something more upbeat with a more positive outlook. On the other hand, this does perfectly express the frustration they must be experiencing in the face of such strong public reactions to the perceived safety issues.
 

'Grayscale' sensors have higher resolution, dynamic range, and light sensitivity. We don't know whether one of the front-facing cameras is color.

Tesla Vision has been in the works for over a year - back when Titan was one of the fastest GPUs with a reasonable TDP, and it is still many times faster than the 2018 Mobileye 4. Besides, EM said that they can run on other processors, including AMD. As I said earlier, Tesla may eventually design their own chip in cooperation with AMD, similar to how AMD does it for Microsoft's and Sony's consoles.

EDIT: Color vs. Monochrome Sensors
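A toy sketch of the monochrome-vs-Bayer point (my own illustration, not anything Tesla has published): every photosite on a mono sensor measures full luminance directly, while each photosite on a Bayer color sensor sits behind an R, G, or B filter and records only part of the spectrum, so luminance has to be interpolated (demosaiced) from neighbors - costing resolution and light.

```python
import numpy as np

# Toy 4x4 sensor. A monochrome sensor gets a direct luminance sample at
# every photosite; a Bayer sensor covers photosites with an RGGB filter
# mosaic, so each site records only one color channel.
H, W = 4, 4
bayer = np.zeros((H, W, 3))
bayer[0::2, 0::2, 0] = 1  # red filter sites
bayer[0::2, 1::2, 1] = 1  # green filter sites
bayer[1::2, 0::2, 1] = 1  # green filter sites
bayer[1::2, 1::2, 2] = 1  # blue filter sites

mono_samples = H * W                # mono: every photosite is luminance
green_share = bayer[..., 1].mean()  # Bayer: share of direct green samples
print(f"mono: {mono_samples} direct luminance samples")
print(f"Bayer: only {green_share:.0%} of photosites sample green")
# mono: 16 direct luminance samples
# Bayer: only 50% of photosites sample green
```

Half the Bayer sites are green and only a quarter each are red and blue, which is why a same-size mono sensor effectively out-resolves it for luminance detail.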
 

Where can I find the name of the soundtrack?
 

Is it confirmed it is the Titan GPU and not the newly announced Nvidia system?
 
Yes it is. What's interesting is that Tesla Vision is running at 12 tera-operations/s - as fast as the (distant-future) Mobileye 5. However, the just-announced Nvidia Drive PX 2 (Parker) is supposed to be twice as fast. Of course, benchmarks and reality can diverge greatly.

It's interesting that Musk mentioned AMD in his presentation - maybe they are working on their own chip in cooperation with AMD?

Concerning there being another board (Drive PX 2) being twice as fast as what Tesla is using...

There is a big difference between the commercial world of graphics cards and the choice of hardware in an embedded system like Tesla's Autopilot. In the commercial world, everyone is pushing the envelope, and the goal is always to be faster, bigger, cheaper. It's simply Moore's law: you need to be able to one-up your competitor. And there will always be use cases where that additional processing power is needed, and other use cases where 10x, 100x, or 1000x that level of processing power is needed.

For the Autopilot application, there is only a single "use case". The hardware installed in the vehicle needs to be able to run software that is capable of self-driving the car. Period. Tesla would not have picked the hardware platform they did, and stated that it can do Level 5 autonomy, unless they were sure that the hardware could get the job done. Software will lag, but that's always the case.

The important takeaway with respect to the computing power required is that, basically, "enough is enough". If Tesla cars have the required hardware to eventually (with additional software improvements) "solve" the self-driving problem, it doesn't matter whether some other company builds a processing platform that is 2x faster, 10x faster, or even 100x faster. It simply doesn't matter. If the hardware and software solution that Tesla deploys can provide Level 5 autonomy, they are done. No need to upgrade any hardware in the future.

Having said that, if in 4 years the just announced hardware can be shrunk down to 1/4 the size and 10x the speed at 1/2 the cost, then Tesla would roll out a cheaper platform that would essentially be "doing the exact same thing". But once the basic problem (self driving) has been solved, all you are doing from that point forward is decreasing the cost, and improving (via software) the performance, and the safety of the system.
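Put as a back-of-envelope real-time budget check (the 12 TOPS figure is the one quoted in this thread and the 8-camera count is Tesla's AP2.0 spec; the frame rate and per-frame network cost below are purely my own illustrative assumptions):

```python
# "Enough is enough": once the installed hardware meets the real-time
# perception budget, a 2x-faster chip adds nothing for this one use case.
CAMERA_FPS = 30        # assumed frame rate per camera (illustrative)
NUM_CAMERAS = 8        # AP2.0 camera count
OPS_PER_FRAME = 40e9   # assumed network cost per frame, in ops (illustrative)

required_tops = CAMERA_FPS * NUM_CAMERAS * OPS_PER_FRAME / 1e12
available_tops = 12    # Tesla Vision figure quoted in this thread

print(f"required: {required_tops:.1f} TOPS, available: {available_tops} TOPS")
print("meets budget" if available_tops >= required_tops else "falls short")
# required: 9.6 TOPS, available: 12 TOPS
# meets budget
```

With these made-up workload numbers the budget is met with headroom to spare, and any throughput beyond the requirement is simply unused for this application.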

RT
 

Agreed, although car platforms are usually built for a longer life than computers are. Anyway, it turns out I was comparing apples to oranges: Google lists Parker at 1.5 TFLOPS and Titan at 11 TFLOPS (Nvidia said Parker can do 24 trillion deep-learning operations, but we don't have such a number for Titan).

Bottom line: TFLOPS for TFLOPS, Titan is much faster than Parker. For reference, the 2018 Mobileye 4 is supposed to do 2.5 TFLOPS.
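A quick ratio check on the throughput figures quoted above (all are the marketing/spec numbers from this thread, not measured benchmarks):

```python
# FP32 throughput figures as quoted in this thread (marketing numbers).
titan_tflops = 11.0    # Nvidia Titan
parker_tflops = 1.5    # Nvidia Parker SoC
eyeq4_tflops = 2.5     # 2018 Mobileye 4 figure quoted above

print(f"Titan vs Parker: {titan_tflops / parker_tflops:.1f}x")
print(f"Titan vs Mobileye 4: {titan_tflops / eyeq4_tflops:.1f}x")
# Titan vs Parker: 7.3x
# Titan vs Mobileye 4: 4.4x
```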
 
Pretty crazy.

Seems like a strange name for the video, though. Level 5 is a car that doesn't even have an interface for someone to drive it. So not only can it drive itself, but it's not possible for a human to drive it because it has no controls. Clearly this is a demonstration of Level 4 autonomy, yes?

Kind of wish it was a single cut. I wonder how many trips it took to get all of that right? Still pretty impressive, though.
 
My takeaways:

* Soundtrack irony - black is the only no-cost colour now :)
* The autonomy was being controlled by an external system. There is no route planned in the navigation and the IC does not show AP active.
* When the car is pulling up at the traffic lights, you can see multiple cars in the IC which (IMO) would have triggered AEB, but didn't
* The Autopark scene was clearly staged
* Indication seems a bit flaky - e.g. it seems to indicate when going around the car park, but not when parking for example

Other than that: wow!

Would love to see the car on this track next, please :)