
Autonomy Investor Day - April 22 at 2pm ET

I just assume that it's because if you are quick to market with a cheap solution, you will make absolutely massive amounts of money with which you can go to Mars.

Not sure if that makes sense. Remember, one of the greatest selling points of Tesla cars is that they are nice to drive. Brands like Porsche and Ferrari are adamant that their cars will (of course) never be autonomous. Tesla could sell more cars and make more money by putting the resources used to develop FSD into making its cars even nicer to drive. Or it could just keep that money (it sounds like a lot has been spent on FSD) as profit, and the stock price could shoot up to Mars.

Elon either had the vision and did not want to be disrupted by the inevitable or, like I mentioned, he wanted to replace as many ICE cars as possible. Those gas-burning Uber cars running on city streets probably made him, and a lot of others, cringe. Don't despair if you're a Tesla investor, though. The monetization opportunities would be beyond your and my wildest imagination if Tesla can bring FSD to market a couple of years ahead of everyone else. Judging from the technology part of yesterday's presentation, there is a distinct likelihood that that will be the case.
 
Kind of peculiar that the simulation showed the exact same route they were driving. Premapped? Or just thoroughly tested?
That's an interesting observation. I'm sure they have thoroughly tested the route (and perhaps optimized the training data) for the demo, both in simulations and in actual driving. But there could be a more benign explanation: my guess is that the area around the Palo Alto HQ (which is where Karpathy's team is located) is one of their favorite and most detailed simulation scenarios, because they can easily test training results in the real environment right on their doorstep.
 
That's an interesting observation. I'm sure they have thoroughly tested the route (and perhaps optimized the training data) for the demo, both in simulations and in actual driving. But there could be a more benign explanation: my guess is that the area around the Palo Alto HQ (which is where Karpathy's team is located) is one of their favorite and most detailed simulation scenarios, because they can easily test training results in the real environment right on their doorstep.

It definitely is the area behind the Tesla HQ. That said, it's still only a demo, which can be tested and practiced. One thing that is better than the rest we've seen is that it's one continuous run instead of an assembly of many short clips. In the end, it is still what Tesla, or whoever, makes available to cars in customers' hands that counts.
 
It definitely is the area behind the Tesla HQ. That said, it's still only a demo, which can be tested and practiced. One thing that is better than the rest we've seen is that it's one continuous run instead of an assembly of many short clips. In the end, it is still what Tesla will make available to cars in customers' hands.

Seems like they are staying within their comfort zone. Not sure why you wouldn't want to showcase a new/more challenging scenario.
 
That's an interesting observation. I'm sure they have thoroughly tested the route (and perhaps optimized the training data) for the demo, both in simulations and in actual driving. But there could be a more benign explanation: my guess is that the area around the Palo Alto HQ (which is where Karpathy's team is located) is one of their favorite and most detailed simulation scenarios, because they can easily test training results in the real environment right on their doorstep.

If they did that then wouldn't they have to file disengagements in their disengagement report?
 
Seems like they are staying within their comfort zone. Not sure why you wouldn't want to showcase a new/more challenging scenario.

Yes, it's still a work in progress. They are only starting to test the new SW/HW and are probably not that comfortable with showing more than that. Even if it's a demo, it's still not a commercial. Not sure I can say the same for some of the others I've seen.

Musk says they don't do premapping. So probably just thoroughly tested. Obviously it's a route they have used in simulations, so it has been tested a lot, which made it a good choice for their demo.

Not only do they not premap, but Elon said premapping does not work for their system, which relies on general perception. As a matter of fact, he does not think it works for the others who have to rely on it either.
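To illustrate the distinction (a rough sketch only; every name and data structure below is invented for illustration and is not Tesla's, or anyone's, actual software): a premapped system drives off geometry looked up in a prebuilt HD map and fails wherever the map is stale or missing, while a general-perception system infers the geometry from live sensor input and so can, in principle, handle roads it has never seen.

```python
# A rough sketch of the two philosophies; all names and data here are
# invented for illustration, not Tesla's (or anyone's) actual software.

def lanes_from_hd_map(tile_id, hd_map):
    """Premapped approach: lane geometry is looked up in a prebuilt
    high-definition map, so the car can only drive where the map is
    current and its localization is precise."""
    tile = hd_map.get(tile_id)
    if tile is None:
        raise RuntimeError("Off the map: this approach cannot drive here.")
    return tile["lanes"]  # treated as ground truth

def lanes_from_perception(camera_frame, lane_net):
    """General-perception approach: lane geometry is inferred from live
    camera input every frame, so the same stack generalizes to roads it
    has never seen (at the cost of having to get perception right)."""
    return lane_net(camera_frame)  # inferred, not looked up

# Tiny usage illustration with stand-in data:
hd_map = {"tile_42": {"lanes": ["lane geometry for tile 42"]}}
print(lanes_from_hd_map("tile_42", hd_map))
print(lanes_from_perception("fake frame", lane_net=lambda f: ["inferred lanes"]))
```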

If they did that then wouldn't they have to file disengagements in their disengagement report?

If the car is not a test car but a real production car, perhaps owned by an employee, then it does not need to be covered in the disengagement report. Remember Tesla was recruiting employee owners to test HW3? As a matter of fact, all Tesla drivers are helping Tesla test and develop FSD in shadow mode. That's one of the great advantages Tesla gets by making all of its production cars act like test cars.
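Tesla has only described shadow mode at a high level, but the idea reduces to something like the sketch below: run the candidate software passively alongside the human driver, never actuate it, and upload the frames where the two disagree. All names and the tolerance value here are hypothetical, not Tesla's actual code.

```python
# Hypothetical shadow-mode loop. The candidate planner runs passively;
# only the human's commands reach the actuators. The names and the
# 5-degree tolerance are invented for illustration.

STEERING_TOLERANCE_DEG = 5.0

def shadow_mode_step(sensor_frame, human_action, candidate_planner, upload_queue):
    planned = candidate_planner(sensor_frame)   # computed, never actuated
    disagreement = abs(planned - human_action)  # compare against the human
    if disagreement > STEERING_TOLERANCE_DEG:
        # An interesting frame: queue it as training/validation data.
        upload_queue.append((sensor_frame, planned, human_action))
    return human_action                          # the human always drives

# Tiny usage illustration with stand-in values (steering angles in degrees):
queue = []
shadow_mode_step("fake frame", human_action=0.0,
                 candidate_planner=lambda f: 12.0, upload_queue=queue)
print(len(queue))  # 1 -- the disagreement was logged
```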
 
Looks like Nvidia was watching Tesla's Autonomy Investor Day presentation yesterday. They posted this blog post defending Nvidia's chips, but they do offer some compliments to Tesla:

Tesla, however, has the most important issue fully right: Self-driving cars—which are key to new levels of safety, efficiency, and convenience—are the future of the industry. And they require massive amounts of computing performance.

Indeed Tesla sees this approach as so important to the industry’s future that it’s building its future around it. This is the way forward. Every other automaker will need to deliver this level of performance.

There are only two places where you can get that AI computing horsepower: NVIDIA and Tesla.
Tesla Raises the Bar for Self-Driving Carmakers | The Official NVIDIA Blog

Interesting that Nvidia feels that they and Tesla are the only places offering that level of AI computing power.
 
If they did that then wouldn't they have to file disengagements in their disengagement report?
That is a good question. My guess is that they used this clause from the CA DMV regulation on autonomous vehicle testing:

"An autonomous test vehicle does not include vehicles equipped with one or more systems that provide driver assistance and/or enhance safety benefits but are not capable of, singularly or in combination, performing the dynamic driving task on a sustained basis without the constant control or active monitoring of a natural person."

So as long as the system is considered just a driver assistance system, they don't need to report. Also, the reports are due only once a year (in January). If they cross the threshold from driver assistance to autonomous driving this year, we'd only see a report in 1/2020.
 
It is quite different. Lidar and cameras are both optical systems with the same common failure modes, which greatly reduces redundancy.
Radar is a microwave pulse that works effectively where optical systems do not, hence you get a much more robust system.

Lidar, cameras, and radar are all based on electromagnetic radiation, but they all operate at different wavelengths; nobody is using the visible spectrum in their lidar. Lidars tend to operate in the infrared, some at wavelengths very close to visible light and some in the far infrared. Those different wavelengths have very different characteristics.

Lidar is also active, whereas cameras are passive, which again gives different characteristics. Lidar can see in total darkness, for example.

Radar is farther from the visible spectrum than lidar, for sure, but that doesn't mean lidar provides no diversity over cameras. It absolutely provides diversity over cameras alone.

Cameras, lidar, and radar together are what every serious AV company is using, because each type of sensor provides features that the other two do not.
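As a rough illustration of that complementarity (the wavelength figures below are commonly published ballpark numbers; the 2-of-3 vote is a toy example, not how any production stack actually fuses sensors):

```python
# Approximate operating bands of the three modalities (ballpark figures):
SENSOR_BANDS = {
    "camera": "visible light, roughly 400-700 nm (passive)",
    "lidar":  "near infrared, commonly ~905 nm or ~1550 nm (active)",
    "radar":  "millimeter wave, ~76-81 GHz, i.e. ~4 mm (active)",
}

def fused_detection(camera_sees, lidar_sees, radar_sees):
    """Toy 2-of-3 vote showing why diversity matters: no single
    modality's weakness (darkness for cameras, fog/rain for lidar,
    low angular resolution for radar) decides the outcome alone."""
    return sum([camera_sees, lidar_sees, radar_sees]) >= 2

# E.g. at night in light fog: the camera misses, lidar and radar still see.
print(fused_detection(camera_sees=False, lidar_sees=True, radar_sees=True))  # True
```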
 
...
Tesla has extremely little redundancy across the different directions, and indeed some blind spots low in front, and unfortunately none of this was explained away yesterday.
Unlike YOU, who as a human driver has 17 eyes pointed in all directions simultaneously. You of course have full redundancy and no weak areas of observation. /s
 
That's an interesting observation. I'm sure they have thoroughly tested the route (and perhaps optimized the training data) for the demo, both in simulations and in actual driving. But there could be a more benign explanation: my guess is that the area around the Palo Alto HQ (which is where Karpathy's team is located) is one of their favorite and most detailed simulation scenarios, because they can easily test training results in the real environment right on their doorstep.

Seems like a pretty varied route too, with the stop signs, cloverleafs, etc.

Looks like Nvidia was watching Tesla's Autonomy Investor Day presentation yesterday. They posted this blog post defending Nvidia's chips, but they do offer some compliments to Tesla:


Tesla Raises the Bar for Self-Driving Carmakers | The Official NVIDIA Blog

Interesting that Nvidia feels that they and Tesla are the only places offering that level of AI computing power.

In all honesty, it's not a bad option either :)
 
Teslas have adaptive headlights which illuminate in the direction of a turn.

Tesla does not have adaptive headlights. What they call adaptive headlights is just a gimmick: the lights simply come on as the steering column turns. That isn't adaptive, and the headlights are not great to begin with.

And Model 3s have neither, and they already make up a (quickly growing) majority of HW2+ Teslas on the road.
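For what it's worth, the behavior described above is trivially simple to implement, which is why it reads as a gimmick next to a genuinely adaptive (beam-swiveling) system. A hypothetical sketch of the two, with invented thresholds and gains:

```python
# Hypothetical contrast between a fixed cornering lamp (the behavior
# described above) and a genuinely adaptive, swiveling headlight.
# All thresholds and gains are invented for illustration.

CORNERING_THRESHOLD_DEG = 20.0  # steering angle that switches the lamp on

def cornering_lamp(steering_deg):
    """'Gimmick' version: a fixed side lamp simply switches on once the
    steering wheel passes a threshold. The main beam never adapts."""
    return {"left_lamp": steering_deg < -CORNERING_THRESHOLD_DEG,
            "right_lamp": steering_deg > CORNERING_THRESHOLD_DEG}

def adaptive_headlight(steering_deg, speed_kph):
    """Adaptive version: the main beam swivels continuously with
    steering, damped at speed so the beam doesn't whip around."""
    gain = 0.15 * min(1.0, 60.0 / max(speed_kph, 1.0))
    swivel = max(-15.0, min(15.0, steering_deg * gain))
    return {"beam_azimuth_deg": swivel}

print(cornering_lamp(30.0))                    # right lamp on, nothing else
print(adaptive_headlight(30.0, speed_kph=30))  # beam swivels ~4.5 degrees
```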
 
  • Like
Reactions: croman
Unlike YOU, who as a human driver has 17 eyes pointed in all directions simultaneously. You of course have full redundancy and no weak areas of observation. /s

I do have some abilities that a stationary camera does not. For one, I can move my head. Second, I can clean my eyes if something gets in them. Third, I have redundant vision (two eyes) in all directions by moving my head. I can even leave the car if need be.

For example: Tesla's entire coverage of a 90-degree intersection, leftwards or rightwards, is the respective B-pillar camera. No other sensor can see the intersection to the left or right. There is no radar pointing there, the ultrasonics do not see far enough nor are they fast enough, and the side marker cameras have no coverage there. The fisheye sees forward, but not wide enough.

You are making life-and-death decisions to merge based on the input of a single, small, stationary camera without a wiper.
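The coverage claim is easy to sanity-check with basic angular geometry. The camera axes and fields of view below are rough, unofficial estimates for illustration only, not Tesla's published specs:

```python
# Toy coverage check. Camera pointing directions and FOVs are rough,
# unofficial estimates; bearings are in degrees, with 0 = straight
# ahead and +90 = directly to the right.
CAMERAS = {
    "main":           {"axis_deg":   0, "fov_deg":  50},
    "fisheye":        {"axis_deg":   0, "fov_deg": 120},
    "b_pillar_left":  {"axis_deg": -90, "fov_deg":  80},
    "b_pillar_right": {"axis_deg":  90, "fov_deg":  80},
}

def cameras_covering(bearing_deg):
    """Return the cameras whose field of view contains a bearing."""
    hits = []
    for name, cam in CAMERAS.items():
        # Smallest signed angle between the bearing and the camera axis.
        delta = (bearing_deg - cam["axis_deg"] + 180) % 360 - 180
        if abs(delta) <= cam["fov_deg"] / 2:
            hits.append(name)
    return hits

# Cross traffic at a 90-degree intersection, to the right:
print(cameras_covering(90))  # ['b_pillar_right'] -- a single camera covers it
```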
 