Autonomy Investor Day - April 22 at 2pm ET

For example, if side-mounted lidar or radar were pinging an approaching car running without lights, or a deer at a 90-degree turn where the camera sees only blackness, that would be one scenario where Tesla has no redundancy in their AP2/3 robotaxi suite.
That is diversity/coverage, not redundancy. You are talking about sensor capability, not handling a failure of the sensor. In a dark, no-headlight operating condition, you would need both radar and lidar to have redundancy.

Also, Teslas have adaptive headlights, which illuminate in the direction of a turn. (And humans have worse night vision when driving than cameras do.)
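
To make the distinction concrete, here is a toy sketch (entirely my own illustration, with invented sensor names and condition labels, nothing from Tesla's actual suite). Coverage asks whether any sensor perceives a condition at all; redundancy asks whether at least two independent sensors do, so one can fail or be blocked without blinding the system:

```python
# Toy illustration of sensor coverage vs. redundancy.
# Sensor names and condition labels are invented for the example.
SENSORS = {
    "camera": {"daylight", "lit_night"},
    "radar":  {"daylight", "lit_night", "dark_night", "fog"},
    "lidar":  {"daylight", "lit_night", "dark_night"},
}

def covered(condition: str) -> bool:
    """Coverage: at least one sensor perceives this condition."""
    return any(condition in caps for caps in SENSORS.values())

def redundant(condition: str) -> bool:
    """Redundancy: at least two sensors perceive the condition,
    so losing any single one still leaves it covered."""
    return sum(condition in caps for caps in SENSORS.values()) >= 2

print(covered("dark_night"))    # True: coverage exists (radar or lidar)
print(redundant("dark_night"))  # True only because radar AND lidar both handle it
```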
 
That is diversity/coverage, not redundancy. You are talking about sensor capability, not handling a failure of the sensor. In a dark, no-headlight operating condition, you would need both radar and lidar to have redundancy.

Also, Teslas have adaptive headlights, which illuminate in the direction of a turn. (And humans have worse night vision when driving than cameras do.)

Tesla's adaptive headlights won't help with a fast-approaching object from the side. I am discussing sensor capability, sensor blocking, and failure modes, what have you; all can benefit from redundancy, and in Tesla's case there is awfully little of it in this design compared to other robotaxi designs.

Luckily this is all quite moot now. Either Tesla Level 5 arrives shortly or it was all a big lie, but either way we'll know in a year. This is the sensor suite they have locked in for Level 5 with no geofence. I expect to still be riding in my Model X in a year's time, so I am rooting for this being real and redundancy not being an issue.
 
Kind of peculiar that the simulation shows the exact same route they were driving. Premapped? Or just thoroughly tested?

tesla-fsd.jpg
 
Kind of peculiar that the simulation shows the exact same route they were driving. Premapped? Or just thoroughly tested?

View attachment 399776

Musk says they don't do premapping, so probably just thoroughly tested. Obviously it is a route they have used in simulations, so it has been tested a lot, which made it a good choice for their demo.
 
Some of my thoughts on the autonomy day presentation. I enjoyed watching and learning about the technical details, and I think it's clear that Tesla has made some amazing progress and achieved some technical breakthroughs. That said, overall I have some more critical thoughts about ultimately bringing the feature to production.

I've had my Model 3 for over 6 months. It's an amazing car and definitely the best vehicle I've ever owned and driven. I have EAP and I use it for long-ish drives on the highway (usually 30-60 minutes). I enjoy city driving (small city, little traffic here), but it is interesting to imagine getting in the vehicle in the morning, pressing the 'WORK' button, and having the car take me there while I read the news or enjoy a coffee. I think that is a long way away even with these technical achievements.

The reason I'm a bit critical is that, with all the sophistication Tesla has managed to achieve, there are too many small things that should be easily solvable where Tesla continues to miss the mark. Most of these are not directly related to FSD, but ultimately they are part of the car's software stack, so I think they're important.

Here are a few non-FSD issues I've encountered. I mention them because if Tesla can't fix these, how can I expect 100.0% reliability when operating a "2 tonne death machine" (Elon Musk, Apr 22, 2019)?

- the LTE radio does not automatically come back online after being parked underground with no signal
- the backup camera view breaks, is fixed, breaks, is fixed, with every software release
- streaming Bluetooth causes the main console to occasionally crash and reboot
- can't turn on defrosters from the mobile app (seriously, this should be stupid simple to implement)

There are many other small-ish issues, but in aggregate I get the impression that the software is only ever 99% done, and that elusive 1% always slips through the cracks. I know it's not directly FSD control, but the perception is important.

On the FSD front (from my experience using EAP):

- in the winter in Canada, with any weather event EAP will disengage due to limited sensor input. Driving in slush causes the front radar to stop working; this is a normal occurrence. I'm fine with it, but I can't see a robotaxi fleet dealing with this for months at a time. I've never had an Uber driver unable to drive because it's snowing.

- situations with emergency braking - if the leading car brakes hard while I'm using EAP, I almost always have to take over. The Tesla will brake, but I can feel that the deceleration rate is insufficient for safety, so I take over. I understand the NN can likely be tuned to accomplish this too, but with any potential radar obstruction it would be dealing with limited data. Without a driver to make a final determination, that could lead to a very dangerous situation. If the vehicle gets inconsistent information about an imminent front crash, does it brake hard and risk a chain collision from behind, or does it smash into the car in front? I'd like to see this situation demonstrated in real life or in simulation (i.e. with one or more sensors obscured); a back-of-the-envelope stopping-distance sketch follows at the end of this post.

- the rain sensors!! I think they've gotten better but they're still pretty horrible at sensing what's required

So, in summary, my opinion is that things are advancing significantly, but Tesla needs to up their game in terms of software quality. I also think the sensors need more redundancy: there should be two additional wide-angle, front-facing cameras on the A-pillars (driver and passenger side) or the side-view mirrors, and the rear should have a long-range radar similar to the front one to detect an imminent collision from behind. All the sensors/cameras need to be designed with winter in mind, including self-heating/cleaning. I think if Tesla can solve winter they've pretty much nailed it. But until EAP works seamlessly in snow, I am very skeptical of a wide release.
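
To put rough numbers on the hard-braking point above, here is a back-of-the-envelope stopping-distance check. It is plain constant-deceleration kinematics with invented example values, not anything Autopilot actually computes:

```python
# Back-of-the-envelope check of how hard a following car must brake when the
# lead car brakes hard. Plain constant-deceleration kinematics with invented
# example numbers; NOT how Autopilot computes anything.

def required_decel(v_follow, v_lead, gap, a_lead, t_react):
    """Deceleration (m/s^2) the follower needs so that its reaction distance
    plus braking distance fits within the gap plus the lead car's stopping
    distance. A simple, conservative stopping-distance comparison."""
    lead_stop = v_lead ** 2 / (2 * a_lead)           # lead car's braking distance (m)
    budget = gap + lead_stop - v_follow * t_react    # distance the follower has to stop in (m)
    if budget <= 0:
        return float("inf")                          # no braking rate avoids contact
    return v_follow ** 2 / (2 * budget)

# Both cars at ~110 km/h (30 m/s), 20 m gap, lead brakes at 8 m/s^2,
# and the system hesitates for 1.0 s before braking.
a = required_decel(v_follow=30.0, v_lead=30.0, gap=20.0, a_lead=8.0, t_react=1.0)
print(f"needed deceleration: {a:.1f} m/s^2")  # ~9.7 m/s^2, right around the ~1 g tire limit
```

Even in this toy example, one second of hesitation at a 20 m gap pushes the required braking to roughly 1 g, which is why a late or timid response from the system feels unsafe.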
 
Finally saw the replay last night. I hung on every word of the chip presentation. I’m the kind of nerd who read Microprocessor Report in the ‘90s for the pure joy of learning whose translation lookaside buffer had the lowest miss rate. And my interest in neural nets dates back to Rumelhart’s “Parallel Distributed Processing.”

With that said, the single comment I found most noteworthy was Elon's saying that Tesla's cars now shipping (by which I assume he meant the Model 3) are designed for a million-mile useful life, the same spec as for large trucks. Doing this obviously makes achieving a gross cost that supports a $35k price more difficult. The same is true of already shipping all the systems redundancy needed to achieve reliable autonomy. And in terms of profitability, his comment that the cost of developing autonomy was essentially the company's whole expense structure was also telling.

This is not meant as a criticism: I want them to succeed and I am not a short. My high level conclusion from all this – which has been pretty obvious all along, but the comments in this presentation somehow drove it home for me at a more emotional level – is that Elon really has a completely different view of what business he is in, and of which success factors are important and which are not, than any other car company. A view that is differentiated by much more than electrification. He is pursuing a certain vision of the future of transportation, whether in space, in tunnels, or on surface roads, that is not evolutionary. And he is pursuing his vision even if it takes him off the path of maximizing business success and survival in the nearer term, that path being one where he would refresh his luxury car to have more luxury, cost-cut his affordable car to have more affordability, fix the bugs and deficiencies in his driver-facing software, and use his expense structure to generate more sales either through advertising or more frequent styling changes to stimulate repeat buying.

No, he is pursuing a vision of what customers ultimately will want, even though they don’t realize it yet (a la Steve Jobs, only more so). I will say that I, for one, shudder at the thought of buying a car only to let a bunch of probably vandalous strangers ride around in it while I am having dinner. I similarly shudder at the thought that the most fun-to-drive car I’ve ever owned (and I’ve had some good ones) won’t be shipped with a steering wheel in a few years. But then I’m 60 years old, so I’m probably not the best indicator of trends in the future.

It will be interesting to see how it all unfolds! FWIW I thought both the hardware and software presentations were very good. It seems like their on-chip memory architecture allows the multiply-accumulate array to have much better utilization than would GPUs that fetch their data off chip. I would also love to ask them about the number of distinct neural nets they now use and plan to use, and their specific roles, particularly in path planning/driving policy, which they implied is now mostly heuristics implemented on conventional CPUs, and relatedly I would ask about the different levels of abstraction that the different nets operate at: e.g. pixel processing vs. processing various representations of identified objects.
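
As a rough way to see why feeding the multiply-accumulate array from on-chip SRAM matters, here is a toy roofline-style calculation. All numbers are round, illustrative assumptions of my own, not Tesla's (or anyone's) actual specs:

```python
# Toy roofline model: attainable throughput = min(peak compute,
# memory bandwidth x arithmetic intensity). Illustrative round numbers only.

def utilization(peak_tops, bandwidth_gb_s, ops_per_byte):
    """Fraction of the MAC array kept busy under a simple roofline model."""
    attainable_tops = min(peak_tops, bandwidth_gb_s * ops_per_byte / 1000.0)
    return attainable_tops / peak_tops

PEAK_TOPS = 36.0   # hypothetical peak throughput of the MAC array
INTENSITY = 50.0   # ops performed per byte moved; depends on the layer

# Fed from wide on-chip SRAM (assume ~1000 GB/s): compute-bound, fully utilized.
print(utilization(PEAK_TOPS, bandwidth_gb_s=1000.0, ops_per_byte=INTENSITY))  # 1.0
# Fed over a narrower off-chip link (assume ~300 GB/s): bandwidth-bound, array idles.
print(utilization(PEAK_TOPS, bandwidth_gb_s=300.0, ops_per_byte=INTENSITY))   # ~0.42
```

The exact numbers don't matter; the point is that once data has to come from off chip, bandwidth rather than the size of the MAC array sets the ceiling.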
 
Kind of peculiar that the simulation shows the exact same route they were driving. Premapped? Or just thoroughly tested?

View attachment 399776

Great catch! Also gives a flavour of how visually accurate their simulation environment is.

On another note, anyone else think it was strange that Google's TPU2 wasn't mentioned in comparison to the NN engines in HW3?
 
Musk says they don't do premapping, so probably just thoroughly tested. Obviously it is a route they have used in simulations, so it has been tested a lot, which made it a good choice for their demo.

That looks like the Palo Alto hillside just behind Tesla HQ. Where else are they going to do the test drive or make videos?

No, he is pursuing a vision of what customers ultimately will want, even though they don’t realize it yet (a la Steve Jobs, only more so). I will say that I, for one, shudder at the thought of buying a car only to let a bunch of probably vandalous strangers ride around in it while I am having dinner. I similarly shudder at the thought that the most fun-to-drive car I’ve ever owned (and I’ve had some good ones) won’t be shipped with a steering wheel in a few years. But then I’m 60 years old, so I’m probably not the best indicator of trends in the future.

Or you could use the car yourself, as I think most of us will, and treat FSD as your 24/7 personal chauffeur who can drive you on long trips, drop you at downtown restaurants, and pick up the kids at school for you. After many years of that, when you are ready to get a new or different car, you can put it up for rental to generate income, or sell it to others who want to do that and probably get a much better price than a 5-10 year old Audi or Honda would fetch.

My thought is that Elon is pushing for robotaxis because he could replace many more ICE cars on the road this way. He'd be OK if some people don't want to share their cars. Also, many multi-car families could still reduce the number of (ICE) cars they own when they can share one car among themselves this way.
 
Of course there is nothing new about the story.

But this time it does not come from a startup; it comes from a company with the product in production and shipping to customers, saying that getting to feature complete for Level 5 with no geofence happens by the end of this year.

They must be close — or they must be lying. That is what the market is now pondering.
Based on the market reaction, Mr Market believes the latter. Of course, technically it is not lying if Musk believes it himself.
 
Tesla does not have adaptive headlights. They have what they call adaptive headlights, but it's just a gimmick: the lights simply come on as the steering column turns. That isn't adaptive, and the headlights are not great to begin with.

"Cornering" or "turning" lights is of course the correct industry term, and yes, I agree they are usually separate from adaptive headlights. I didn't see a need to get stuck on that, given the point stands for both cornering and adaptive headlights — they will not light up a fast-approaching object at a 90-degree intersection. :)
 
My thought is that Elon is pushing for robotaxis because he could replace many more ICE cars on the road this way. He'd be OK if some people don't want to share their cars. Also, many multi-car families could still reduce the number of (ICE) cars they own when they can share one car among themselves this way.

I agree that is part of his vision and that saving the world is definitely part of his motivation for the whole thing.
 
My thought is that Elon is pushing for robotaxis because he could replace many more ICE cars on the road this way. He'd be OK if some people don't want to share their cars. Also, many multi-car families could still reduce the number of (ICE) cars they own when they can share one car among themselves this way.

I just assume that it's because if you are quick to market with a cheap solution, you will make absolutely massive amounts of money with which you can go to Mars.
 
So far the market seems quite neutral... probably more pressed by concerns about tomorrow's results. I would agree the market is not jumping with joy, which could suggest it is in a wait-and-see posture.

Exactly. The market wants Tesla to have a product ready to sell to willing customers to resolve its cash problems. The rest is hyperbole and possibility.
 
Right now AP will run into stopped cars, Jersey barriers, and semi trucks. Obviously they plan on fixing that, but I would assume that having a secondary sensor that could detect those things would help. Once you start operating in much more complicated urban environments, I would imagine you'll find more situations where the camera and neural net miss things.
Again, I'm not saying lidar is necessary, but it sure doesn't hurt. This whole lidar argument is sort of silly, and it sounds like it's been going on in this forum forever. There's no way to prove anything until someone achieves safety greater than a human's with cameras alone.

I think for the foreseeable future, you will need radar as a backup to get regulatory approval, and to have some emergency functions in reduced visibility.

I can see Uber charging surge rates in bad weather, when the robotaxis go out of service due to very poor visibility. Of course, if it is that bad, maybe Uber should not be operating either.
 
Several people seem to be of the view that Tesla is still miles away, because they're still using straightforward heuristics for the driving component, while working hard on the image recognition.

In my layman's terms, it takes a human child years to be able to understand the world around them and know what they are looking at. I see this every day with my own kid, shouting out "bus" every time he sees a car bigger than a Corolla. But once that same human understands the world around them, it's a relatively trivial exercise to learn how to drive (30 hours of lessons?).

Does this comparison make sense to the experts or am I completely wrong?

You are basically right. The neural net learns to recognize things the same way your kid does: mostly by trial and error. However, it has a simpler goal, in that the recognition only needs to relate to driving a car. Instead of needing to tell whether an object is a tree, a car, or an elephant, it only needs to put everything into one of two categories: definitely hazardous or definitely not hazardous. The challenge now is to be able to solve all the edge cases, those you don't see on the road often. That's where all those Tesla cars on the road sharing what they see come in handy. Makes sense?
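
To make that two-bucket framing concrete, here is a toy numpy sketch (purely illustrative, with random weights and invented features, not Tesla's architecture). The same features can feed either a softmax over thousands of named classes or a single "hazard or not" score, and the second target is much simpler:

```python
# Toy contrast between naming every object and a single hazard/no-hazard output.
# Random weights and invented features; purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
features = rng.normal(size=(6, 4))       # 6 detected objects, 4 invented features each

# Multi-class head: probabilities over many named classes (tree, car, elephant, ...).
W_multi = rng.normal(size=(4, 1000))     # 1000 object classes to tell apart
logits = features @ W_multi
p_class = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Binary head: one score for "is this a hazard for the driving path?"
w_hazard = rng.normal(size=4)
p_hazard = 1.0 / (1.0 + np.exp(-(features @ w_hazard)))
print((p_hazard > 0.5).astype(int))      # 1 = treat as hazardous, 0 = safe to ignore
```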
 
It's not a lie if you believe it.
I feel like it was a good presentation of the infrastructure they have set up to pursue FSD. They emphasized their competitive advantage over other players in the market. It's hard to argue that their timeline is wrong when they're trying to do something that no one knows how to do or has successfully done. Maybe it is possible for them to do it in a year? It's impossible to disprove.
Extraordinary claims require extraordinary evidence.