Autonomy Investor Day - April 22 at 2pm ET

SAE does mention the steering wheel, and it is widely accepted that Level 5 is when it can come off; it is the whole hands off (Level 2), eyes off (Level 3), mind off (Level 4), steering wheel off (Level 5) mantra after all, though that part is unofficial of course. :) SAE does mention it may be removed for specific applications at Level 4, but what Tesla is picturing here is a Model 3 being sold without any steering wheel as a consumer car, and that can obviously only be Level 5.

I'm not sure that they couldn't get away with Level 4, since I doubt many people will take their Model 3 off-roading on Autopilot. But I agree that Tesla has made it obvious that they're shooting for Level 5.

What the Tesla narrative completely misses about Lidar and HD maps, though (another surprise from Elon being that people using HD maps are doomed just like those using Lidar), is that they are not something to be relied on but something to provide you with an additional layer of redundancy.

It is more obvious by the day that Tesla is aiming for the easiest way of producing a self-driving car, and as something becomes too costly or too difficult (like relying on radar in the winter, or the more complex NoA they had prior to the current one — that we assumed used HD maps), Elon just discards it and goes for the path of least resistance. Redundancy is hard, so he goes for vision only (he is even on record saying so about their own forward radar) and forgets about using HD maps. Sensor and data fusion IS hard, of that I have no doubt. But the layer of sanity checking and double checking it can offer for safety can and likely will mean "orders of magnitude" safer autonomous vehicles (and driver's aids).

Because the thing about HD maps (constantly updated through the fleet, of course, like MobilEye's EyeQ4) or 360-degree radar or 360-degree Lidar, when done right, is not that they limit you anywhere. It is that they provide the car with a second or third opinion when they are available. Say your vision gets blocked or severely diminished and you need to navigate to a safe space. Having secondary sensors and HD maps certainly helps plot a better course than basically nothing but a memory of what you last saw.

As with any redundant system, it needs to weigh every piece of data and come to the best possible conclusion. With a non-redundant system, you only have one possible conclusion, but that conclusion may be worse than the one a redundant system would have reached. The non-redundant system is easier to implement, which is why I wager it is so inviting to Elon, but how safe it ends up being remains to be seen.
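A minimal sketch of what "weigh every piece of data" could look like, using made-up numbers and simple inverse-variance weighting (just an illustration, not how any particular system actually does it):

```python
# Minimal sketch of weighing redundant sensor readings (hypothetical values).
# Each source reports a distance to the same obstacle plus a variance that
# reflects how much we trust it under current conditions.

def fuse_estimates(readings):
    """Inverse-variance weighted average: low-variance (trusted) sources
    dominate, but no single source is ignored outright."""
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    return sum(w * dist for (dist, _), w in zip(readings, weights)) / total

# (distance_m, variance) from vision, radar and an HD-map prior -- made up.
readings = [(42.0, 1.0),   # camera estimate, good light
            (40.5, 0.25),  # radar estimate, accurate range
            (41.0, 4.0)]   # HD-map prior, coarse

print(round(fuse_estimates(readings), 2))  # lands close to the radar value
```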

That said, what concerns me more at this time is not redundancy but Tesla getting their vision reliability and versatility on par with MobilEye. Their biggest issue at the moment is unreliable vision, not lack of redundancy... but looking forward, redundancy may matter too, as this, the Model 3 sold today, is the suite and fleet Tesla aims to start using for the Tesla Network as the leases end.

I think that Elon's point is that the companies relying primarily on GPS, maps or Lidar are doomed. I don't think he intentionally means to poke fun at, say, MobilEye, who obviously are a 'vision first' autonomy developer. But he is poking fun at the ones who aren't.

Like with the cars themselves, I don't think he wants to be the only player in the game. He's hoping other companies will take up the torch and provide a variety of options (to different market segments than Tesla). The one place he is looking to be the number one though is autonomous ride sharing, and mostly by getting there first.

I don't disagree that Lidar and maps can be used for redundancy. But I think the key here is cost and speed. We all know how laughably outdated SatNavs can be, and Google Maps isn't always spot on either. Using maps is a huge expense with constant additional costs for updates. I can see why Tesla doesn't want to tie into some additional subscription route for a product that's out of date as soon as it's compiled.

We know Lidar is expensive. I expect that if they become $10 apiece then we'll see them added to Teslas. It would be a no-brainer to have that sort of redundancy. I just don't think it's the right fit, right now, for them. It wouldn't surprise me if Hardware 4, in 2 years, has the option to plug one in. But I do think that Elon's point stands: you have to get the vision right first.

Tesla's whole gig is to run as fast as it can to keep ahead of the companies who otherwise, with their massive cash reserves, would crush them. I think that's a major reason for the huge, super-fast push to get the cheapest generalised version of autonomy working and out there as soon as possible.
 
But I do think that Elon's point stands: you have to get the vision right first.

I don’t disagree with that — that is why I think MobilEye is very promising and why I mostly have concerns about Tesla’s vision reliability at this time, not their redundancy. (We’ll worry about redundancy when their vision works first...)

What does set Elon and thus Tesla apart though is that they have now doubled down on this being a Level 5 suite they are going to use as a robotaxi fleet. The suite they sell today. They are making bold Level 5 plans for this suite with very limited sensor redundancy.

Elon also said Lidars are unnecessary appendices that you don’t need... he did clearly say you don’t need them at all. He is also on record on Twitter saying Tesla plans to go vision only regarding the radar too. It doesn’t feel like he takes redundancy seriously; quite the opposite, he makes fun of it.
 
Not sure that radar is being viewed purely as a redundant system for vision. Along with ultrasound it has its uses in poor visibility (but would any of us have the confidence to trust any system in that sort of weather?)
 
@malcolm All sensor types have unique strengths and weaknesses (Lidar included), so certainly a good system makes use of them all and decides what matters and when. You can train an NN to take inputs from many types of sensors and make the judgement call on which to trust and when, for truly superhuman sensing many times over.
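As a toy illustration of that idea (my own sketch in PyTorch, not anything Tesla or MobilEye has published), a small network can encode each modality and learn a gate that decides how much weight each sensor's features get:

```python
import torch
import torch.nn as nn

class SensorFusionNet(nn.Module):
    """Toy multi-sensor network: encode each modality, then learn a softmax
    gate that decides how much weight each sensor's features get."""
    def __init__(self, cam_dim=64, radar_dim=16, lidar_dim=32, hidden=32):
        super().__init__()
        self.cam = nn.Linear(cam_dim, hidden)
        self.radar = nn.Linear(radar_dim, hidden)
        self.lidar = nn.Linear(lidar_dim, hidden)
        self.gate = nn.Linear(hidden * 3, 3)   # one trust weight per sensor
        self.head = nn.Linear(hidden, 1)       # e.g. distance to lead vehicle

    def forward(self, cam, radar, lidar):
        feats = torch.stack([torch.relu(self.cam(cam)),
                             torch.relu(self.radar(radar)),
                             torch.relu(self.lidar(lidar))], dim=1)   # (B, 3, H)
        weights = torch.softmax(self.gate(feats.flatten(1)), dim=1)   # (B, 3)
        fused = (weights.unsqueeze(-1) * feats).sum(dim=1)            # (B, H)
        return self.head(fused), weights

net = SensorFusionNet()
out, w = net(torch.randn(4, 64), torch.randn(4, 16), torch.randn(4, 32))
print(out.shape, w.shape)   # torch.Size([4, 1]) torch.Size([4, 3])
```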
 
What the Tesla narrative completely misses about Lidar and HD maps, though (another surprise from Elon being that people using HD maps are doomed just like those using Lidar), is that they are not something to be relied on but something to provide you with an additional layer of redundancy.

Your statement is internally inconsistent.
If you use lidar data as a reason to ignore your vision data, then lidar is your primary sensor, not a redundant one (plastic bag versus tire in the road). If you ignore your vision data because it doesn't match your HD map, then the map is the primary data source (construction bypass). If you cease operation because of a data conflict, you're giving both sources equal billing and have made your system more brittle.
You can't have a 'helping' sensor; you either rely on it or you don't.
(Yes, there can be edge cases where lidar detects an object that vision does not, but that should be fixed by improving the vision system. On Teslas, the radar covers the majority of that space.)

Trite, but fun, saying from the days of sailing ships:
When you go out to sea, never take 2 clocks, either one or three...
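To make the primary-vs-redundant point concrete, here is a hypothetical arbitration sketch: whichever rule breaks a disagreement is the de facto primary sensor, halting on any conflict is the brittle "two clocks" case, and three independent sources at least allow a vote.

```python
# Hypothetical arbitration over boolean "obstacle ahead?" calls from
# independent sources -- purely illustrative, not any shipping system.

def vision_primary(vision, lidar):
    return vision                      # lidar is informational only

def halt_on_conflict(vision, lidar):
    if vision != lidar:
        return "pull over"             # two clocks: no way to break the tie
    return vision

def majority_vote(vision, lidar, radar):
    votes = [vision, lidar, radar]
    return votes.count(True) >= 2      # three clocks: the odd one out loses

print(vision_primary(False, True))      # plastic bag vs tire: vision wins
print(halt_on_conflict(False, True))    # disagreement stops the system
print(majority_vote(False, True, True)) # vote overrules a single bad reading
```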
 
@Lasairfion Because the thing about HD maps (constantly updated through the fleet, of course, like MobilEye's EyeQ4) or 360-degree radar or 360-degree Lidar, when done right, is not that they limit you anywhere. It is that they provide the car with a second or third opinion when they are available.
Based on the patents and demonstrations from Tesla, I think it was slightly disingenuous of Tesla to say they aren't using HD maps. You saw the 3D radar rendering in the demo, very neat. They've patented a way to improve GPS location using cameras, again very clever. They clearly are doing what you suggest; the data is there. Elon said they used to use HD maps. What they found, taking Elon's word at face value, is that the HD maps sometimes impaired driving. I think the confusion is Tesla's emphasis that the mapping data cannot be a primary source of information; that it is, as you suggest, there as a fail-safe only, but the vision should always be most trusted.

I also suspect there's a terminology discrepancy for the sake of marketing. HD mapping created using vision+radar+magic instead of LiDAR is something Elon puts under the banner of vision and refuses to call it "HD maps" to differentiate it from competitors. Again, it's not a lack of data. They've shown that.
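As a rough illustration of the camera-assisted localisation idea (my own simplification, not the patented method): each mapped landmark the cameras recognise implies a car position, and those can be averaged and blended with the coarse GPS fix.

```python
# My own simplification of camera-assisted localisation, not Tesla's patent.
# The map stores landmark positions in world coordinates; the cameras measure
# each landmark's offset from the car, so every landmark implies a car
# position. Average those and blend with the GPS fix.

def refine_position(gps_fix, map_landmarks, camera_offsets, gps_weight=0.2):
    implied = [(mx - ox, my - oy)
               for (mx, my), (ox, oy) in zip(map_landmarks, camera_offsets)]
    vx = sum(p[0] for p in implied) / len(implied)
    vy = sum(p[1] for p in implied) / len(implied)
    return (gps_weight * gps_fix[0] + (1 - gps_weight) * vx,
            gps_weight * gps_fix[1] + (1 - gps_weight) * vy)

# Made-up numbers: GPS is a few metres off; two mapped signs seen by cameras.
print(refine_position(gps_fix=(103.0, 52.0),
                      map_landmarks=[(120.0, 60.0), (90.0, 70.0)],
                      camera_offsets=[(20.0, 10.0), (-10.0, 20.0)]))
```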
 
The difference between L4 and L5 is huge.

L4 is limited to certain geolocations and certain roads or weather conditions.
L5 is L4 without any limits: no geofence, no restriction to certain roads, no weather limits.
L5 is the ability to drive anywhere a human can drive, and in any weather condition.
 
That is because you misunderstood them ... they do use HD maps, they just aren't relying on them as a primary guide for steering.
They have HD maps and can use them where they can be of use ...

No one misunderstood him; he presented it in a way that makes not using them look superior.

"I think HD maps are a mistake. We actually had HD Maps for a while, actually canned that because you either need HD maps, in which case, if anything changes about the environment, the car will break down.

Or you don't need HD maps, in which case, why are you wasting your time doing HD maps. So the HD maps thing is, like the two main crutches that should not be used and will, in retrospect be obviously false and foolish are LIDAR and HD mass. Mark my words."​
 
How are they going to solve “the last mile” problem? It’s a huge cost in telecom, wouldn’t it be here too?

In March, I was trying to get to Jersey Mike’s subs in Conroe.

Jersey Mike’s
3091 College Park Dr
Unit 150
Conroe, TX 77384


The nav took me to the residential street just south, with no way to access and at least 10 min out of the way due to one-way streets and traffic.
 
The difference between L4 and L5 is huge.

L4 is limited to certain geolocations and certain roads or weather conditions.
L5 is L4 without any limits: no geofence, no restriction to certain roads, no weather limits.
L5 is the ability to drive anywhere a human can drive, and in any weather condition.

Which is why Elon keeps promising L5, not L4. He believes that the current AP3 hardware can do L5 by that definition, and regardless of whether it can, that is the goal he is setting. Elon is not interested in geofencing or limiting FSD to certain situations. Personally, I am skeptical. I wish Elon would say L4 instead because that would be more realistic IMO. But I can also see why he does not say L4, since he is not planning to geofence or limit the software to certain areas or certain weather conditions. So that is the story he is sticking to.
 
How are they going to solve “the last mile” problem? It’s a huge cost in telecom, wouldn’t it be here too?

In March, I was trying to get to Jersey Mike’s subs in Conroe.

Jersey Mike’s
3091 College Park Dr
Unit 150
Conroe, TX 77384


The nav took me to the residential street just south, with no way to access and at least 10 min out of the way due to one-way streets and traffic.

This happens to me with my local Trader Joe's. I know how to get there now, but when I first moved, Tesla and Google were clueless.
 
Don't these add up to TSLA not integrating basic data from GPS/maps?
Shadow braking at an overhead bridge.
Verygreen's video of a car crossing a double yellow, running in the opposite lane.
Another of a car distracted by a sudden flash of headlights from the other side of a divided highway.

Bridge location and two-lane road vs. divided highway are not part of the calculation; not integrated.
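For instance (a purely hypothetical sketch, nothing Tesla has described), even ordinary nav-grade map data could be used to sanity-check a stationary radar return before braking:

```python
# Hypothetical sanity check of the kind this post is asking for -- not
# anything Tesla has described. Before hard-braking for a stationary radar
# return, consult basic map data for a known overpass at that location.

def should_brake(radar_target, map_info):
    """radar_target: dict with 'stationary' and 'elevated' flags.
    map_info: dict with 'overpass_ahead' from ordinary nav-grade map data."""
    if radar_target["stationary"] and radar_target["elevated"] \
            and map_info.get("overpass_ahead"):
        return False   # likely a bridge span, not an obstacle in the lane
    return radar_target["stationary"]

print(should_brake({"stationary": True, "elevated": True},
                   {"overpass_ahead": True}))   # False: don't shadow-brake
print(should_brake({"stationary": True, "elevated": False},
                   {"overpass_ahead": False}))  # True: genuine stopped object
```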
 
This happens to me with my local Trader Joe's. I know how to get there now, but when I first moved, Tesla and Google were clueless.

Same thing with complicated parking garages, often underground and with confusing signage and badged entry.

This is why I think it’s disingenuous for Tesla to show renderings of cars without a steering wheel, and for Elon to talk casually about just removing them and capping the steering column within a couple of years.

AP is great - I’m writing this while on NoA - and getting better. It may soon be able to drive us from point A to point B reliably. But to suggest there won’t be edge cases that require human control for the foreseeable future is just wrong, and Elon must know it.
 
By the way, folks can make fun of Elon for promising L5 autonomy by 2020, but is there any auto maker even trying to do what Tesla is doing? I don't think so. Sure, Waymo is close to L4, but they are just doing ride-sharing in small locations. Mobileye has good FSD hardware and software, but they are not likely to get a lot of auto makers to adopt it. And the other auto makers are researching FSD but not actually adding the hardware to production cars. In Q1, Tesla delivered over 60,000 cars with the hardware for self-driving and with OTA software updates to upgrade the features seamlessly. Now, what level of autonomy will they reach? That is still an open question. But you can't deny that Tesla is the only auto maker to move so aggressively to try to actually make it happen!
 
By the way, folks can make fun of Elon for promising L5 autonomy by 2020, but is there any auto maker even trying to do what Tesla is doing? I don't think so. Sure, Waymo is close to L4, but they are just doing ride-sharing in small locations. Mobileye has good FSD hardware and software, but they are not likely to get a lot of auto makers to adopt it. And the other auto makers are researching FSD but not actually adding the hardware to production cars. In Q1, Tesla delivered over 60,000 cars with the hardware for self-driving and with OTA software updates to upgrade the features seamlessly. Now, what level of autonomy will they reach? That is still an open question. But you can't deny that Tesla is the only auto maker to move so aggressively to try to actually make it happen!

You are right. Only Tesla is selling a car that will be Level 5, no geofence, feature complete at the end of 2019.
 
All this depends on your definition of "HD".

IMO, they will need "HD" maps for very specific things such as restricted parking zones, or time-controlled lanes.

And doesn't NoA use HD enriched maps?
 
Not sure that radar is being viewed purely as a redundant system for vision. Along with ultrasound it has its uses in poor visibility (but would any of us have the confidence to trust any system in that sort of weather?)

Radar, just like Lidar, will give you accurate distance info. Tesla is using it to complement camera vision, but it goes one step further. It uses the radar data to train the vision NN so it can learn to decipher depth from images and eventually reduce or even eliminate the reliance on radar. It's always the most elegant solution if one type of sensor can take care of all scenarios. We humans do it with eyes only. However, radar will likely still be there for foul-weather situations. Lidar is useless in that case.
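A toy sketch of that "radar as teacher" idea (my own illustration of the concept, not Tesla's training code): radar ranges act as regression targets for a vision model's depth output, so that at inference time the camera alone can estimate distance.

```python
import torch
import torch.nn as nn

# Toy sketch of "radar as teacher" -- my own illustration, not Tesla's code.
# Radar ranges act as sparse regression targets for a vision model's
# per-object depth prediction; at inference time the camera works alone.

vision_depth = nn.Sequential(          # stand-in for a real vision backbone
    nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1))
optimizer = torch.optim.Adam(vision_depth.parameters(), lr=1e-3)

for step in range(100):
    image_feats = torch.randn(32, 128)          # features of detected objects
    radar_range = torch.rand(32, 1) * 80 + 5    # measured distances, metres
    pred = vision_depth(image_feats)            # camera-only depth guess
    loss = nn.functional.smooth_l1_loss(pred, radar_range)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# After training, depth comes from vision alone: vision_depth(image_feats)
```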

That is because you misunderstood them ... they do use HD maps, they just aren't relying on them as a primary guide for steering.
They have HD maps and can use them where they can be of use ...

Elon said they tried using HD maps (to guide the car) but it did not work. It's too rigid (does not allow machine decisions) and fragile (small changes in road features could cause it to break down). He said Tesla uses the location map only to decide which part of the NN data to use. I'm not sure they need an "HD" map in those cases.