Welcome to Tesla Motors Club

Autonomous Car Progress

- Mercedes: Only available on the recently announced EQS?
- Audi eTrons had their LIDAR removed in the past year
- Porsche Taycan: No LIDAR?
- VW id.3/4: No LIDAR
- Ford Mustang MME: No LIDAR
- Volvo/Polestar: No LIDAR
- Lexus UX 300e: No LIDAR?
- Honda/Acura: RL is no longer available, no autopilot-like features with LIDAR on recent US models
OEMs put lidar on their flagships, not so much their EVs.
- Valeo Scala2 is on Mercedes S Class. Don't know about EQS
- Audi had a crude Scala1 lidar on the A7/A8. I had not heard they removed it, but it definitely needs an upgrade
- Porsche isn't into self-driving, they'll follow instead of lead
- ID3/4 and MachE are too downmarket
- Volvo says they'll add Luminar lidar next year. Some say all models
- Lexus LS has 4 lidars, supposedly from Denso (might be private-labeled)
- Honda Legend has 2 Scala2 lidars. The Legend was sold as the Acura RL in the US until last year

Others:
- XPeng P5/P7 have or will have lidar
- Hyundai Genesis G90 (and others?) get Scala2s this year
- Nio ET7 has or will have lidar

This is mostly for enhanced L2. Some may do "door-to-door L2" (like FSD, the car drives but you have to watch it like a hawk). Honda Legend is L3, kinda. Mercedes promises limited L3 this fall. Audi promised it a couple years ago but had to walk it back. The Chinese are
 
you can assume that it will be ready some time between T + 6 months and T + 5 years, depending on how ambitious it was.
That's way too much credit. Elon said in 2015 that they would have FSD in two years, and now it's 2021. There is zero evidence that any timeline Elon has given around autonomy can ever be met, no matter how much time you add to the end. Elon's timelines on autonomy:

December 2015: "We’re going to end up with complete autonomy, and I think we will have complete autonomy in approximately two years."

Elon Musk Says Tesla Vehicles Will Drive Themselves in Two Years

January 2016: "In ~2 years, summon should work anywhere connected by land & not blocked by borders, eg you’re in LA and the car is in NY"

June 2016: "I really consider autonomous driving a solved problem, I think we are less than two years away from complete autonomy, safer than humans, but regulations should take at least another year," Musk said.

March 2017: "I think that [you will be able to fall asleep in a tesla] is about two years" -

March 2018: "I think probably by end of next year [end of 2019] self-driving will encompass essentially all modes of driving and be at least 100% to 200% safer than a person."

SXSW 2018

Nov 15, 2018: "Probably technically be able to [self deliver Teslas to customers doors] in about a year then its up to the regulators"

Jan 30 2019: "We need to be at 99.9999..% We need to be extremely reliable. When do we think it is safe for FSD, probably towards the end of this year then its up to the regulators when they will decide to approve that."

Tesla Q4 Earnings Call

Feb 19 2019: "We will be feature complete full self driving this year. The car will be able to find you in a parking lot, pick you up, take you all the way to your destination without an intervention this year. I’m certain of that. That is not a question mark. It will be essentially safe to fall asleep and wake up at their destination towards the end of next year"

On the Road to Full Autonomy With Elon Musk — FYI Podcast

April 12th 2019: "I think it will require detecting hands on wheel for at least six months…. I think this was all really going to be swept, I mean, the system is improving so much, so fast, that this is going to be a moot point very soon. No, in fact, I think it will become very, very quickly, maybe and towards the end this year, but I say, I’d be shocked if not next year, at the latest that having the person, having human intervene will decrease safety. DECREASE! (in response to human supervision and adding driver monitoring system)"

April 22nd 2019: "We expect to be feature complete in self driving this year, and we expect to be confident enough from our standpoint to say that we think people do not need to touch the wheel and can look out the window sometime probably around the second quarter of next year."

April 22nd 2019: "We expect to have the first operating robot taxi next year with no one in them! One million robot taxis!" "I feel very confident predicting autonomous robotaxis for Tesla next year," "Level 5 autonomy with no geofence"

May 9th 2019: "We could have gamed an LA/NY Autopilot journey last year, but when we do it this year, everyone with Tesla Full Self-Driving will be able to do it too"

April 12th 2020: How long for the first robotaxi release/deployment? 2023? "Functionality still looking good for this year. Regulatory approval is the big unknown."


April 29th 2020: "we could see robotaxis in operation with the network fleet next year, not in all markets but in some."

July 08, 2020: "I’m extremely confident that level five or essentially complete autonomy will happen, and I think, will happen very quickly, I think at Tesla, I feel like we are very close to level five autonomy. I think—I remain confident that we will have the basic functionality for level five autonomy complete this year, There are no fundamental challenges remaining. There are many small problems. And then there’s the challenge of solving all those small problems and putting the whole system together."

Dec 1, 2020: "I am extremely confident of achieving full autonomy and releasing it to the Tesla customer base next year. But I think at least some jurisdictions are going to allow full self-driving next year." Axel Springer Award

Jan 1, 2021: "Tesla Full Self-Driving will work at a safety level well above that of the average driver this year, of that I am confident. Can’t speak for regulators though."


1 Million Robotaxis will activate mid 2020 pending regulatory approval, Battery Day

Bonus Andrej: "Elon decided we can detect rain in vision…So now that’s my problem"

So basically… Ya don't say, lol. It's been "around the corner" every year since 2015, when it was two years away. Blaming regulators is such BS; the software is obviously not ready yet and will need a few more years in the oven. So what happens when someone who paid for it in 2016 goes the entire life of their car with nothing approaching "Full Self Driving", which Elon has publicly said meant L5?
 
It's well known that whenever Elon gives a timeline it's going to run late.
...
BTW it's not Elon specific. All tech companies (except Apple) are like this with their internal timelines. The only Elon difference is he announces the ridiculously optimistic internal timeline to the world.
Yes, and it's not just the misleading on timelines. The exaggeration of the capabilities of AP/FSD and the official videos of self-driving with small-print disclaimers are outright dangerous.
 
The patent applies to all traffic signs; it was filed in 2007 and issued in 2011. Some people who have read it claim it doesn't even apply.

Tesla knew about this in 2016 when they showed off the self-driving video that clearly detected signs. Heck, they advertised "will read the parking sign and decide if it can park there." They should not have been advertising this if they did not have a clear path through patents. Tesla should have a patent portfolio in autonomy plenty deep to protect against something as simple as reading a sign if they are truly a leader in this space, and Elon doesn't seem like someone who really cares about patents if they're in the way of his "mission." There are likely hundreds of patents technically being infringed by any autonomous system out there.

And now, Teslas do detect speed limit signs and stop signs, only 5 years after Tesla first showed it off. Yet the patent is still active. Lots of other companies have speed limit sign detection, and it's going to be required in the EU starting next year. A 2019 Corolla could detect stop signs and speed limits before an AP2 Tesla could. It really doesn't look like the patent is stopping anyone.

There's always some excuse with Tesla, when the most logical reason is that they just didn't have the tech working well enough for years.
Patents can be licensed, plus any OEM using Mobileye's system (which plenty are) would not need to worry about that patent, but we are not privy to the negotiations. We know, however, that Tesla had a very bad public break with Mobileye, to the point where Mobileye wasn't even willing to sell more of their products to Tesla so Tesla could do a "hybrid" system. I doubt at the time Mobileye and Tesla had much desire to reach a patent licensing deal. Of course, five years can change a lot of things; Tesla may have even licensed Mobileye's patent if they exhausted all other options, but we wouldn't know that.

My point is that I'm not seeing this as a technical issue, as even you point out Tesla was able to demonstrate it working very early on (plus the fact that many other independent projects demonstrate speed limit sign detection; it's not that hard to do and is a very common OCR task).
 
as even you point out Tesla was able to demonstrate it working very early on
LOL. Everything in that video was faked and hardcoded. It just looked like it was working. I didn't say it was working. I said they said it was working.

Intel bought Mobileye in March 2017. In 2018, The Model 3 came out using an Intel processor for the center screen, showing a partnership between Intel and Tesla. Then it took 2 more years for stop sign/speed limit detection to show up.

Yeah, they just didn't have it working. You think we couldn't find other patents Mobileye has in the ADAS space that Tesla is theoretically breaking since 2016 when they switched to AP2?
it's not that hard to do and is a very common OCR task).
Identifying a sign at a bunch of angles and light conditions, making sure it is a speed limit, making sure it applies to your lane and then being sure enough to commit that to your ADAS system is not that trivial. This is even worse for stop signs at complex intersections.
 
LOL. Everything in that video was faked and hardcoded. It just looked like it was working. I didn't say it was working. I said they said it was working.

Intel bought Mobileye in March 2017. In 2018, The Model 3 came out using an Intel processor for the center screen, showing a partnership between Intel and Tesla. Then it took 2 more years for stop sign/speed limit detection to show up.
The Intel processor is just a bog-standard Atom E3950. There's no "partnership" required to use it; it's even less of a collaboration than when Tesla used Nvidia's Drive PX2 system (where both sides made public announcements about it). I highly doubt it played much of a role, if any, in Mobileye and Tesla's (poor) relationship at the time.
Yeah, they just didn't have it working. You think we couldn't find other patents Mobileye has in the ADAS space that Tesla is theoretically breaking since 2016 when they switched to AP2?
Maybe, but the speed limit sign one was the only one I remember being pointed out (which is why I brought it up), as it's fairly specific. No others were brought up in relation to their possible effects on Tesla's AP2 (at least that I'm aware of).
Identifying a sign at a bunch of angles and light conditions, making sure it is a speed limit, making sure it applies to your lane and then being sure enough to commit that to your ADAS system is not that trivial. This is even worse for stop signs at complex intersections.
The "Speed Limit" sign is an easier task: the sign's coloring (black on white) is designed for maximum contrast, the length and design of the "Speed Limit" wording is relatively long/specific (so less likely to produce false positives), and it's typically located along the road without being next to other signs or in busy-looking locations like intersections (unlike stop signs).
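To make the contrast point concrete, here's a toy sketch (all luminance values invented purely for illustration) of why a black-on-white legend falls out of a simple global threshold, while cluttered backgrounds don't:

```python
def michelson_contrast(lum_max, lum_min):
    # Contrast of the legend against its background, in [0, 1].
    return (lum_max - lum_min) / (lum_max + lum_min)

def binarize(pixels, thresh=128):
    # A single global threshold works when contrast is near 1, which is
    # exactly what black-on-white regulatory signs are designed for.
    return [1 if p > thresh else 0 for p in pixels]

print(michelson_contrast(240, 10))        # 0.92 -- black-on-white sign
print(michelson_contrast(140, 110))       # 0.12 -- cluttered background
print(binarize([240, 10, 240, 10, 240]))  # [1, 0, 1, 0, 1]
```

Once the legend is cleanly binarized, matching the long "SPEED LIMIT" string is a routine OCR step; a busy intersection never gives you that clean separation in the first place.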
 
Maybe, but the speed limit sign one was the only one I remember being pointed out
But it was brought up by community members as an excuse for why Tesla was not doing this, not by Tesla. There's no data behind it, just someone's theory about why Tesla wasn't doing a basic feature yet. If someone can find one other Mobileye patent that was violated in 2016/2017/2018/2019, this can be easily disproven. Someone even read the patent and said that it might not apply to what Tesla is doing.

Here's a patent which covers optical detection of rain on a windshield with a camera. Why didn't that stop Tesla in 2017?

Here's a Mobileye patent on light detection used for auto high beams:

Mobileye has 1,587 patents. Hard to believe the speed limit sign recognition patent was the first one Tesla ran into among the features they released in 2016-2020.
 
Here's a Mobileye patent on light detection used for auto high beams:

Mobileye has 1,587 patents. Hard to believe the speed limit sign recognition patent was the first one Tesla ran into among the features they released in 2016-2020.
And reading that, it performs the same function as my Dad's 1957 Cadillac: auto-dimming high beams. Analog, obviously.

Mobileye's patent looks weak. Nonetheless, Tesla can innovate their methodology independent of Mobileye's process and be perfectly fine.
 
Mobileye's patent looks weak. Nonetheless, Tesla can innovate their methodology independent of Mobileye's process and be perfectly fine.
Of course. It just took them until 2020; it's not that they had it in 2016 and just had to wait for Mobileye to license it to them, like some people believe happened. Yet somehow none of the other AP/ADAS features they released in 2016-2020 bumped into any of the 1,500+ Mobileye patents.
 
I'm calling you Catskills going forward. You can see why. 😚
You can keep using your worn out joke stolen from Woody Allen, but to do so ignores the fact that I keep telling you my issue is how Tesla is deceiving people on their progress in autonomy or what they are getting for their money, which is evidenced by how poorly it works and how long it is taking them to add even small features. You also miss the other side of the joke which is that those are the only things a restaurant needs to do to be "good" and it can't do either.

But go right ahead and give me a nickname, because yep, that's Tesla to a T, so it's not the insult you think it is. They are failing on ALL fronts around autonomy, completely. They are slow to deliver even individual ingredients, and when they do, they taste horrible. It's nothing like they advertised, which was large, delicious portions, delivered rapidly. I find it interesting that you believe this to be a good way to dismiss someone you disagree with and think it exposes them being silly, but I'm glad we can agree that's it's an accurate description of how Tesla is delivering autonomy vs the way they advertised it.
 
You can keep using your worn out joke stolen from Woody Allen, but to do so ignores the fact that I keep telling you my issue is how Tesla is deceiving people on their progress in autonomy or what they are getting for their money, which is evidenced by how poorly it works and how long it is taking them to add even small features. You also miss the other side of the joke which is that those are the only things a restaurant needs to do to be "good" and it can't do either.

But go right ahead and give me a nickname, because yep, that's Tesla to a T, so it's not the insult you think it is. They are failing on ALL fronts around autonomy, completely. They are slow to deliver even individual ingredients, and when they do, they taste horrible. It's nothing like they advertised, which was large, delicious portions, delivered rapidly. I find it interesting that you believe this to be a good way to dismiss someone you disagree with and think it exposes them being silly, but I'm glad we can agree that's it's an accurate description of how Tesla is delivering autonomy vs the way they advertised it.
And yet the portions are too small... for you.

Wow. Annie Hall fan. Perhaps David Downer or Doug Whiner are more apropos? 🤔
 
Of course. It just took them until 2020; it's not that they had it in 2016 and just had to wait for Mobileye to license it to them, like some people believe happened. Yet somehow none of the other AP/ADAS features they released in 2016-2020 bumped into any of the 1,500+ Mobileye patents.
Sounds like a false dilemma. Who's to say there weren't other patents that tripped up Tesla where they just found a workaround quicker? Not every patent and every problem is exactly the same, or takes exactly the same amount of time to find a workaround for.
 
The "Speed Limit" sign is an easier task: the sign's coloring (black on white) is designed for maximum contrast, the length and design of the "Speed Limit" wording is relatively long/specific (so less likely to produce false positives), and it's typically located along the road without being next to other signs or in busy-looking locations like intersections (unlike stop signs).
There's a specific exit in downtown Seattle where I've always been curious how a purely vision-based autonomous car would handle it. It's a left-exit from the 5 freeway northbound (Seneca St I believe), where the offramp parallels the freeway, and there's a "Speed Limit 55" sign positioned exactly between offramp and freeway, facing so it's ambiguous whether it applies to freeway or offramp. Having taken this offramp, I can tell you that 55 is a completely unsafe speed for that point on the road (since a sharp turn through a stoplighted intersection shortly follows), so the sign is mightily confusing and arguably dangerous. Would be an interesting test case, with an alert backup driver!
 
EVs are relatively expensive compared to their gasser sister models. The Mercedes EQS is their S-Class EV.

 
Camera - struggles in low-light conditions and direct sunlight
Lidar - excels in low-light conditions and direct sunlight

Different failure modes.



The camera Tesla uses has a 115 dB dynamic range, while the human eye has a dynamic contrast ratio of about 1,000,000:1, or 120 dB.
But the most important thing is resolution. Tesla's camera has 1.2 megapixels and can't even read a speed limit sign 100 meters away in broad daylight.
The human eye, on the other hand, has 576 megapixels.
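For anyone checking those figures, the dB conversion is just 20·log10 of the contrast ratio (a quick sketch; the 115 dB and 1,000,000:1 numbers are the ones quoted above):

```python
import math

def ratio_to_db(ratio):
    # Dynamic range in decibels, using the 20*log10 convention
    # implied by the figures quoted above.
    return 20 * math.log10(ratio)

print(ratio_to_db(1_000_000))    # 120.0 -- the eye's ~1,000,000:1
print(round(10 ** (115 / 20)))   # 562341 -- a 115 dB camera as a ratio
```

So the gap between 115 dB and 120 dB is roughly a factor of two in contrast ratio; the far bigger gap is in resolution.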



The whole point is that the probability of them failing at the same time is extremely low. I have had my lights burn out on me a couple of times while driving over the years. You are claiming that my lidar would immediately go out at the exact same time.

It's funny that people like you claim there's no redundancy or backup here. You say things like: if the camera says there's no ped and the lidar/radar says there is, which one should you believe?

This is like saying: why have two guards at the outpost, since if one guard sees someone move in the bushes for a moment and the other guard doesn't, which one should you believe? It's absurd, because you investigate whatever either guard sees.

But let's keep going. People like you claim that the FSD computer is redundant because when one chip fails, the other can keep going, or when one gives wrong data, the correct one will be used.

But wait, if one chip says there's a ped and another says there isn't, which one should we believe? Which one is the correct one? Why have two? That's stupid, unneeded complexity.

It's not yet illegal to think. Stop letting Elon think for you.
576 MP is with the eyes moving; at a single glance it's around 10-20 MP.
 
But the most important thing is resolution. Tesla's camera has 1.2 megapixels and can't even read a speed limit sign 100 meters away in broad daylight.
The human eye, on the other hand, has 576 megapixels.
It's less about total resolution and more about angular resolution. I predict future HW revs might have gimbaled telephoto cameras (at least two, for redundancy) to be able to rapidly "investigate" areas of interest picked up by the wide-angle cameras, such as faraway speed limit signs or small obstacles in the roadway. But it may be quite a while before this becomes feasible in practice, because I don't think current NN architectures have the flexibility to deal with attention models and variable inputs like this.
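A back-of-envelope on why pixels-on-target matters more than total megapixels: with an assumed 0.75 m wide sign at 100 m, seen through a camera with 1280 horizontal pixels (~1.2 MP) and a 50° horizontal field of view (all illustrative numbers, not Tesla's actual specs), the whole sign spans only about a dozen pixels:

```python
import math

# Assumed numbers for illustration only: a 0.75 m wide sign seen from
# 100 m, 1280 horizontal pixels, 50 degree horizontal field of view.
sign_width_m, distance_m = 0.75, 100.0
h_pixels, h_fov_deg = 1280, 50.0

# Angular size of the sign, then its share of the sensor's pixels.
angle_deg = math.degrees(2 * math.atan(sign_width_m / (2 * distance_m)))
pixels_on_sign = angle_deg / h_fov_deg * h_pixels
print(round(pixels_on_sign))  # 11 -- far too few to read the digits
```

A narrow-FOV telephoto pointed at the same sign would put many times more pixels on it, which is the motivation for the gimbaled-camera speculation above.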
It's funny that people like you claim there's no redundancy or backup here. You say things like: if the camera says there's no ped and the lidar/radar says there is, which one should you believe?
There will be a meta-layer in the NN that modulates which inputs to prioritize based on conditions. In clear visibility conditions, prioritize the cameras more (or exclusively). In heavy fog, prioritize the radars more. This is incidentally why I think it's a mistake to remove the radars altogether.
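As a toy illustration of that kind of meta-layer (the weighting function here is completely made up for the sketch, not any real fusion architecture):

```python
def fuse(camera_conf, radar_conf, visibility):
    # visibility in [0, 1]: 1.0 = clear air, 0.0 = dense fog.
    # The weights are invented purely for illustration.
    w_cam = visibility                 # cameras degrade with visibility
    w_radar = 1.0 - 0.5 * visibility   # radar degrades far less
    total = w_cam + w_radar
    return (w_cam * camera_conf + w_radar * radar_conf) / total

# Clear day: the camera's verdict dominates the fused confidence.
print(round(fuse(camera_conf=0.9, radar_conf=0.2, visibility=1.0), 2))  # 0.67
# Heavy fog: the radar's verdict dominates.
print(round(fuse(camera_conf=0.1, radar_conf=0.8, visibility=0.1), 2))  # 0.73
```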
But wait, if one chip says there's a ped and another says there isn't, which one should we believe? Which one is the correct one? Why have two? That's stupid, unneeded complexity.
In this case, believe neither. Fall back to safe mode and pull over ASAP, since congratulations, you just detected a hardware error on one of your chips. (Or a cosmic ray bit-flip.) The point of the redundancy is to reliably detect hardware failure; that's all. But it's necessary for achieving 99.999999% reliability.
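A minimal sketch of that duplex/lockstep idea (hypothetical outputs; real lockstep comparison happens in hardware at a much lower level):

```python
def lockstep_check(output_a, output_b):
    # Both chips compute the same thing from the same inputs; any
    # disagreement means a hardware fault (or a bit-flip) somewhere,
    # though it can't tell us WHICH chip is wrong.
    if output_a != output_b:
        return "FAULT: enter safe mode, pull over"
    return output_a

print(lockstep_check("ped_detected", "ped_detected"))  # ped_detected
print(lockstep_check("ped_detected", "no_ped"))        # FAULT: enter safe mode, pull over
```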
 
I predict future HW revs might have gimbaled telephoto cameras (at least two, for redundancy) to be able to rapidly "investigate" areas of interest picked up by the wide-angle cameras, such as faraway speed limit signs or small obstacles in the roadway.
I know this thread is about autonomy in general, but it's this kind of stuff which is why many of us are skeptical that Tesla's current autonomy platform will be able to achieve L4+ FSD like it is advertised to. When even people with a very positive view are saying "future hardware revs," it means we've all kind of acknowledged that what is there today isn't going to work, and Elon/Tesla's constant "we're almost there" stories are problematic.

Fall back to safe mode and pull over ASAP, since congratulations, you just detected a hardware error on one of your chips. (Or a cosmic ray bit-flip.) The point of the redundancy is to reliably detect hardware failure; that's all. But it's necessary for achieving 99.999999% reliability.
Detecting a hardware failure and then pulling over only works if that hardware failure doesn't impact your ability to pull over, which is a pretty complex task in many situations, so it's hard to see how it remains possible during the very hardware failure that is forcing you to pull over. Pulling over on the side of a highway or road is not a fully safe option for the occupants either, so you have already suffered some reliability hit. Duplex redundancy is really only interesting when you can use an external voter, like a human, to take over or tell you how you are wrong, and that is an L2 task. Or in situations where the failures you are protecting against are "hard" failures, like the computer completely failing to produce output, rather than trying to decide between two inconsistent outputs.

This is why airplanes use triplex redundancy when needed, so they can self-vote. Continuing to fly until you can land is like "pulling over" in a car: it can be many minutes away, and it's not OK if the time between detection and the vehicle coming to a stop carries much higher or unknown risk.
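A sketch of the triplex idea: with a 2-of-3 majority vote, a single bad channel is outvoted and masked instead of merely detected, which is what duplex gives you:

```python
from collections import Counter

def vote(a, b, c):
    # With three channels, a single faulty unit is simply outvoted
    # 2-to-1 and the system keeps operating.
    winner, count = Counter([a, b, c]).most_common(1)[0]
    return winner if count >= 2 else None  # None: no majority at all

print(vote("brake", "brake", "accelerate"))  # brake
```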

Also, all the redundancy in the world doesn't get you out of common-mode failures: your whole 12V system failing, a tire blowing out, a rock chip right in the camera FOV, brake pads that aren't maintained and fail. These all count in the 1:100M reliability, and they're all failures humans deal with today. When you look at a system, you often realize that redundancy is the last thing you need to worry about, because even perfect compute hardware doesn't get you the system reliability you need.

In the end, HW reliability is going to be the least of autonomy's worries. 1:100M miles is about 1:3M hours (assuming 35 MPH). 3 million hours is nothing in aerospace. We already know humans can drive a car at 1:100M without crazy redundancies in the car, and humans only live about 700k hours. We're going to be limited by sensing and processing reliability for a very long time before raw hardware reliability is a driving factor.
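The arithmetic in that paragraph, spelled out (35 MPH and an 80-year lifespan are the rough assumptions):

```python
# Back-of-envelope for the reliability numbers above.
miles_between_failures = 100e6   # one failure per 100 million miles
avg_speed_mph = 35

hours_between_failures = miles_between_failures / avg_speed_mph
print(round(hours_between_failures / 1e6, 2))  # 2.86 (million hours)

# A human lifespan, for comparison (assuming 80 years):
lifespan_hours = 80 * 365.25 * 24
print(round(lifespan_hours / 1e3))  # 701 (thousand hours)
```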
 
As of May 2nd, Baidu has deployed driverless robotaxis for the public in a small geofenced area in Beijing, China
You need to use the full quote...
It continues...
At the launch, a Baidu representative said "at least it's not in the desert" and walked away without answering why the need to limit the service to such a small area!

/sarc