Welcome to Tesla Motors Club

Autonomous Car Progress

It sounds like you agree that the new language is not an accurate description of what we paid for: the promised FSD.

Autosteer automatically keeps the car centered within a lane. It is not Auto Lane Change: with basic Autopilot, you still have to pay for FSD on top of Autosteer to get the Auto Lane Change feature. Autosteer is also not automatic braking. TACC performs that function for you if there's a car in front; without a car in front, you'll need to pay for FSD in addition to TACC to get it, for example stopping at a stop sign.

Thus, the new language "autosteer on city streets" sounds more restrictive than the old language "Automatic driving on city streets."

The old language "Automatic driving on city streets" should cover everything, from basic autopilot, to autosteer, to TACC, to auto-stopping without a car in front, to auto-turning at intersections...

The new language sounds like scaling back from the previously overgenerous promise.

Yes, IMO the new language feels like a dumbed-down version of the original FSD.

The original FSD promised that we could just get in, input an address, and the car would drive us all the way to that destination with no intervention needed on our part. I suspect Tesla is starting to realize that was overly ambitious. Real self-driving, where the car really is in charge and you are just a passenger, requires very advanced perception, planning, and driving policy, and has to handle hundreds of thousands of sometimes unpredictable driving situations that people get into. So Tesla is probably realizing that they need to aim for something a bit less ambitious first. The new language is more a collection of features that will do certain tasks but require driver supervision and intervention. Basically, Tesla is focusing on automating as much of driving as possible with the driver supervising and ready to intervene.

I suspect that "autosteer on city streets" will be similar to the current NOA but for city streets, and I think it will be L2, at least for the near future. So it will try to follow a route on city streets, keep the lane, stop at traffic lights, make turns at intersections, and handle roundabouts. Like NOA, it will probably be pretty useful for automating a boring commute, but we will need to supervise. I don't think "autosteer on city streets" will be able to handle every case on city streets. There will be plenty of cases, like a weird construction zone or double-parked cars blocking the lane, where we will need to take over.

I will add that if I am right, it could still be a win for Tesla, even if it is not the originally promised FSD. I know a lot of Tesla owners, including myself, find long highway drives much more relaxing with AP/NOA. If Tesla can deliver something similar for "city driving" where AP can do a lot of the steering and braking, and we just need to watch and take over in some instances, I think a lot of owners would find that to be a positive.
 
The entire conversation over the last few pages of this thread has been really about interpretation.

I agree that the language has been scaled back, but I'd argue that the video they show when you click "learn more about autopilot" clearly demonstrates a system beyond what the language describes. The extent of how far beyond depends on whether you interpret it as L3 or L4.

To me there is a large gap between the deliverables as worded now, and the video.

There is no reason to believe the video shouldn't set expectations, because it's linked directly from the Model S, X, and 3 sections on the website.

So my expectation is that Tesla will claim FSD feature complete from an accounting perspective without actually fulfilling it. Instead there'll be a constant stream of updates after autosteer on city streets that handle additional tasks, things like four-way stops.
 
As I see it, the FSD features on the order page are the "deliverables". They are a list of features that Tesla is working to deliver to customers in the near future. The FSD videos are more aspirational. They represent what Tesla hopes FSD will become.
 
I don't think it's possible to apply the user monitoring tactics in the current iteration of NoA to "city NoA." I think that's why a public release of intersection turns will be an immense challenge. When they release turns to the public, it will have to be at a reliability much better than the current traffic control feature.

Releasing turns to the public is the end-game for FSD, imo. To do so reliably means almost all the perceptual challenges have been solved (imo).
 
True, to the point that it's really hard to gauge where things will be at that point.

Of course this is still Tesla.

They'll find a way to phantom brake during the turn. :p
 
Do you think they'll add more deliverables/milestones after the autosteer is released?

Or do you believe it's pretty much the end of deliverables, for accounting reasons? Part of me believes the deliverables were really about recognizing revenue for FSD without ever actually achieving FSD. Hence the watering down of them.

The video is the only part that contradicts that because it's not just aspirational. It sets the expectations, and is still a promise for a deliverable.
 
If Tesla can come up with more "deliverables" that can bring in more revenue, sure, I think they could add more FSD features after "autosteer on city streets" is released. We also know that Tesla is not shy about changing things: removing EAP, moving features from EAP to FSD, adding EAP back again, and so on. So I would not be surprised if Tesla suddenly announced new FSD features after "autosteer on city streets".

Also, the current features do not achieve the promise of the FSD video IMO. So that's another possible reason why Tesla might add more features after "autosteer on city streets" in order to bridge the gap.

I guess the argument against more features being added is that "autosteer on city streets" is pretty broad. So any new "city driving" features could be wrapped up into "autosteer on city streets".
 
I know a lot of Tesla owners, including myself, find long highway drives much more relaxing with AP/NOA. If Tesla can deliver something similar for "city driving" where AP can do a lot of the steering and braking, and we just need to watch and take over in some instances, I think a lot of owners would find that to be a positive.

But the thing is, the current state of AP/NOA is not relaxing for 95% or more of the users. There are too many broken things that we know would be easy to fix if Tesla weren't so FSD-focused.

Judging from lots of threads, there is a lot of money sitting on the table waiting for improvements: people would buy AP if AP improved, and EAP if EAP improved (apparently it's back on the table).

The feature I want is the equivalent of "beep on green," but for 4+ way stops. Sure, I'd still be paying attention, but it would be a confirmation of what I saw. At least it would be something, because I hate those intersections (the ones with lots of cars).
 
I work with CV in my software projects (OCR in particular). I find it fascinating that CV continues to progress every year. As Elon says, CV is becoming as good as or better than humans at recognition.

One of the reasons Elon says that software development in FSD is exponential is that once you solve a particular neural network problem to a certain level of accuracy, you can essentially apply that same neural network to a wide variety of similar problems. For example, once DeepMind developed AlphaZero, they applied that same neural network architecture to chess, shogi, and Go, essentially any perfect-information board game.

It seems kind of silly when I get excited about one particular AP feature (like traffic control), but when you understand what neural network approaches are capable of, you can see the true potential of what Tesla is doing.
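The "same architecture, different problem" idea can be sketched with a toy example (illustrative only; this tiny MLP and its hyperparameters are my own, not Tesla's or DeepMind's actual code): the identical network definition, completely unchanged, learns two different Boolean functions just by swapping the training labels.

```python
import numpy as np

def train_mlp(X, y, hidden=8, lr=1.0, epochs=5000, seed=0):
    """Train a tiny 2-layer sigmoid MLP with full-batch gradient descent."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)        # hidden activations
        out = sig(h @ W2 + b2)      # predicted probabilities
        d_out = out - y             # gradient of binary cross-entropy loss
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)
    # Return a predictor that thresholds the network output at 0.5.
    return lambda Xq: (sig(sig(Xq @ W1 + b1) @ W2 + b2) > 0.5).astype(int)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
xor_y = np.array([[0], [1], [1], [0]])
and_y = np.array([[0], [0], [0], [1]])

# Identical architecture and hyperparameters; only the labels change.
predict_xor = train_mlp(X, xor_y)
predict_and = train_mlp(X, and_y)
```

The point of the sketch is that nothing in `train_mlp` knows which problem it is solving; solving a new "similar" problem is a data change, not an architecture change, which is the leverage being described above.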
 
I would love to see them add milestones because there is that huge gap.

I'm not expecting it because there isn't a milestone list for anything other than FSD. Reverse Summon, for example, is completely missing, but we know it's part of their development efforts.

If they did add additional milestones it would give people the impression that it wasn't complete, and would upset the revenue recognition for FSD.

Instead we're likely to see tweets from Elon teasing us about some update.
 
I would not buy EAP today.

I got EAP because when I bought my car the choices were FSD, EAP, or dumb cruise control and no autonomy features at all. Now that basic AP is standard, if my car were totaled and I were replacing it, I'd get plain AP. The only feature of EAP I use is changing lanes by tapping the turn-signal stalk. I use AP all the time, but none of the other features of EAP, and that one lane-change feature is not important to me.

I think that Autosteer is the greatest thing since sliced bread. I absolutely love it. But none of the other features available in EAP or FSD matter to me. I'll pay whatever I have to for Level 3 or above if and when that becomes available. But right now there are no features beyond basic AP that I want until they reach Level 3.
 
It wouldn't upset revenue recognition for FSD versions that were already sold, though. Once they deliver "autosteer on city streets" they can recognize the revenue for people who bought that, but not for people who bought "automatic driving on city streets."

It probably makes sense to add more items to the milestone list once "autosteer on city streets" is delivered. People might look at the reality of "autosteer on city streets" and say that's not worth $10k (or whatever it costs by then)! But if there's something new yet to be delivered, they might be convinced it will be worth it, especially with the knowledge that the price might go up. I would be interested to know the take rate for FSD.
 
Waymo posted an informative blog on how they use HD maps.

A couple of interesting points:

Waymo cars are able to respond in real-time to changes in the map:

"Of course, our streets are also evolving, so if a vehicle comes across a road closure or a construction zone that is not reflected in a map, our robust perception system can recognize that and make adjustments in real-time."

Maps are efficient and scalable. Updating maps is almost 100% automated and changes are automatically shared with the entire fleet.

"We’ve automated most of that process to ensure it’s efficient and scalable. Every time our cars detect changes on the road, they automatically upload the data, which gets shared with the rest of the fleet after, in some cases, being additionally checked by our mapping team."

Read more in the blog here:

Waypoint - The official Waymo blog: The Waymo Driver Handbook: How our highly-detailed maps help unlock new locations for autonomous driving

I know we've debated HD maps and Waymo a lot. So hopefully, this blog answers some questions. :)
 
Updating maps is almost 100% automated and changes are automatically shared with the entire fleet.

I don't read it that way. Uploading the data is automated. Detecting possible mismatches is automated.

However, I don't think it's automatically shared:

"they automatically upload the data, which gets shared with the rest of the fleet after, in some cases, being additionally checked by our mapping team."

Their biggest challenge is similar to Tesla's, but with less data: you have to program the mismatch detection, and in programming the mismatches you introduce bias and are limited to a "list" of anticipated mismatch scenarios. It's similar to the simulation problem.
 
The word "additionally" seems to imply that a manual check sometimes happens before the data is shared with the fleet. So some map changes are shared automatically, and some are double-checked first before being shared. Hence why they say the process is not completely automated.

I could be wrong but that's how I read it. Maybe I should have clarified my post better.
 
Mobileye is bringing their SuperVision ADAS powered by EyeQ5 to Chinese EVs:

“Our collaboration with Geely is a game changer for the global automotive industry as it brings our industry-leading surround-vision technology to market in one of the most advanced driver-assistance systems,” said Amnon Shashua, senior vice president at Intel and president and chief executive officer of Mobileye, an Intel company. “We are thrilled to help Geely offer Lynk & Co drivers an exciting and advanced package of high-level driver aids and safety features, including point-to-point highway pilot and traffic-jam assist, all powered by Mobileye’s SuperVision surround-view driver-assistance system and kept current with OTA updates.”

The collaboration between Geely and Mobileye comes amid a growing demand for electric vehicles in China and beyond, as well as increased interest in safer, cleaner transportation solutions. The future production-ready Zero Concept EV featuring Mobileye SuperVision ADAS technology will present a new, groundbreaking option for consumers as China’s EV market rapidly expands.

Lynk & Co CoPilot, powered by Mobileye’s SuperVision system, is a first-of-its-kind ADAS-to-AV scalable system, supported by the unprecedented use of surround-view cameras and other driving policy and navigation technologies powered by two EyeQ5 SoCs, Mobileye’s most advanced SoC. The solution brings cutting-edge safety technology to assist human drivers in a multitude of different driving scenarios.

In addition to enabling high-level driver assistance in the Zero Concept EV over several years, Geely and Mobileye announced a high-volume ADAS agreement to equip a variety of Geely Auto Group makes and models with Mobileye vision-sensing technology. The long-term agreement will see multiple Geely Auto Group brands and vehicles outfitted with Mobileye-powered ADAS features such as automatic emergency braking and lane-keeping assist.

Mobileye, Geely to Offer Most Robust Driver-Assistance Features | Intel Newsroom
 
More...

  • Two EyeQ5 SoCs
  • 11 Cameras
  • Computer Vision by Mobileye
  • Driving Policy by Mobileye (First)
  • HD Mapping (REM) by Mobileye
  • OTA by Mobileye (First)
  • Point-to-point (navigation based) highway pilot
  • Highway hands-free
  • Arterial
  • Urban hands-free driving
  • Late 2021
"Geely Auto Group will use Mobileye’s full-stack, 360-degree camera solution in its brand-new, premium-model, L2+ electric vehicle (EV) from Lynk & Co – the Zero Concept – reaching consumers in late 2021. This system, which today we are launching as Mobileye SuperVision™, is a direct derivative of our autonomous driving program and utilizes the camera-only portion of our truly redundant sensing suite that we are developing for Level 4 autonomous vehicles (AVs)."

"This win marks the first time Mobileye will be responsible for the full solution stack, including hardware and software, driving policy and control. Due to the complexity of the project, we will also supply a multidomain controller that will be validated for automotive and serve as a subsystem for very advanced ADAS solutions worldwide."

"It also marks the first time that an OEM has publicly noted Mobileye’s plan to provide over-the-air updates to the system after deployment. While this capability has always been in our repertoire, Geely and Mobileye want to assure customers that we can easily scale their driving-assistance features and keep everything up to date across the car’s lifetime.

Our SuperVision camera-only solution is based on two Mobileye EyeQ5® system-on-chips – complete with seven long-range and four close-range cameras – and delivers a 360-degree surround view to enable a scalable feature bundle supporting highway hands-free, navigation-based highway-to-highway, arterial, and up to urban hands-free driving. We’re also providing our Responsibility-Sensitive Safety (RSS) based driving policy, which helps the vehicle operate safely where lane markings may not be visible and other road users might pose a hazard. The Mobileye SuperVision system supports the capabilities we have shown in our drone-view videos of our AVs driving in Jerusalem."

Why the Geely Auto Group Win is a Game Changer | Intel Newsroom
 
Looks great. Thanks for sharing.

I like that Mobileye is delivering the whole ADAS solution.

We knew it was only a matter of time. Looks like Mobileye is ready to start deploying their camera vision part, along with their driving policy to deliver robust ADAS.

The clock is ticking for Tesla.
 