Elon: "Feature complete for full self driving this year"

BTW, during Autonomy Day 2019 Musk said they were going to take investors on FSD drives throughout the day. Later, in an internal email, he wrote that they got very positive feedback. But has anybody seen a single first-hand account from somebody who took a ride?
I’ve tried to search, and all I found was Tesla’s official video. It’s hard to imagine nobody has shared at least something anonymously on Reddit etc. if the rides actually took place.

Yes, there were a bunch of reviews. It was all discussed at length in the autonomy investor day thread at the time of the event.

Here is a review by Morgan Stanley.

[Attached image: Morgan Stanley's note describing the FSD demo ride]


It is interesting to go back and revisit that thread. There was so much optimism by Tesla fans, including me, at the time that Tesla was really going to deliver FSD very soon. According to the Morgan Stanley quote above, Tesla showed them a decent 20-minute autonomous ride on the highway and some off-highway streets. Yet, 10 months after Autonomy Investor Day and that demo, Tesla still has not delivered any part of "City NOA", not even any meaningful "traffic light response". It is especially striking when compared to Cruise, which has demonstrated real autonomous driving.

So what happened? Now, I don't dismiss the demo ride. I think it really happened just as Morgan Stanley describes, although I do think Tesla picked a super easy route that would put their FSD in the best possible light. But I think it is clear now that Elon is grossly underestimating the work required to make autonomous driving happen and/or overestimating the ability of AI and machine learning to solve the problems. He sees what Tesla has and thinks Tesla can finish the features in 10 months when it actually requires a lot more time. He thinks lidar is doomed because he genuinely seems to think that machine learning will solve autonomous driving in a matter of months. Maybe Tesla's camera-vision approach will eventually work in the long term, but it is obvious now that it will take a lot longer than Elon thinks. Based on what we see from Waymo, Cruise and others, in the short term lidar is clearly still very helpful, even essential, for safe, reliable autonomous driving.
 
Given the simple fact that cameras get blocked and don't self-clean or self-clear, and that there is no serious redundancy in key vision systems, I'll put my hat down on the 'won't ever do FSD with the existing sensor array' table.

There is nothing that can sense side streets as you are coming to an intersection. The cameras, today, can't even capture all the traffic lights when I'm stopped at them! If the angle is too extreme for the camera, the traffic light is rendered but with no light color, i.e., incomplete vision.

Forget lidar for now; the cameras alone need more redundancy and self-cleaning ability.
 
So what happened? Now, I don't dismiss the demo ride. I think it really happened just as Morgan Stanley describes, although I do think Tesla picked a super easy route that would put their FSD in the best possible light. But I think it is clear now that Elon is grossly underestimating the work required to make autonomous driving happen and/or overestimating the ability of AI and machine learning to solve the problems. He sees what Tesla has and thinks Tesla can finish the features in 10 months when it actually requires a lot more time.

Looking at all the criticism Tesla receives for Autopilot in its current form, I think the caution with which new features are being released is deliberate. The state in which feature-complete FSD launches will probably determine whether regulators let these vehicles operate without an attentive driver in this decade or the next.
 
Yes, there were a bunch of reviews. It was all discussed at length in the autonomy investor day thread at the time of the event.

Here is a review by Morgan Stanley.

It is interesting to go back and revisit that thread. There was so much optimism by Tesla fans, including me, at the time that Tesla was really going to deliver FSD very soon. According to the Morgan Stanley quote above, Tesla showed them a decent 20-minute autonomous ride on the highway and some off-highway streets. Yet, 10 months after Autonomy Investor Day and that demo, Tesla still has not delivered any part of "City NOA", not even any meaningful "traffic light response". It is especially striking when compared to Cruise, which has demonstrated real autonomous driving.

So what happened? Now, I don't dismiss the demo ride. I think it really happened just as Morgan Stanley describes, although I do think Tesla picked a super easy route that would put their FSD in the best possible light. But I think it is clear now that Elon is grossly underestimating the work required to make autonomous driving happen and/or overestimating the ability of AI and machine learning to solve the problems. He sees what Tesla has and thinks Tesla can finish the features in 10 months when it actually requires a lot more time. He thinks lidar is doomed because he genuinely seems to think that machine learning will solve autonomous driving in a matter of months. Maybe Tesla's camera-vision approach will eventually work in the long term, but it is obvious now that it will take a lot longer than Elon thinks. Based on what we see from Waymo, Cruise and others, in the short term lidar is clearly still very helpful, even essential, for safe, reliable autonomous driving.

You now understand why so many of us who have owned every iteration of AP hardware and pre-paid for FSD 3 times now are more than a little skeptical of ElonSpeak, prognostications, "promises", etc., etc. You are now a full-fledged member of the Eyes Opened of Elon Talk Society. LOL LOL. (just kidding... kind of)

As a side note, I would NOT have pre-paid for FSD on my 2019 Raven P+L if Tesla had not neutered basic EAP and made FSD "almost" required for what used to be EAP. I firmly believe this was Tesla’s only way of getting multi-repeat buyers to pay the FSD fee again.
 
Looking at all the criticism Tesla receives for Autopilot in its current form, I think the caution with which new features are being released is deliberate. The state in which feature-complete FSD launches will probably determine whether regulators let these vehicles operate without an attentive driver in this decade or the next.

Yes, I have no doubt that Tesla is being cautious and deliberate about releasing new features, although it could probably be argued that Tesla is not being cautious enough, given the AP concerns and problems. But I think your point just reinforces my point about Elon's "optimism". He grossly underestimates the time it takes not just to build a feature but to make it safe enough for release. And yes, Tesla is probably being cautious about releasing "city NOA", which is why it has been delayed, because the Tesla engineers understand the safety concerns. But Elon clearly looked at an incomplete, incredibly complex system like "city NOA" and thought "yeah, we can finish and validate this in 9 months, no problem" when it would clearly take a lot longer than that.
 
You now understand why so many of us who have owned every iteration of AP hardware and pre-paid for FSD 3 times now are more than a little skeptical of ElonSpeak, prognostications, "promises", etc., etc. You are now a full-fledged member of the Eyes Opened of Elon Talk Society. LOL LOL. (just kidding... kind of).

Oh yes, I am a member now. LOL. What helped me, though, was more than just seeing missed timelines. What really opened my eyes was seeing what the competition has and educating myself a bit more about autonomous driving. When I started watching videos from Waymo, Cruise, Aurora, Aptiv, Mobileye, and others, I was struck by how far ahead of Tesla they are. It became impossible to deny that Elon's prognostications were way too optimistic. Back when Autonomy Investor Day happened, I was very uninformed, so when Elon said lidar was doomed and machine learning would do autonomous driving by the end of 2019, I took him at his word. Now that I have a better understanding of how autonomous driving works and how far ahead the competition is, I know better.

As a side note, I would NOT have pre-paid for FSD on my 2019 Raven P+L if Tesla had not neutered basic EAP and made FSD "almost" required for what used to be EAP.

Yeah, it kinda sucks that Tesla is making FSD a must-have if you want any kind of real AP.

For me, I am still glad I got FSD when it was on sale. Since I am only a year into owning my Model 3, I will probably get some useful FSD features out of it before it is time to trade in my car.
 
When I started watching videos from Waymo, Cruise, Aurora, Aptiv, Mobileye, and others, I was struck by how far ahead of Tesla they are.

The biggest difference between Waymo/Cruise and Tesla is that when they ship a new feature, the worst thing that can happen is a single accident or two in their small test fleets. When Tesla ships city NOA, it goes out to hundreds of thousands of customers around the world.

Call me naive, but I think Tesla could replicate some of these Waymo/Cruise demonstration videos reasonably well today; but they'd rather finish the features and let their customers do the advertising for them.
 
Given the simple fact that cameras get blocked and don't self-clean or self-clear, and that there is no serious redundancy in key vision systems, I'll put my hat down on the 'won't ever do FSD with the existing sensor array' table.

The sooner Tesla accepts that more sensors are needed, the better. Their problem is that they've committed to FSD on cars that cannot be easily retrofitted with more sensors. Just one of the pitfalls of selling a technology before it exists, and therefore before you know its hardware requirements.

Looking at all the criticism Tesla receives for Autopilot in its current form, I think the caution with which new features are being released is deliberate. The state in which feature-complete FSD launches will probably determine whether regulators let these vehicles operate without an attentive driver in this decade or the next.

I don't think the state at which FSD is released will affect regulators a decade later. I think regulators will look at the current state of the technology each time there's a new development or a new feature.
 
It is especially striking when compared to Cruise that has demonstrated real autonomous driving.

Based on what we see from Waymo, Cruise and others, clearly, in the short term, lidar is still very helpful, even essential for safe reliable autonomous driving.

Cruise’s demo is just as limited as Tesla’s, but in different ways. Take it to a different city or country.

Elon said LIDAR is fine for what it does. But LIDAR won’t get you to the finish line. I’m not saying the cameras (especially as they exist now with today’s firmware) are perfect, or that they will become the final solution. But they are following the path he seems to expect even if not the timeline.

My car, TODAY, is fine in -at least- 99% of reasonable highway situations. That might be 99.9% in a year, then 99.99%. The car IS improving over time. But it’ll need to be 99.99999% before humans are OK completely giving up control. Can the new FSD computer accomplish that? I don’t know. Can the sensor array? Again, we don’t really know.

But as we move from 99% towards 99.99999%, we will all learn. For right now, everyone has limitations, and we are all guessing.
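To put rough, made-up numbers on that gap: assuming something like one "situation" per mile (an assumption for illustration only, not a measured figure), here is how per-situation reliability translates into miles between needed interventions. Purely back-of-envelope arithmetic, nothing Tesla has published:

```python
# Back-of-envelope sketch: how per-situation reliability translates into
# miles between needed interventions. "One situation per mile" is an
# assumption chosen for illustration, not a measured figure.

SITUATIONS_PER_MILE = 1.0  # assumed

def miles_between_interventions(success_rate: float) -> float:
    """Expected miles driven before one situation is handled wrong."""
    failure_rate = 1.0 - success_rate
    return 1.0 / (failure_rate * SITUATIONS_PER_MILE)

for rate in (0.99, 0.999, 0.9999, 0.9999999):
    print(f"{rate:.7f} -> roughly {miles_between_interventions(rate):,.0f} miles per intervention")
```

On those assumptions, 99% works out to an intervention every hundred miles or so, while 99.99999% is one every ten million miles, which is why the last few nines are the hard part.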

To gauge Tesla’s confidence level: did they move forward with leases on the M3 that could not be bought out at the end? Because they’ll end up with a PILE of non-functional robots if that’s still the plan.
 
I don't think the state at which FSD is released will affect regulators a decade later. I think regulators will look at the current state of the technology each time there's a new development or a new feature.

How often do you see the AP1 Florida accident brought up when Autopilot on a Raven/3 is discussed? In reading the article, do you get ANY sense that the author is aware that there’s any difference?
 
The cameras, today, can't even capture all the traffic lights when I'm stopped at them! If the angle is too extreme for the camera, the traffic light is rendered but with no light color, i.e., incomplete vision.
What the visualization shows isn't everything that Autopilot has detected, let alone everything the camera can see. Here's a screenshot from Autonomy Day, just before the car crosses the first line of a crosswalk to make a left turn through the intersection:
[Attached screenshot: autonomy intersection.png]


Notice how the wide fisheye camera (3rd image from the left) can see the back of the oncoming traffic's left-turn light that is almost directly above the car. So, pretending it were a light for this car's direction, the wide camera could see it, but maybe the neural network that detects light color is currently less accurate on the wide camera than on the main camera. Or perhaps, in the current visualization implementation, only traffic lights detected by the main camera are rendered with a color, so lights that move out of the main camera's view render with no light color (perhaps to avoid rendering duplicate traffic signals or adjusting their positions for fisheye distortion).

And the pillar cameras (1st and 4th images) do see the side traffic, to determine whether it's safe to enter the intersection. The overlay here renders pink lane lines for the destination road of the left turn; the cross-traffic vehicles waiting at the red light are visible to the cameras but aren't rendered.
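One way to read that behavior (purely my speculation about how the visualization might be filtered, not anything Tesla has confirmed) is that a light only gets its color drawn when the main camera has a confident classification, while the other cameras contribute position but not state. A minimal sketch of that kind of rule, with all names and thresholds invented:

```python
# Hypothetical visualization rule (speculation, not Tesla's actual code):
# draw a traffic light's color only when the main camera has a confident
# classification; other cameras may still place the light, but with no color.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class LightDetection:
    camera: str                     # e.g. "main", "fisheye", "left_pillar"
    position: Tuple[float, float]   # assumed already fused into a shared frame
    color: Optional[str]            # "red", "yellow", "green", or None
    confidence: float

def rendered_color(detections: List[LightDetection], min_conf: float = 0.8) -> Optional[str]:
    """Color to draw for one physical light, or None for 'housing only, no color'."""
    main = [d for d in detections
            if d.camera == "main" and d.color is not None and d.confidence >= min_conf]
    if main:
        return max(main, key=lambda d: d.confidence).color
    return None  # fisheye/pillar cameras may see it, but no color is shown

# Example: the fisheye sees a green light overhead, the main camera has lost it.
print(rendered_color([
    LightDetection("fisheye", (10.0, 2.0), "green", 0.9),
    LightDetection("main", (10.0, 2.0), None, 0.0),
]))  # -> None: the light is drawn, its color is not
```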
 
Although I do think Tesla picked a super easy route that would put their FSD in the best possible light.
I believe the critical piece for Tesla being confident enough to give FSD rides around their headquarters is that they validated ahead of time that all the traffic lights and stop signs on the route are correctly detected. However, deploying FSD to the fleet before they're confident in their traffic light and stop sign detection "anywhere" (in the US) is much harder to validate and carries very high risk. I do wonder why the stop sign warning wasn't deployed sooner, assuming the warning feature provides more data to Tesla to correct false positives, but then again, Tesla might have been collecting data of just as good quality even without a user-facing warning.

Setting low expectations for the initial FSD deployment, my guess is that basic non-highway driving will mostly work like Autopilot does right now, except with the added ability to stop and make turns through intersections. However, maneuvering through complex downtown situations such as partially blocked intersections or double-parked vehicles will have a "correct" but cautious behavior of just stopping and requiring the driver to take over if the driver doesn't want to wait.
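On the point about the warning feature providing data to correct false positives: whether or not Tesla actually works this way, a driver-facing warning is a natural shadow-mode signal, since a warning the driver effectively "disagrees" with (no slowing, no stop) can be flagged for review as a possible false positive. A toy sketch of that kind of aggregation, with every name invented:

```python
# Toy sketch (not Tesla's pipeline): estimate stop-sign-detection precision
# from fleet warning events. Each event records whether the driver's behavior
# was consistent with a real stop sign (slowed/stopped) or not.
from collections import Counter

def precision_estimate(events):
    """events: iterable of dicts like {"detected": True, "driver_stopped": bool}."""
    counts = Counter()
    for e in events:
        if e["detected"]:
            key = "true_positive" if e["driver_stopped"] else "possible_false_positive"
            counts[key] += 1
    total = counts["true_positive"] + counts["possible_false_positive"]
    return counts["true_positive"] / total if total else None

fleet_log = [
    {"detected": True, "driver_stopped": True},
    {"detected": True, "driver_stopped": False},  # flagged for human review
    {"detected": True, "driver_stopped": True},
]
print(precision_estimate(fleet_log))  # ~0.67 on this made-up log
```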
 
How many people here spent the cash on the FSD option thinking that one day, with any luck, it would be able to drive itself while you carefully supervise, ready to take over at a moment's notice?
You can count me as one of those. I can recite edge-case scenarios all afternoon that I doubt Tesla will solve in my lifetime...

* snow covered road obscures all lane markings
* newly paved interstate without lane markers
* policeman standing in the middle of the road directing traffic around an accident
* '67 VW rear bumper tastefully decorating your lane
* ladder flies out of the pickup truck in front of you -- which way will it bounce?
* intersection changed from stop sign to roundabout since the last time maps were updated
* there's a 2 foot deep puddle in your way
* the car misses the "20 MPH SCHOOL ZONE" sign

added bonus for you California residents:

* your subdivision is in the path of a major fire, and the route nav chooses for you goes straight for the worst flames

Note: none of these (except the fire) are imaginary, they've all happened to me.

But I have confidence that every year the car will get better. It will eventually stop twitching towards exit ramps; it will learn the difference between exit ramps and passing lanes and choose the correct lane to follow instead of diving for the new stripe then diving left or right at random; it will correctly get off the interstate and slow down to 5mph at the end -- then stop instead of turning off nav-on-autopilot and accelerating into cross-traffic; it will discontinue its sporadic habit of sudden braking for shadows, or possibly hallucinations; it will be able to recognize that there's just a single truck next to you, not oscillate between 0, 1, or 2 trucks every few milliseconds.
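That last one, flickering between 0, 1, and 2 trucks, is exactly the kind of thing temporal filtering is meant to smooth out. Here's a minimal hysteresis sketch (illustrative only, nothing to do with Tesla's actual tracker): an object is only shown after it persists for a few frames, and only dropped after it has been missing for a few frames.

```python
# Minimal track-confirmation / hysteresis sketch (illustrative only): an object
# is shown after N consecutive frames with a detection and hidden only after
# M consecutive frames without one, which suppresses 0/1/2 count flicker.

class TrackSmoother:
    def __init__(self, confirm_frames: int = 3, drop_frames: int = 5):
        self.confirm_frames = confirm_frames
        self.drop_frames = drop_frames
        self.hits = 0
        self.misses = 0
        self.visible = False

    def update(self, detected_this_frame: bool) -> bool:
        """Feed one frame's raw detection result; return whether to display the object."""
        if detected_this_frame:
            self.hits += 1
            self.misses = 0
            if self.hits >= self.confirm_frames:
                self.visible = True
        else:
            self.misses += 1
            self.hits = 0
            if self.misses >= self.drop_frames:
                self.visible = False
        return self.visible

truck = TrackSmoother()
raw = [True, False, True, True, True, False, True, False, False, True]
print([truck.update(d) for d in raw])  # flickery input, steady output once confirmed
```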
 
However, deploying FSD to the fleet before they're confident in their traffic light and stop sign detection "anywhere" (in the US) is much harder to validate and carries very high risk. I do wonder why the stop sign warning wasn't deployed sooner, assuming the warning feature provides more data to Tesla to correct false positives, but then again, Tesla might have been collecting data of just as good quality even without a user-facing warning.

I totally agree that deploying "city NOA" to the public is very high risk. I am actually not surprised at all that Tesla has not deployed it yet.

It is worth remembering the timeline of releases:
- Dec 2018: Elon tweeted that Tesla was testing traffic lights, stop signs and roundabouts. He said our Teslas would go from home to work with no driver input at all "soon".
- March 2019: Tesla releases the stop light warning.
- Dec 2019: Tesla releases the stop sign warning.
- Jan 27, 2020: still no traffic light or stop sign response where the car stops on its own.

So we see that Tesla started testing traffic lights and stop signs well over a year ago and has presumably been collecting data from the stop light and stop sign warnings for months. I think that illustrates the challenge of getting these features working reliably anywhere in the US. So my earlier point was more about Elon's excessive optimism in putting on the FSD order page that the features were coming in 2019, when that was never a realistic timeline. These are indeed high-risk, challenging features that will require more time than Elon thought before they are ready for the public.
 
You can count me as one of those. I can recite edge-case scenarios all afternoon that I doubt Tesla will solve in my lifetime...

* snow covered road obscures all lane markings
* newly paved interstate without lane markers
* policeman standing in the middle of the road directing traffic around an accident
* '67 VW rear bumper tastefully decorating your lane
* ladder flies out of the pickup truck in front of you -- which way will it bounce?
* intersection changed from stop sign to roundabout since the last time maps were updated
* there's a 2 foot deep puddle in your way
* the car misses the "20 MPH SCHOOL ZONE" sign

added bonus for you California residents:

* your subdivision is in the path of a major fire, and the route nav chooses for you goes straight for the worst flames

Note: none of these (except the fire) are imaginary, they've all happened to me.

But I have confidence that every year the car will get better. It will eventually stop twitching towards exit ramps; it will learn the difference between exit ramps and passing lanes and choose the correct lane to follow instead of diving for the new stripe then diving left or right at random; it will correctly get off the interstate and slow down to 5mph at the end -- then stop instead of turning off nav-on-autopilot and accelerating into cross-traffic; it will discontinue its sporadic habit of sudden braking for shadows, or possibly hallucinations; it will be able to recognize that there's just a single truck next to you, not oscillate between 0, 1, or 2 trucks every few milliseconds.

This is a fun game:

* You see a deer on the side of the road
* Rear tire on tractor trailer looks like it's flapping
* You get to a stop at the same time as someone else and they wave at you to go
* You see a school bus with blinking lights, slowing down, but know the stop sign won't pop out until they are stopped
* The school zone, or any other sign, is obscured by a truck
* The power is out
 
How often do you see the AP1 Florida accident brought up when Autopilot on a Raven/3 is discussed? In reading the article, do you get ANY sense that the author is aware that there’s any difference?

The popular press (newspapers, TV reporters, etc.) are notoriously incompetent at reporting news. And when we get to news about science and technology they're ten times worse, maybe a hundred times worse. Reporters learn how to tell a story. They learn what readers want to read and what viewers want to see. They learn how to select a sound bite that will grab attention. They do not learn how to understand information. They learn how to write, but not how to read technical information. There are science journalists who understand science, but they are as rare as honest politicians. And journalists covering technology who know the first thing about the subject they are covering are even more rare.

Looking to the media for anything other than crappy entertainment is foolish. So I would expect reporters to bring up irrelevant factoids.

But please note that I was not speaking of the media or of public perceptions. I was speaking of regulators, who will be strongly influenced by a variety of lobbyists pushing for approval. Big car companies and the powerful insurance industry are on our side in this: They want approval for autonomous cars. The car companies want to sell the cars, and once the cars have demonstrated that they are safer than human drivers, the insurance companies will lobby for approval. Nothing that happens today will affect the regulatory process a decade from now. Only the performance of the cars then will affect the process then.
 
Haha! So it was yet another of Elon's "soon"s a year ago? Oh, that's hilarious! You should add his revised "soon" from Jan 2020 to your timeline.

A year is not so long; I'd be willing to consider 5 years to be "soon." Of course, others of Musk's promises specified dates that have since passed, but the original robotaxi promise specified no date. Still, once the cars first sold with the promise of FSD start to wear out from normal use, if they cannot yet drive themselves without a human in the car, Musk will have broken his promise.

I don't think he can accomplish this on these cars because I think more sensors are needed. Sensors are the reason why I bought EAP but not FSD. I'll buy the FSD car when it becomes available. And I'll buy it as soon as I can.