FSD rewrite will go out on Oct 20 to limited beta

The language on the configuration page has changed, but as you say, the language on this page remains largely unchanged: Autopilot

Yep, that's the vision that we/I bought into, subject to the potential risks. Thanks for linking that.

"
Full Self-Driving Capability
All new Tesla cars have the hardware needed in the future for full self-driving in almost all circumstances. The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat.

All you will need to do is get in and tell your car where to go. If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed. When you arrive at your destination, simply step out at the entrance and your car will enter park seek mode, automatically search for a spot and park itself. A tap on your phone summons it back to you.

The future use of these features without supervision is dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions. As these self-driving capabilities are introduced, your car will be continuously upgraded through over-the-air software updates.
"

Basically an A-to-B journey, in almost all circumstances, with no action required by the person in the driver's seat. It doesn't say 'unsupervised' in the first part, so that would be L3+ (somewhere between L3 and L4, since it says "almost all circumstances").

Later it goes on to talk about use "without supervision", which would be the elevation to L4, and caveats that with more risk.

The difference between the "supervised" functionality and the "unsupervised" is really just a march of the 9's to prove adequate reliability, rather than providing any additional functionality. (i.e. in Musk's terms, it would be functionally complete when it can do the journey, but then needs to be refined and proven to get to this level.) BTW, this isn't a trivial, no-risk step.

Pretty sure we will get to the "Supervised" vision, which would be fantastic. How far we get into the "unsupervised" territory is probably where the major risk (and payback) is.
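To put rough numbers on that "march of the 9's": the standard zero-failure bound (the "rule of three") gives a back-of-the-envelope feel for how many incident-free miles it takes to demonstrate a given reliability, as opposed to merely achieving it. This is just a sketch; the 150K-miles-per-accident figure is one floated later in this thread, and ~100M miles per fatality is a commonly cited US human baseline, not Tesla data.

```python
import math

def miles_to_demonstrate(target_miles_per_event: float,
                         confidence: float = 0.95) -> float:
    """Incident-free miles needed to show, at the given confidence, that the
    true rate is better than 1 event per target_miles_per_event miles
    (zero-failure Poisson bound, i.e. the 'rule of three' at 95%)."""
    return -math.log(1.0 - confidence) * target_miles_per_event

# ~450K accident-free miles just to claim "better than 1 accident per 150K miles"
print(f"{miles_to_demonstrate(150_000):,.0f}")       # ≈ 449,360

# ~300M fatality-free miles to claim "better than ~1 fatality per 100M miles"
print(f"{miles_to_demonstrate(100_000_000):,.0f}")   # ≈ 299,573,227
```

This is why the Tesla text above talks about "billions of miles of experience" before dropping supervision.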
 
I wonder if this is why Traffic Control still requires stalk confirmations? It would seem that there are still edge cases that can cause traffic light detection to be unreliable.
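As a toy illustration of why the confirmation gate is the conservative design choice (this is just a sketch of the idea, not Tesla's code; the confidence threshold and function names are made up): anything other than a confidently detected green defaults to stopping, and even a confident green still waits for the driver.

```python
from enum import Enum

class LightState(Enum):
    GREEN = "green"
    YELLOW = "yellow"
    RED = "red"
    UNKNOWN = "unknown"

CONFIDENCE_THRESHOLD = 0.95  # hypothetical value

def should_proceed(detected: LightState, confidence: float,
                   driver_confirmed: bool) -> bool:
    """Conservative policy: default to stopping unless the detection is a
    confident green AND the driver has confirmed (stalk/accelerator tap)."""
    if detected is not LightState.GREEN:
        return False
    if confidence < CONFIDENCE_THRESHOLD:
        return False
    return driver_confirmed

# A dim or ambiguous green never clears the gate on its own.
assert should_proceed(LightState.GREEN, 0.80, driver_confirmed=True) is False
assert should_proceed(LightState.GREEN, 0.99, driver_confirmed=False) is False
assert should_proceed(LightState.GREEN, 0.99, driver_confirmed=True) is True
```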

 
Me too, but I find that it's been getting better over time. For example, incorrect green light notifications are less frequent nowadays compared to a few months ago. I've also never had a situation where it ran a red or overran a yellow (after stalk confirm).

I've also never seen the FSD beta run an obvious red. It has run a red before, but not because it didn't recognize it; it was more of a planning issue.

Honestly, I think the traffic control recognition feature is better than an average human right now. I believe it will stop at the appropriate traffic controls more consistently than the average human, although I have no data from Tesla to back this up lol. This is a good time to bring up this video:

 
Add very dim lights as well.

Eastbound El Camino turning north onto Shoreline in Mountain View. The green lights for the two left-most turn lanes are super dim and AP can't detect them.

[Image: elcamino_to_shoreline_lights.jpg]


 
The HW3 computer has 2 parallel sides of the processor, which can run independently?

It's been stated that the existing apps don't need the processing power of both sides.

Do we know/think that the FSD software may be running on one side collecting data and making predictions, while the other side is running the existing apps (for those not in the Beta program) and doing the control?

If possible, it seems like a smart way of collecting data without having the new stuff active.
 
  • Like
Reactions: mikes_fsd


It has 2 for redundancy.

Otherwise your robotaxi crashes if one chip fails.

IIRC Green found that for a long time the B side didn't do anything, but in a late 2019 update it began running an exact copy of the A-side code, presumably to fail over if the A side crashes.
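If Green's read is right, this looks like a generic active-standby (hot spare) pattern. A minimal sketch of that idea in Python, purely illustrative; the node class, heartbeat timeout, and "keep_lane" command are made up, not Tesla's firmware:

```python
import time
from typing import Optional

HEARTBEAT_TIMEOUT_S = 0.1  # hypothetical watchdog threshold

class Node:
    """One 'side' of a dual-redundant computer (illustrative only)."""
    def __init__(self, name: str):
        self.name = name
        self.last_heartbeat = time.monotonic()

    def step(self, sensor_frame: dict) -> dict:
        # Stand-in for the real perception/planning stack; both sides run it.
        self.last_heartbeat = time.monotonic()
        return {"cmd": "keep_lane", "source": self.name}

def arbitrate(primary: Node, primary_out: Optional[dict], standby_out: dict) -> dict:
    """Use the primary's command unless it produced nothing or went stale."""
    stale = time.monotonic() - primary.last_heartbeat > HEARTBEAT_TIMEOUT_S
    if primary_out is not None and not stale:
        return primary_out
    # Fail over: the standby has been computing the same outputs in lockstep.
    return standby_out

# Usage: both sides step every control cycle; the arbiter picks whose output
# actually reaches the actuators.
side_a, side_b = Node("A"), Node("B")
frame: dict = {}
command = arbitrate(side_a, side_a.step(frame), side_b.step(frame))
print(command)  # {'cmd': 'keep_lane', 'source': 'A'} while side A is healthy
```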
 
My estimate / guesstimate is three years. What is your estimate / guesstimate for reaching no accidents, on average, every 150K miles?

It's crazy that I even think this, but 6-9 months lol. We've seen essentially every human maneuver from this beta, even unintuitive or difficult-to-program ones like:

Overtaking a slowly moving leaf blower on a two-lane street
Moving over slightly to give room for a person opening a car door
Stopping for pedestrians intending to cross the street
Anticipating possible trajectories of running pedestrians
Moving slightly to the right because an oncoming car slightly crosses the line (at night)
Recognizing an oncoming bicyclist at night, with just one visible flashlight

It looks like @powertoold was a bit optimistic in his prediction last fall. Does it feel like we are closer to 150k miles between accidents?
 
Don't worry, he now says it will happen in 2 months (just 4 more updates). This is one of the brightest Tesla folks out here. Cut him some slack. He is totally not insane (saying the same thing over and over again).

So after going from 1 safety disengagement per mile in the 2016 CA report to 1-10 miles on avg today after 72 months (6 years),
and after going from 1-3 miles in Oct 2020 to 1-10 miles on avg today after 10 months...

It will go from 1-10 miles to 150k+ miles in just 2 months. Get Ready!
 
  • Like
  • Disagree
Reactions: KJD and MP3Mike
It looks like @powertoold was a bit optimistic in his prediction last fall. Does it feel like we are closer to 150k miles between accidents?

I was definitely wrong on that prediction, but I'm still optimistic about the progress. Again, it's an average for all driving (city + highway) that typical people do, not "find the worst areas" and use those to gauge overall performance.

I'm sure Tesla can achieve higher averages by avoiding unprotected lefts, roundabouts, etc. But it's refreshing to see that there's essentially no limitation in the FSD beta, so we get to see its true performance, warts and all.

I'll refrain from these predictions, but I do think Tesla will get there soon, hopefully before the end of the year, optimistically within 2-3 months.

It's been said many times, but people fail to understand the significance of this milestone. If Tesla can achieve human-level+ stats with their approach, it'll be the most valuable "tool" ever created by humanity by a long shot. The fact that they're late a few months or a year isn't a big deal. If people want to point out fatal (unfixable) flaws in V9, that'd be more constructive.

What I find most interesting is that prior to the FSD beta, many of us were saying the sensor suite made it impossible or pointing out some hardware limitation, but now with V9, we're not talking about hardware inadequacy anymore. Now we're pointing out disengagements or dumb decisions that have nothing to do with the sensors, but rather the software or perception not seeing or acting on something that is obviously there.
 
FSD timeline: many years
Robotaxis: years, but maybe some limited deployment sooner
Better than average human (40K miles per accident): hopefully a year
Button: hopefully this year
Questions:
  1. Pothole detection?
  2. Does it work when looking into the sun?
  3. At what range are cars detected when making an unprotected left?
  4. How does it perform in poor visibility situations, e.g. rain/snow at night?
  5. Ability to see objects it wasn't trained for?
  6. When will we get faster turnaround on FSD updates for training on unusual situations? They should be able to get an update to the fleet within 24 hours.
 
So after going from 1 safety disengagement per mile in the 2016 CA report to 1-10 miles on avg today after 72 months (6 years),
and after going from 1-3 miles in Oct 2020 to 1-10 miles on avg today after 10 months...

It will go from 1-10 miles to 150k+ miles in just 2 months. Get Ready!


...what?


The 2016 report showed 1 disengagement per 3.27 miles driven, not 1 per mile.

The only other report we have from CA on Tesla is 2019, where it was zero disengagements, but only 12 miles of driving... (this was for the Autonomy Day video; they got it all in 1 take without any disengagement).

So not only is your 2016 number wrong compared to what we actually have, I'm unsure where you're getting any of your supposed 2020 or "today" numbers from.
 
  • Like
Reactions: FSD_Scribe
Don't worry, he now says it will happen in 2 months (just 4 more updates). This is one of the brightest Tesla folks out here. Cut him some slack. He is totally not insane (saying the same thing over and over again).

So after going from 1 safety disengagement per mile in the 2016 CA report to 1-10 miles on avg today after 72 months (6 years),
and after going from 1-3 miles in Oct 2020 to 1-10 miles on avg today after 10 months...

It will go from 1-10 miles to 150k+ miles in just 2 months. Get Ready!
...what?


The 2016 report showed 1 disengagement per 3.27 miles driven, not 1 per mile.

The only other report we have from CA on Tesla is 2019, where it was zero disengagements, but only 12 miles of driving... (this was for the Autonomy Day video; they got it all in 1 take without any disengagement).

So not only is your 2016 number wrong compared to what we actually have, I'm unsure where you're getting any of your supposed 2020 or "today" numbers from.

The problem is that we have almost no data, because Tesla has not released any real data.

We have 1 disengagement per 3.26 mi in the 2016 report. We have 12 miles and 0 disengagements in 2019, and that's about it. We have no disengagement rates and no accident rates for FSD Beta. So we are all just guessing about how good FSD Beta actually is. All we have are videos, which are insufficient for statistical analysis.

It's mighty convenient if you ask me, because Elon can just say "we are making progress" and you can't really disagree, but you can't really quantify the progress either.
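To make "insufficient for statistical analysis" concrete, here's a quick sketch (using scipy; the only inputs are the two public data points mentioned above) of the exact Poisson interval you get from 0 disengagements in 12 miles. That data is consistent with anything from roughly 3 miles per disengagement up to perfection, i.e. it tells you essentially nothing.

```python
from scipy.stats import chi2

def rate_ci(events: int, miles: float, conf: float = 0.95):
    """Exact (Garwood) Poisson confidence interval for events per mile."""
    alpha = 1.0 - conf
    lower = 0.0 if events == 0 else chi2.ppf(alpha / 2, 2 * events) / (2 * miles)
    upper = chi2.ppf(1 - alpha / 2, 2 * events + 2) / (2 * miles)
    return lower, upper

# The 2019 CA filing: 0 disengagements in 12 miles.
lo, hi = rate_ci(0, 12)
print(f"95% CI: {lo:.3f} to {hi:.3f} disengagements per mile")   # 0.000 to ~0.307
print(f"i.e. anywhere from ~{1 / hi:.1f} miles per disengagement up to perfect")
```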
 
...what?


The 2016 report showed 1 disengagement per 3.27 miles driven, not 1 per mile.

The number I gave isn't exact. I have posted the exact numbers numerous times in the past and didn't want to waste my time looking them up, because they are virtually the same, and it won't matter anyway because the majority of Tesla fans don't look at facts/data, only hype and sensationalism.

The only other report we have from CA on Tesla is 2019, where it was zero disengagements, but only 12 miles of driving... (this was for the Autonomy Day video; they got it all in 1 take without any disengagement).
That drive consisted of a lot of highway driving. I don't count highways. People who got rides on Autonomy Day talked about having one or more disengagements.

So not only is your 2016 number wrong compared to what we actually have, I'm unsure where you're getting any of your supposed 2020 or "today" numbers from.
From urban and suburban videos from DirtyTesla, AIAddict, Frenchie, and some other YouTubers since FSD Beta 1. Again, not exact.
 
  • Disagree
Reactions: MP3Mike
The number I gave isn't exact. I have posted the exact numbers numerous times in the past and didn't want to waste my time looking them up, because they are virtually the same

You think a disengagement-per-mile number over 3x lower is "virtually the same"?

Glad you're not in charge of anything related to safety :)


and it won't matter anyway because the majority of Tesla fans don't look at facts/data, only hype and sensationalism.

I mean, you're the one who appears to be posting made-up figures, rather than actual facts/data, so...



That drive consisted of a lot of highway driving. I don't count highways.

Weird, because every regulatory body that looks at this stuff counts them.



People who got rides on Autonomy Day talked about having one or more disengagements.

People who got rides were in an L2 vehicle; that's why none of those had to be reported to the CA DMV.

That said, "someone talked about it" isn't data; it's an anecdote.



From urban and suburban videos from DirtyTesla, AIAddict, Frenchie, and some other YouTubers since FSD Beta 1. Again, not exact.


The plural of anecdote is also not data.


If you want people to focus on facts instead of speculation, I'd suggest you post more facts and less speculation.
 
  • Like
Reactions: FSD_Scribe