Autonomous Car Progress

I am curious about whether many people think a driver-assist "autosteer on city streets" would be a useful product that you would want. Autopilot is a useful product. In the simpler highway environment you can monitor it without a lot of mental energy and stay reasonably safe if you don't cheat on your monitoring. There are tricks to do that.

On the streets I find driving with the FSD prototype to be harrowing. Interesting in the "may you live in interesting times" sense. The jerky wheel will be fixed in time, but for now you can't keep your hands on the wheel. You have to worry about it doing something scary at any intersection. It does not make the drive relaxing. Can it make the drive relaxing? Or worse, if it gets better, does it put you in a state of complacency that ends up being dangerous? For now, most "beta" testers are keeping a tight rein on it, but I am not sure where it goes.

So would you pay money for a driver-assist city-street autopilot? Or is actual, real, working full self-driving what you would pay for? Or would you rather they worked on highway self-driving or traffic-jam self-driving, even with level-3-style standby driving necessary?
I agree 100%. For me I want an automation system that does lane keeping and speed control really well because that's 99% of driving. Save all the fancier stuff for automatic collision avoidance systems that run in the background.
Of course I would pay a lot of money for an "eyes off" system that could handle a significant percentage of my driving.
 
Just see the disclaimer on FSD order page. Also master plan 2, IIRC.
The 10x safer in the Master Plan is for supervised Autopilot.
I don't think the current FSD page has a safety target, but I think the original page was 2x. Elon always says 2-3x, I think, except for HW4, which will be better.

"We've also got to make it work and then demonstrate that if the reliability is significantly in excess of the average human driver or to be allowed... um... you know for before people to be able to use it without... uh... paying attention to the road... um... but i think we have a massive fleet so it will be I think... uh... straightforward to make the argument on statistical grounds just based on the number of interventions, you know, or especially in events that would result in a crash. At scale we think we'll have billions of miles of travel to be able to show that it is, you know, the safety of the car with the autopilot on is a 100 percent or 200 percent or more safer than the average human driver"
 
I am curious about whether many people think a driver-assist "autosteer on city streets" would be a useful product that you would want. Autopilot is a useful product. In the simpler highway environment you can monitor it without a lot of mental energy and stay reasonably safe if you don't cheat on your monitoring. There are tricks to do that.
I find AP/FSD extremely useful in the city. In fact, I've been using it for nearly 3 years. It's a great driving aid, especially at night when it's raining.
 
The 10x safer in the Master Plan is for supervised Autopilot.
I don't think the current FSD page has a safety target, but I think the original page was 2x. Elon always says 2-3x, I think, except for HW4, which will be better.

"We've also got to make it work and then demonstrate that if the reliability is significantly in excess of the average human driver or to be allowed... um... you know for before people to be able to use it without... uh... paying attention to the road... um... but i think we have a massive fleet so it will be I think... uh... straightforward to make the argument on statistical grounds just based on the number of interventions, you know, or especially in events that would result in a crash. At scale we think we'll have billions of miles of travel to be able to show that it is, you know, the safety of the car with the autopilot on is a 100 percent or 200 percent or more safer than the average human driver"



So, in short, Master Plan, Part Deux is:
Create stunning solar roofs with seamlessly integrated battery storage
Expand the electric vehicle product line to address all major segments
Develop a self-driving capability that is 10X safer than manual via massive fleet learning
Enable your car to make money for you when you aren't using it
 
Sorry for the poor wording.
So, how close can the OEDR get to what would be required for L5 before it becomes an AV prototype?

"AV prototype" is a meaningless term you invented then demanded we explain to you. It's kinda weird.

L5 has an actual meaning. It requires a complete OEDR that works in all situations in which a human can drive.

L3 or L4 only require a complete OEDR within their ODD.

L2 has, by definition, an incomplete OEDR and thus requires a human to always supervise and perform that part of the DDT.

FSD Beta, called "city streets" internally by Tesla, has an incomplete OEDR, and Tesla has explicitly said they do not plan to change that when it goes to wide release; thus by definition it cannot be more than an L2 system.
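
Put as a rough decision rule (just a sketch of how this thread frames it, not the actual SAE J3016 text; the function and argument names are made up):

```python
# Loose sketch of the level logic described above (illustrative only,
# not the actual SAE J3016 text; the names here are made up).

def rough_sae_level(oedr_complete: bool,
                    odd_unlimited: bool,
                    system_performs_fallback: bool) -> int:
    """Approximate driving-automation level from the OEDR/ODD framing above."""
    if not oedr_complete:
        # Incomplete OEDR: a human must supervise and fill the gaps,
        # so the system can be no higher than L2.
        return 2
    if odd_unlimited:
        # Complete OEDR everywhere a human could drive: L5.
        return 5
    # Complete OEDR within a limited ODD: L4 if the system handles its own
    # fallback, L3 if it relies on a receptive human as the fallback.
    return 4 if system_performs_fallback else 3

# A system whose maker says its OEDR will stay incomplete tops out at L2:
print(rough_sae_level(oedr_complete=False, odd_unlimited=False,
                      system_performs_fallback=False))  # -> 2
```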


I eagerly await your next misunderstanding of these basic facts you've already had explained to you numerous times.
 
My preference beyond anything else is that a feature actually works. Like I'd rather have adaptive cruise control that worked great 99.9% of the time versus lane-steering plus adaptive cruise control that was prone to phantom braking.

So I find it disappointing that Tesla worked on city streets before really getting NoA working well. Even though NoA is an L2 system, it has the potential to be really useful. The usefulness is in being knowledgeable about traffic ahead to determine the best lanes to be in, and when to get into those lanes.

With City Streets I believe it's going to be really tough to balance usefulness with safety. Right now it's fairly safe because of how much you have to monitor it. But as soon as it is less nerve-racking, people will begin to trust it.

It's going to be difficult for city streets to be useful because the very things we want it to take over are exactly the things that are hardest for it to do well, like handling busy stop-sign-controlled intersections with lots of vehicles going in different directions. It's also going to be hard for it to deal with weather like rain.

So I question whether it really can be useful. It's just going to be another feature that sounds neat, but doesn't really deliver what I want.

Auto Lights -> Doesn't turn on during rain unless it's getting dark (implementation doesn't match what I want)
Auto Wipers -> Doesn't work well in some rain situations (performance doesn't match expectations)
Smart Summon -> too slow, too unreliable
Visual Autopark -> too slow
TACC -> too prone to false braking and not smooth enough
AP -> Doesn't handle merge points well. Basically it doesn't handle any situation well where the lane widens or it has to choose a lane (single-stack V11 might fix this).
ALC -> Feature works well enough to be useful, but could be improved by handling some situations better.
NoA -> Does idiotic lane changes, and doesn't handle traffic very well (V11 might fix this).

Of all those features I only give ALC a passing grade in terms of usefulness. I think that's because I expect TACC/AP to work better, so it annoys me that they don't. I'm not too annoyed by Autopark or Summon as I don't really need them. Maybe it's that ALC is the only feature that I, as a Tesla owner, can really show someone where 98% of the time it will work great.

The auto lights are probably the most disappointing, as they make a large percentage of PNW Tesla owners look like morons, and they're the simplest thing to get right. Tesla should just do what other manufacturers do and tie it to wiper speed.

Long term I plan on trading my 3 for a Rivian, and I expect that I'll have way less features but the features will work (or I might skip it). Sure I could get an MB or a Lucid to have better ADAS type capabilities, but I prefer the rugged nature of the Rivian.

L4 capability for under $100K would be hard to pass up, though. L4 by its nature requires the feature to work as advertised, versus the make-believe nature of so many L2 features (not just Tesla's). L4 is the kind of thing you buy a vehicle specifically for. L3 would be nice to have, but I wouldn't buy a vehicle specifically for that.
I do suspect many people concur; it would be interesting to see survey data of Tesla owners. NoA is fine, but I don't trust it to merge at full speed onto other highways, or on "merge to exit" interchanges. But I think it could actually get good at those, particularly because it can simultaneously look forward and back, which a human can't do. Likewise ALC. I think that auto lights and wipers and autopark could be made usable. I would like another feature they don't even have -- smart mirror dimming that only dims the mirror when a headlight is shining in my eyes.

Autopark is enjoyed by those who never got the hang of parallel parking. It could outdo you in certain tight spots -- if it were good enough to trust in them. However, a bird's-eye 360 view would probably let a human do the parking very well, so that may be more valuable.

City street nav though is the most interesting one. It's possible it's never actually a useful product, just Tesla's attempt at a stepping stone to some form of FSD in the future (if they can do it.)
 
100 percent or 200 percent or more safer than the average human driver
That certainly won't cut it, because many here, or perhaps many Tesla drivers anywhere, are already "100 percent or 200 percent or more safer than the average human driver" when they drive manually. The average human driver drives dangerously.

I guess 10 times safer will cut it.
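
For a rough sense of the arithmetic (one possible reading of the quoted figures; the numbers are normalized placeholders, not real statistics):

```python
# One way to read "X percent safer" as a crash-rate ratio.
# All numbers are normalized placeholders, not real data.

avg_human = 1.0                  # average human driver: 1 crash per unit of miles
safer_100_pct = avg_human / 2    # "100 percent safer" ~ half the crashes
safer_10x = avg_human / 10       # "10 times safer"    ~ one tenth the crashes

# A careful driver who already crashes at half the average rate gains nothing
# from a system that is merely "100 percent safer" than the *average* driver.
careful_driver = 0.5
print(safer_100_pct < careful_driver)   # False: no improvement for that driver
print(safer_10x < careful_driver)       # True: still a clear win
```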
 
It's either true FSD (SAE Level 5) or manual control. In between is asking too much of the human behind the steering wheel. Since the first will be hard to accomplish for years to come (despite the promising pilots), I'd say have FSD only work situationally, to keep the workload manageable. I already went into this in earlier posts about where/when L5 could be engaged.
 
"AV prototype" is a meaningless term you invented then demanded we explain to you. It's kinda weird.

L5 has an actual meaning. It requires a complete OEDR that works in all situations in which a human can drive.

L3 or L4 only require a complete OEDR within their ODD.

L2 has, by definition, an incomplete OEDR and thus requires a human to always supervise and perform that part of the DDT.

FSD Beta, called "city streets" internally by Tesla, has an incomplete OEDR, and Tesla has explicitly said they do not plan to change that when it goes to wide release; thus by definition it cannot be more than an L2 system.


I eagerly await your next misunderstanding of these basic facts you've already had explained to you numerous times.
So if the OEDR is missing just one thing, such as the ability to recognize and respond to ice cream trucks, then it is L2?
 
So, in short, Master Plan, Part Deux is:
Create stunning solar roofs with seamlessly integrated battery storage
Expand the electric vehicle product line to address all major segments
Develop a self-driving capability that is 10X safer than manual via massive fleet learning
Enable your car to make money for you when you aren't using it
That's not the way I interpret it since Tesla does not publish any estimate of the unsupervised crash rate of Autopilot. It is confusing because "self-driving" has become a meaningless phrase, used to describe both supervised and unsupervised systems. Also Elon has said 2x-3x in many contexts since then.
 
It's either true FSD (SAE Level 5) or manual control. In between is asking too much of the human behind the steering wheel. Since the first will be hard to accomplish for years to come (despite the promising pilots), I'd say have FSD only work situationally, to keep the workload manageable. I already went into this in earlier posts about where/when L5 could be engaged.
None of the levels are real, but level 5 is the least real. It's a science fiction goal, unlikely to happen any time soon, not just because it's technologically very hard but because it's not commercially necessary. Level 5 only exists to tell you "a real robocar has an ODD which is a commercially viable set of streets but not the complete set of all streets."
 
That's not the way I interpret it since Tesla does not publish any estimate of the unsupervised crash rate of Autopilot. It is confusing because "self-driving" has become a meaningless phrase, used to describe both supervised and unsupervised systems. Also Elon has said 2x-3x in many contexts since then.
He said with HW3 2x human level is possible. I think you are getting confused with that. 10x is what they want to achieve, and they will definitely not try to put a 2x car in a robotaxi. A 10x car they might.

In any case, going back to the original post, what I said stands.

No more than 10x below needed disengage rates?

So, given that Tesla wants 10x better than human driving, human level is the minimum bar. I.e. some 1,000x better than now.

More likely, I can see Tesla actually having AV testers in dedicated test robotaxis after surpassing human level … maybe 2x better than humans, like Waymo does now.

Obviously, they are not going to register tens of thousands of customers as AV testers with CA, which seems to be your pipe dream.
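
The "some 1,000x better than now" figure is just a ratio of rates; a sketch with placeholder numbers (not figures from this thread) looks like this:

```python
# Illustrative ratio behind "human level is ~1,000x better than now".
# Both mileage figures below are placeholders, not measured data.

miles_per_critical_disengagement_now = 100   # assumed current FSD Beta rate
miles_per_average_human_crash = 100_000      # assumed average-human crash rate

to_reach_human_level = miles_per_average_human_crash / miles_per_critical_disengagement_now
to_reach_10x_human = 10 * to_reach_human_level

print(to_reach_human_level)   # 1000.0  -> "~1,000x better than now"
print(to_reach_10x_human)     # 10000.0 -> the stated 10x-safer-than-human target
```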
 
So if the OEDR is missing just one thing, such as the ability to recognize and respond to ice cream trucks, then it is L2?

By definition, if the OEDR is incomplete within the ODD, it can't be higher than L2 (it could be lower, of course: dumb cruise also has an incomplete OEDR and is L1).

It doesn't matter what, or by how much, the OEDR is missing, because if it's missing anything the system cannot safely drive without a human acting as (at least) the OEDR.


It's either true FSD (SAE Level 5) or manual control. In between is asking too much of the human behind the steering wheel.

L4 doesn't require a driver behind the wheel though, and asks literally nothing of anybody who happens to be sitting there anyway.
 
By definition, if the OEDR is incomplete within the ODD, it can't be higher than L2 (it could be lower, of course: dumb cruise also has an incomplete OEDR and is L1).

It doesn't matter what, or by how much, the OEDR is missing, because if it's missing anything the system cannot safely drive without a human acting as (at least) the OEDR.
So how can your statement be true?
I am trying to change the conversation because I’m now bored by the endless bickering about whether or not FSD Beta is an AV prototype. It’s also only relevant from a legal standpoint.
Naah, it's pretty relevant from a "how soon robotaxis" or "how soon >L2 on a Tesla" standpoint too.
Tesla could just leave out an infinitesimally small part of the complete OEDR and stay at L2 rather than L5, even with a safety level 10x greater than humans.

He said with HW3 2x human level is possible.
At the exact same time he said HW3 would allow robotaxis... It is confusing!
 
So how can your statement be true?

Because if the system is not autonomous it can't be a robotaxi?

You get "confused" over the weirdest stuff.

"Autonomous driving system" has an actual definition (and requires a >L2 system).

I guess we can add that to the list of real definitions you choose to ignore in favor of ones you make up.



Tesla could just leave out an infinitesimally small part of the complete OEDR and stay at L2 rather than L5, even with a safety level 10x greater than humans.

... what?

Why would anyone want to do that?

Further, there are a bunch of differences between L2 and L5, not just limited vs. complete OEDR. Limited OEDR is just one of the reasons Tesla has stated for FSD Beta being L2 and why it will never be more than that.
 
... what?

Why would anyone want to do that?
To avoid AV testing regulations.
It seems like when Tesla decides to report that they are testing AVs, that has nothing to do with "how soon robotaxis". Remember that they did actually report AV testing before they discovered this one weird trick to avoid it.
The only thing that’s important in all this is that the potential safety issues of “end to end L2” and AV testing are exactly the same.
 
To avoid AV testing regulations.
It seems like when Tesla decides to report that they are testing AVs, that has nothing to do with "how soon robotaxis". Remember that they did actually report AV testing before they discovered this one weird trick to avoid it.
The only thing that’s important in all this is that the potential safety issues of “end to end L2” and AV testing are exactly the same.
I don't think Tesla ever reported AV miles aside from the Paint It Black scam video.
 
I don't think Tesla ever reported AV miles aside from the Paint It Black scam video.
I don't know the details of that, but they also reported it for the Autonomy Day video. I guess it's only AV testing if it's done by company employees and recorded. They've also claimed that they do testing in jurisdictions with no reporting requirements.
 