I think the timeline was an intentional lie, and that he's way too smart to be that foolish.
Exactly the problem I've been talking about.

People can't believe someone as smart as Elon - who can land rockets back on the ground - can be so wrong.

But if you see the history - he is consistently wrong about timelines. He seems to assume everything will go 100% right and then give the timeline - without adding any buffer for all the things that can go wrong. Invariably things go wrong - especially when doing stuff for the first time or when trying things that have never been done before.

In fact, if you go back and look, you will see almost every auto major claiming FSD in "5 years" - and none have it yet.
 
Or maybe lane change requires perfect conditions, which you have and we don't. It's also useless when there's traffic since it requires much more space between cars than typical drivers leave. In any case, it's an insignificant feature since it's so easy to change lanes myself. EAP works for what I want it for, but is not fully what was promised. In fact, if my car were totaled (FSM forbid!) I would replace it with one that just had AP, since the features I'd lose are ones that don't matter to me.

Whereas I'd immediately replace it with either EAP (if I could find it) or FSD, because I use NOA all the time and it works great.

Yours is the first report I've ever even read of it not "recognizing" there are other lanes on a highway to change into - so it seems to be a you/your car issue primarily, not the system.


"NOA ends in X feet" does not qualify as Level 3 and you know it

Do I?

I mean, it's not L3 legally because you're technically not told it's ok to read a book while it's on the highway.

But you specifically were talking about "alerting the user it's exiting its ODD"

How is a message that is telling you "ODD EXIT IN 600 feet" not exactly that?


L3 must be able to notify the driver of any and all conditions that will require the driver to take over. The car knows when NOA will end because the road map tells it when you're nearing the end of suitable highway.

What other conditions REQUIRE the driver to take over from NoA when it's turned on inside its ODD (highways), other than exiting from the type of road it can handle?

I've driven essentially the entire range of the vehicle 100% on NoA, with no interventions, until needing to "take over" when it told me NOA was ending because it was exiting the highway to visit a supercharger.



True L3 allows the driver to stop paying attention, and alerts the driver in advance any time the driver will need to take over. It's not L3 just because one kind of situation is recognized. It must be able to recognize every kind of situation.
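
(For illustration only: a minimal Python sketch of the distinction being argued here - a map-based "ODD ends in X feet" prompt is a single, pre-computable condition, while L3 needs to detect every condition that demands a takeover. The condition names, the 600-foot threshold, and the function names are all hypothetical; nothing here is from Tesla.)

ALERT_DISTANCE_FT = 600  # illustrative advance-warning distance for a planned ODD exit

def map_based_exit_alert(dist_to_highway_end_ft):
    # Easy case: the map says where the suitable highway ends, so the takeover
    # prompt can be scheduled well in advance.
    if dist_to_highway_end_ft <= ALERT_DISTANCE_FT:
        return f"NOA ends in {int(dist_to_highway_end_ft)} feet"
    return None

# Hypothetical unplanned conditions an L3 system would also have to detect
# and either handle itself or turn into a timely takeover request.
UNPLANNED_CONDITIONS = {
    "vehicle parked partly in lane",
    "debris ahead",
    "heavy rain / low visibility",
    "camera blocked or degraded",
}

def takeover_reasons(detected_conditions, dist_to_highway_end_ft):
    # Full L3 picture: the planned (map-based) exit plus any detected unplanned conditions.
    reasons = sorted(UNPLANNED_CONDITIONS & set(detected_conditions))
    exit_alert = map_based_exit_alert(dist_to_highway_end_ft)
    if exit_alert:
        reasons.append(exit_alert)
    return reasons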


Again, what OTHER situations are there where NoA needs the driver to take over, but does not alert the driver?
 
This is how I see the whole setup

Autopilot and FSD are software suites, technically I think FSD is a subset of Autopilot

Autosteer on City Streets, which is what's being tested right now, is merely one component of this software suite

Further components will be added to Autopilot/FSD and those components will go through the same iterative testing and validation that we're currently seeing with Autosteer on City Streets -- this is essentially what was told to the California DMV

Autosteer on City Streets, even when this testing etc is done and it's freely released to the full fleet of subscribers, will still be a Level 2 driver assist system requiring hands on the wheel and eyeballs pointed through the windshield

The end goal is to have all these different functions -- some that I don't think even exist yet -- firing together to achieve SAE Level 3-5 autonomy
 
What other conditions REQUIRE the driver to take over from NoA when it's turned on inside its ODD (highways), other than exiting from the type of road it can handle?

Absolutely anything else that it cannot handle. I'm not going to continue repeating myself here. You are clearly the only person here who thinks that EAP is level 3.

Goodbye and all the best to you.
 
Absolutely anything else that it cannot handle. I'm not going to continue repeating myself here.

Awesome!

Maybe instead of repeating yourself you can actually answer the question.

What situations, while staying on the highway, can NoA "not handle" to the point that, if it were really L3, it would prompt the human to take over?


You insisted these exist but can't give examples despite being asked multiple times.



You are clearly the only person here who thinks that EAP is level 3.

Nice strawman!

I never said it was L3. In fact my most recent post contains the phrase " it's not L3 legally"


Maybe you have trouble answering questions because you don't really read what people are actually writing?



My original statement was in my highway usage it's "been basically an L3 system in all sense but legal liability" for the 90% of my daily driving that is highways.

Legally it's L2, because I am responsible for "paying attention and being ready to intervene"

But as I mention, I don't ever have to intervene when it's in its ODD (the highway). And it tells me when it leaves the highway that NoA is ending, and thus I will need to actually start doing stuff actively again (well, it did before FSDBeta anyway).

Thus you get to basically an L3 system within its ODD.... just not legally because they don't tell you it's ok not to actively pay attention.



Heck- I even mentioned the one actual functional reason you DO still have to pay attention today- an inability to reliably detect and react to parked vehicles partly in a lane (like on a shoulder, but not far enough).

They fix that- and there's no reason at all (other than not wanting legal liability) they couldn't say it's L3 that day.


YOU were the one insisting there's "other" things it can't handle within that ODD. Yet can't cite any.
 
The beauty of Tesla's system is that it can do almost everything some of the time (it will even occasionally stop for cars partially in the lane!). I think one could argue that NoA is a prototype L3 system in the same way that FSD Beta is a beta of L5 FSD. It's strange that you argue that FSD Beta is missing a bunch of features needed to be L5 but that NoA has almost everything needed for L3.
Obviously they would both be orders of magnitude less safe than the average human if you were to operate them without a driver.
 
The beauty of Tesla's system is that it can do almost everything some of the time (it will even occasionally stop for cars partially in the lane!). I think one could argue that NoA is a prototype L3 system in the same way that FSD Beta is a beta of L5 FSD. It's strange that you argue that FSD Beta is missing a bunch of features needed to be L5 but that NoA has almost everything needed for L3.

It's not strange, it's true.

NoA needs basically ONE feature and it'd be functionally L3, lacking just Tesla announcing it as such- and it's a feature Tesla has specifically said they plan to add.

City Streets lacks features you'd need for L3, and Tesla explicitly said they do not plan to add them to it

Not sure how you're mixing those up.


Obviously they would both be orders of magnitude less safe than the average human if you were to operate them without a driver.

L3 still requires a driver physically in the car....so you appear to be arguing a point nobody made.


THAT said- I'd bet money NoA is safer than a human even if you treat it like an L3 system, and only take back over when it leaves its ODD.... (and the city streets would NOT be safer).
 
City Streets lacks features you'd need for L3, and Tesla explicitly said they do not plan to add them to it
What feature is FSD Beta missing for it to be L5?
L3 still requires a driver physically in the car....so again you appear to be arguing a point nobody made.
Of course. Obviously I meant without a driver monitoring the system.
THAT said- I'd bet money NoA is safer than a human even if you treat it like an L3 system, and only take back over when it leaves its ODD.... (and the city streets would NOT be safer).
I doubt that. I have had to take over for NoA many times (and I only use it on the Interstate). How many of those would have resulted in a collision? I have no idea. I sure don't think it could do anywhere close to the 1.2 million miles between serious collisions (>12mph) that non-AP equipped Tesla drivers do.
 
We all know NoA rams into cars, trucks, signs and divider rails, so it certainly is not L3 capable at the moment. Even if it becomes L3 capable in the future, Tesla can still call it L2 for legal reasons.

Some scenarios I can think of as special to a Level 3 highway-only ODD (a rough classification sketch follows this list):

Weather: rain, snow, sun, fog. Alert and ask the driver to take over, but stay in lane and reduce speed while waiting for the driver. Heavy crosswinds. Black ice/freezing rain - but how would the car identify those?

Debris and needed evasive manoeuvres must be handled autonomously.

Emergency vehicles coming from behind must be handled autonomously. But depending on rear view range, driver might be alerted. A human can spot those strobes pretty far away.

"Rettungsgasse" on European autobahn could be reason for alert, but car must stop autonomously and position itself correctly.

Identifying a recent accident that is not blocking the lane should result in a speed reduction and an alert, but the car must be able to stop and evade.

Any roadwork or lane-closed signs: sometimes they show up pretty close and the car must handle them autonomously. Other times there is a "one lane blocked 2 km away" sign and the car has time to alert the driver.

Other vehicles stopped on the side, not blocking the lane, with or without police: depending on detection range, the car must handle this autonomously.

Detour signs would probably show up a few km away and could be alerted to the driver.

Any complex highway to highway transition it could alert driver to help. (NoA does this?)
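
(To make the split described above concrete: some of these conditions leave time to hand over to the driver, while others appear too suddenly and the car must handle them itself. A hypothetical Python sketch with invented condition names - not from any Tesla or SAE document.)

# Hypothetical classification of the highway-ODD scenarios listed above.
ALERT_WITH_LEAD_TIME = {
    "heavy rain or fog",                          # slow down, stay in lane, request takeover
    "lane closure signed 2 km ahead",
    "detour signed ahead",
    "complex highway-to-highway transition",
}

MUST_HANDLE_AUTONOMOUSLY = {
    "debris in lane",                             # no time for a handover
    "emergency vehicle approaching from behind",
    "accident scene just ahead",
    "short-notice roadwork signs",
}

def respond(condition):
    if condition in MUST_HANDLE_AUTONOMOUSLY:
        return "handle autonomously (brake/evade), alert afterwards if needed"
    if condition in ALERT_WITH_LEAD_TIME:
        return "issue a takeover request with lead time; keep driving safely until handover"
    return "unknown condition: fall back to a minimal-risk maneuver"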
 
What feature is FSD Beta missing for it to be L5?

For about the 9th time in the thread- read the CA DMV emails- they go into specifics about why the system is L2, is explicitly meant to be L2 and they have no intention of adding what is missing to go above it

Stop asking me when Tesla has already provided this info.


Of course. Obviously I meant without a driver monitoring the system

Not sure why you meant that, since it has nothing to do with the SAE driving level.

L3 requires a driver physically in the vehicle to take over when a vehicle leaves its ODD.


I doubt that. I have had to take over for NoA many times (and I only use it on the Interstate).

Why?
 
For about the 9th time in the thread- read the CA DMV emails- they go into specifics about why the system is L2, is explicitly meant to be L2 and they have no intention of adding what is missing to go above it

Stop asking me when Tesla has already provided this info.
Why don't you think those same specifics apply to NoA?
Not sure why you meant that, since it has nothing to do with the SAE driving level.

L3 requires a driver physically in the vehicle to take over when a vehicle leaves its ODD.
Fine, L3 does not require the driver to perform any part of dynamic driving task while the system is operational. That's what I meant by without a driver.
Most recently it swerved towards the non-lane between the two double yellows here (it was really a swerve and not the usual dumb lane centering). Of course it probably would have recovered because this particular case wasn't a gore. Mostly I disengage for phantom braking and when it doesn't respond to stopped traffic up ahead (I'm sure it would eventually but that seems like a good way to get rear-ended). I don't actually use NoA but I do user initiated auto lane changes and I've learned to keep a firm grip on the wheel because sometimes it swerves back for no reason. I've probably only done about 10k miles on Autopilot which is a super small sample size. The only way to definitively say it would have crashed (without actually crashing) is to have a crazy close call which just isn't likely to happen in 10k miles when I'm disengaging at the earliest sign of trouble.
 
Why don't you think those same specifics apply to NoA?

Because it's a completely different code base, and a not-even-remotely-comparable ODD?

Better question would be why do you think they do apply?


Fine, L3 does not require the driver to perform any part of dynamic driving task while the system is operational. That's what I meant by without a driver.

Sure.

NoA hasn't required me to perform any task at all in the last tens of thousands of miles I've used it until it leaves its ODD (ie it prompts me "NoA ending in X feet" as it exits the highway)



Most recently it swerved towards the non-lane between the two double yellows here

What/where is that? I've never seen anything like that on a highway before...



I don't actually use NoA

But you are trying to explain to us how it works and what problems you have with... a thing you just said you don't use?
 
Except the original system design was not vision only

Why are you so tied to exactly how something is in one moment?

Sure in the beginning it was vision plus frontal radar, and now it's vision only. What difference does that really make when Vision is the bulk of it? We know Elon was never going to go with Lidar, and we know it was highly unlikely Elon was going to go with Radar on the sides/rear.

Furthermore you know Elon isn't a Sensor fusion proponent like myself.

So it really doesn't make one lick of difference.

Just like what Tesla tells the DMV about the current iteration of FSD Beta today doesn't mean Tesla doesn't have the full ambition of having FSD as an L4 system eventually. That's the end goal.

As to EAP, it doesn't really matter that in my experience it sucks, and in your experience it's great. What matters is how useful it is for the majority of people, and it doesn't seem to work all that well.

Even Elon makes jokes about how Smart Summon sucks.
 
Exactly the problem I've been talking about.

People can't believe someone as smart as Elon - who can land rockets back on the ground - can be so wrong.

But if you see the history - he is consistently wrong about timelines. He seems to assume everything will go 100% right and then give the timeline - without adding any buffer for all the things that can go wrong. Invariably things go wrong - especially when doing stuff for the first time or when trying things that have never been done before.

In fact, if you go back and look, you will see almost every auto major claiming FSD in "5 years" - and none have it yet.

I don't think this is a case of getting the timeline wrong.

This is a case where Elon didn't tell consumers what the game plan was.

With the rockets he used his own money to iterate the design over and over until it worked. It was a completely different kind of problem though where they had control over their own destiny.

With Self driving cars no one has control over their own destiny because the public has a say in what's allowed on the roadway.

With FSD he sold something before he could deliver on it, and he made promises he couldn't keep.

He's been using the "I'm always wrong with timelines" to buy himself time ever since.

For Elon fans under the influence of the Elon reality distortion field this is fine. They'll fall for the "he landed rockets" while ignoring the fact that the problem was much different, and it was an entirely different engineering challenge with different engineers that figured that out.

But, for most other people Elon has simply lost credibility regardless of whether it was intentional or unintentional.
 
NoA needs basically ONE feature and it'd be functionally L3, lacking just Tesla announcing it as such- and it's a feature Tesla has specifically said they plan to add.

What feature is that?
What L3 ODD? Traffic Assist L3 or up to 80mph freeway L3?

I don't believe the current version qualifies because
  • Phantom braking -> Tesla Vision version is really bad
  • Lack of Smoothness during low speed traffic -> Can't see the point of releasing something too uncomfortable to actually use.
  • Too many lane change cancelations triggered by phantom vehicles.
  • Out of date Navigation giving incorrect information to the NoA system
  • Doesn't respond to blinkers from cars ahead changing lanes. I often have to cancel to let them in
  • Bad logic in determining appropriate times to change lanes before an exit. It will often put the blinkers on when there is car right next to me when it could have easily gotten over when there wasn't anyone there
  • Doesn't seem to have good logic on when the rear camera is dirty. Heck it doesn't even seem to use it.
  • No debris recognition/avoidance.
 
Whew-- lots to cover here- consolidating replies to all 3 of your posts.



Why are you so tied to exactly how something is in one moment?

Because it's the moment people keep inventing fictional FRAUD AND SCAM narratives for.

So pointing out what happened, when, is directly relevant.

It's not surprising you wouldn't appreciate this since you're on the other side of the discussion.


Sure in the beginning it was vision plus frontal radar, and now it's vision only. What difference does that really make when Vision is the bulk of it?

Because vision was not the bulk of it back in late 2016

Radar was.

Tesla from late 2016 said:
After careful consideration, we now believe it (radar) can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar.

Then, later, they realized no- it can't.

So they decided vision-primary with radar backup would be "the" solution.

Then, later, they realized- no, it can't.

So now they're trying vision only.


The system has fundamentally changed multiple times at a very basic design philosophy level.

So the narrative they "knew" they'd be where they are today, still at L2, is provably false.


THAT is why points in time matter.

Furthermore you know Elon isn't a Sensor fusion proponent like myself.

Again this is outright false.

Elon is a "best solution for the problem" proponent.

For most of the entire history of AP that included sensor fusion.

For SpaceX it still does

But for this specific situation they eventually found sensor fusion too limiting, and have moved to something else.

Fairly recently in fact.

So trying to pretend it was "always" the plan from the beginning just isn't gonna work in the face of the actual facts.


So it really doesn't make one lick of difference.

See above- it makes all the difference.

Or did you not notice the word "timeline" in the title of the thread?

So yes, the order, in time, which thing happened makes a great difference.


Just like what Tesla tells the DMV about the current iteration of FSD Beta today doesn't mean Tesla doesn't have the full ambition of having FSD as an L4 system eventually. That's the end goal.


...what?

"FSD" the end product isn't what is discussed in the CA DMV docs.

City Streets, which some call FSDBeta, is. And is explicitly only, ever, intended as L2


That's not the same thing


As to EAP, it doesn't really matter that in my experience it sucks, and in your experience it's great. What matters is how useful it is for the majority of people, and it doesn't seem to work all that well.


Again, your basic claims are inaccurate.

EAP is far more than Smart Summon, and I'd suggest you'd be hard pressed to find any threads on here in the last couple of years where "the majority" of people don't think it works well.




I don't think this is a case of getting the timeline wrong.

This is a case where Elon didn't tell consumers what the game plan was.

Case in point.

If the plan changed, fundamentally, multiple times, you can't ALSO claim he "knew" this is where we'd be now.



With the rockets he used his own money to iterate the design over and over until it worked. It was a completely different kind of problem though where they had control over their own destiny.

With Self driving cars no one has control over their own destiny because the public has a say in what's allowed on the roadway.


... what?

Rockets are more regulated than L2 driving systems today.

In many states even L5 driving systems are less regulated than rockets





What feature is that?

The one it's missing?

Recognizing a vehicle parked partly in its lane/not fully pulled over to the shoulder.


What L3 ODD? Traffic Assist L3 or up to 80mph freeway L3?

I guess we can add "owners manual" to the list of basic reading on the topic you haven't done :)

The ODD for NoA is limited access divided highways (ie using on/off ramps, no intersections or cross traffic, no oncoming traffic).


I don't believe the current version qualifies because
• Phantom braking -> Tesla Vision version is really bad

On city streets code it's annoying (though it's usually only a 1-3 mph slowdown in recent versions)

I experience virtually none on highways using the legacy wide-release NoA code though... pretty much the only time I get hard braking, there's nothing phantom about it (for example, if I'm in the right lane approaching an on-ramp merge lane and someone is getting over in front of me, the car might brake hard there - but that's hardly "phantom" when the cause is right in front of your eyes).

• Lack of Smoothness during low speed traffic -> Can't see the point of releasing something too uncomfortable to actually use.

Me neither.... good thing they didn't. Again no issues.


• Too many lane change cancelations triggered by phantom vehicles.

You can adjust how aggressively it chooses to make these.... though even then the only lane aborts I can recall experiencing in any recent version are when a car is approaching at speed from the rear after cresting a road rise or something.


• Out of date Navigation giving incorrect information to the NoA system

This isn't a system limitation at all- it's a mapping problem.


• Doesn't respond to blinkers from cars ahead changing lanes. I often have to cancel to let them in


The braking incident I mentioned involved the other driver's blinker being on to merge in front of me.

FSDBeta visualizations actually SHOW the turn signals being seen/understood.


• Bad logic in determining appropriate times to change lanes before an exit. It will often put the blinkers on when there is car right next to me when it could have easily gotten over when there wasn't anyone there

FINALLY a thing that has actually happened to me :)

That said- this is pretty easy to train better in the long run- and not an "isn't capable of the task" thing, it's a "could do the thing better" thing.


• Doesn't seem to have good logic on when the rear camera is dirty. Heck it doesn't even seem to use it.

It does use it of course. I'm not sure what "logic" you're looking for it to have regarding any dirt on it?

Then again, the primary rear-relevant sensors would be the side ones, since they see approaching cars in the lanes you'd be changing into.


• No debris recognition/avoidance.


That's in the beta of course, but you can expect that to come to highways with the single stack in V11- so it's a largely solved problem.
 
What legal reason do you imagine supports your claim you can't discuss the fact you filed a suit?

My comments here were an explanation why you do not see lawsuits discussed here:
a) either there are no disgruntled Tesla customers,
b) none of the disgruntled customers have taken legal action,
c) legal actions taken are either ongoing (see below), or
d) legal actions have been settled outside court where part of the settlement agreement is keeping dispute confidential.

Regarding ongoing legal action: anyone who has ever hired competent legal representation has surely gotten the general advice not to discuss specifics related to ongoing legal action publicly. The reason is simple: your public statements can be used against you in court if the dispute ends up there.

My guess is that
a) There are a lot of disgruntled customers (this thread has a good sample of them)
b) Some of those have initiated legal action (LinkedIn lists a 400+ person in-house legal team at Tesla, many surely working on customer disputes)
c) With the fast growth Tesla has, I would expect a large share of the initiated disputes to be ongoing
d) Any publicly tried customer dispute is risky for Tesla: (i) the press is extremely interested in the company, and quality/delivery/.. problems have the potential to blow up to disproportionate size in headlines, (ii) any lost lawsuits may serve as encouragement for other customers with similar issues to initiate their own lawsuits. Thus, I would expect Tesla to be eager to settle and pay for silence.
 
L3 has nothing to do with legislation. It's an SAE guideline, and the guideline specifies what constitutes Level 3, and NOA fails the checklist.
It has nothing to do with laws or legalities.


I see you (as usual) only read half the post you're replying to.

Here's the relevant half you skipped:

" because you're technically not told it's ok to read a book while it's on the highway."


If Tesla did tell you it was ok to read a book while it's on the highway that would be (at least) level 3

Because only at L3 (or higher) is it ok for the driver to not be actively paying attention.


Legality is very relevant for liability BTW. Currently, since it's an L2 system, the driver remains legally liable for anything the vehicle does.

In L3 and higher systems there'd be some expectation of legal liability attaching to the maker of the system that tells you it's ok to not pay attention.

Some states also codify driving levels and the rules around them often using SAE language, but that's getting pretty far afield of what I actually said and why I said it.
 
My comments here were an explanation why you do not see lawsuits discussed here:
a) either there are no disgruntled Tesla customers,
b) none of the disgruntled customers have taken legal action,
c) legal actions taken are either ongoing (see below), or
d) legal actions have been settled outside court where part of the settlement agreement is keeping dispute confidential.

None of that prevents someone from telling us there IS (or was) a lawsuit.

NDAs in settlements often require you don't disclose the terms of the settlement.

They don't prevent you from disclosing there WAS one.

Likewise it being "ongoing" doesn't prevent you from mentioning the suit exists- suggesting otherwise is nonsense. You'll often see people saying they can't comment on the DETAILS of an ongoing suit. They have no reason to pretend there IS no suit though.

Indeed, the existence of legal filings is generally public record- even when details of a case are sealed. Likewise, for a public company, someone knowing there IS a suit can be a form of pressure for them to settle it- as even you admit. So keeping it a secret doesn't serve the customer at all.


Yet nobody can find any examples of these suits they (you in this case) THINK are widespread regarding FSD.

There's like the 2 brothers (still ongoing) and... not much else.


So seems like it's either A or B.
 