You have no idea what you are talking about.

I mean, you're the dude who keeps insisting the SAE standard isn't law anywhere even after being shown the literal text of a law that does exactly that.


Every AV law around the world (Japan, Germany, UK, EU, US) points you to SAE standards.

This is outright false, of course, like so many of your claims.

Many do NOT do that.

Some others do.


For example, Florida defines self-driving vehicles without a single mention or citation of the SAE standard.


Nevada, meanwhile, does use the standard as the law in the state (already quoted to you).





Do you ever post things that take more than 30 seconds and a Google search to debunk?
 
Seems suspicious that Consumer Reports would jump to conclusions or make misleading statements like: "David and Sheila Brown, who had been married for 52 years, are killed in Saratoga, Calif., after their Tesla veers off a highway. Court documents show that Autopilot was active at the time of the crash."

Vs U.S. NTSB closes probe into fatal Tesla 2020 California crash
"The NTSB said the driver was operating a 2019 Tesla Model 3 with the driver assistance system Autopilot engaged but was manually pressing the accelerator pedal causing the vehicle to go into override mode when it struck the rear of a minivan."
 
I have yet to see an example of a single accident, ever, caused by phantom braking.

I see a lot of people who THINK it's REALLY BAD and will cause one.

But the one time someone here actually measured it with an accelerometer, it was about 0.2 g... or roughly the braking you get just from regen.

Anyone who hit you from that (and again, there appear to be zero examples of any actual accidents) would be at fault for illegal tailgating.


Unexpected braking can FEEL far worse than it is, because it's unexpected. That appears to mostly be what's going on here. Uncomfortable, not actually dangerous.
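For a rough sense of scale, here's a quick back-of-the-envelope calculation. Only the ~0.2 g figure comes from the accelerometer post mentioned above; the 1-second duration is my own assumption.

[CODE]
# Rough sketch: how much speed a car sheds during a brief ~0.2 g braking event.
# Only the 0.2 g figure comes from the post above; the duration is assumed.

G = 9.81              # m/s^2
DECEL_G = 0.2         # the accelerometer reading quoted above
DURATION_S = 1.0      # assumed length of the braking event
MPH_PER_MS = 2.23694  # m/s -> mph

decel_ms2 = DECEL_G * G                             # ~1.96 m/s^2
speed_drop_mph = decel_ms2 * DURATION_S * MPH_PER_MS

print(f"Deceleration: {decel_ms2:.2f} m/s^2")
print(f"Speed shed in {DURATION_S:.0f} s: ~{speed_drop_mph:.1f} mph")
# ~4.4 mph lost over one second, comparable to lifting off the accelerator
# and letting regen slow the car, which matches the point above.
[/CODE]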

But by all means if you've got evidence of a bunch of Tesla phantom braking accidents, let's see it.

I said nothing about phantom braking. Not sure why you think the above is a response to what I wrote. What I wrote was that if there are even just one or two times a year when any TMC member has had to disengage autopilot, then statistically autopilot is not operating safer than a human driver and is therefore not at a level suitable for Level 3 certification.

And further, that Tesla calls its systems Level 2 because they are not yet ready to be considered Level 3. Tesla is concerned with safety. It makes the safest cars ever tested. And none of its driver-assist systems is safe enough for Level 3. As a Tesla fanboy, I am very glad that, for all its hot air about FSD timelines, Tesla knows, acknowledges, and publishes that its systems are Level 2.

I do not trust Tesla to make good on promises to deliver as-yet nonexistent features. I do trust Tesla to make and deliver safe cars, and to not authorize any of its systems as Level 3 until they are actually safer than human drivers. (Which is why I don't expect L3 from Tesla any time soon.)
 
I said nothing about phantom braking. Not sure why you think the above is a response to what I wrote.

Because the entire "majority of users" discussion came from what S4WRXTTCS wrote, specifically mentioning phantom braking.... even citing to a link where 2 guys complained about only that issue as "evidence" of said majority.

That's the discussion your post jumped into for context.

But replace that with any other NoA "safety" issue and it doesn't change much: nobody seems able to cite system-at-fault accidents when this topic comes up.



What I wrote was that if there are even just one or two times a year when any TMC member has had to disengage autopilot, then statistically autopilot is not operating safer than a human driver and is therefore not at a level suitable for Level 3 certification.

This is pretty obviously nonsense, statistically.

First you'd need to prove there would've BEEN an accident in any of those cases.

Otherwise it's just a nervous human making a mistake (i.e., the same or a potentially worse result than leaving the system running, because they didn't trust it).

Also, nothing in the SAE spec requires a system to have any specific accident rate in order to qualify as L3, so I'm not sure how it's even relevant to that.
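To make the "you'd need to prove there would've BEEN an accident" point concrete, here's a toy illustration. Every number in it is made up: the annual miles, the human benchmark, and the crash fractions are assumptions, not data.

[CODE]
# Toy illustration only: a disengagement count by itself doesn't establish a
# crash rate. Every number here is assumed for the sake of the example.

ANNUAL_MILES_ON_AP = 10_000      # assumed miles of Autopilot use per member per year
DISENGAGEMENTS_PER_YEAR = 2      # the "one or two times a year" from the post
HUMAN_MILES_PER_CRASH = 500_000  # assumed human benchmark, order of magnitude only

# The missing variable: what fraction of those disengagements would actually
# have ended in a crash had the driver not intervened? Nobody has measured it.
for crash_fraction in (1.0, 0.1, 0.01, 0.001):
    crashes = DISENGAGEMENTS_PER_YEAR * crash_fraction
    miles_per_crash = ANNUAL_MILES_ON_AP / crashes
    verdict = "worse" if miles_per_crash < HUMAN_MILES_PER_CRASH else "at or better"
    print(f"fraction={crash_fraction:<5} -> {miles_per_crash:>12,.0f} miles/crash ({verdict} than the assumed human rate)")
[/CODE]

The answer swings by three orders of magnitude depending on that unmeasured fraction, which is the whole problem with treating disengagements as crash statistics.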


I don't disagree at all with the rest of your remarks regarding Tesla's deep commitment to safety, though there's also an obvious liability risk once you certify your car as safe for L3 or higher. Thus I'd expect anyone (and maybe even more so Tesla) to require a standard much higher than "safer than a human" before they take on that risk of liability.

This doesn't really change the fact that I think NoA is quite close (with just the improvement I mention) to being capable of L3 performance at least as good as a human's, because it seems we both agree Tesla is going to wait for a much higher standard than that before saying it's OK.
 
Seems suspicious that Consumer Reports would jump to conclusions or make misleading statements like: "David and Sheila Brown, who had been married for 52 years, are killed in Saratoga, Calif., after their Tesla veers off a highway. Court documents show that Autopilot was active at the time of the crash."

Vs U.S. NTSB closes probe into fatal Tesla 2020 California crash
"The NTSB said the driver was operating a 2019 Tesla Model 3 with the driver assistance system Autopilot engaged but was manually pressing the accelerator pedal causing the vehicle to go into override mode when it struck the rear of a minivan."



The guy who posted that is a well-known FUDster.

The NTSB has been VERY critical of Tesla; if even THEY didn't find any fault or issue with them, you know it was pretty obviously not a car issue.
 
Again: if the issue is "NoA works great on highways but sometimes the map makes it think it's not on a highway," then NoA isn't the problem... the map is.

NoA is reliant on underlying components:

NoA isn't TACC, but it relies on it.
NoA isn't AP, but it relies on it.
NoA isn't Maps, but it relies on it.
NoA isn't Auto Lane Change, but it relies on it.

Essentially, NoA isn't really anything on its own; it's all the underlying parts put together.

All those components have to work extremely well.
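Loosely, you can picture that layering like this (a purely conceptual sketch, not Tesla's actual architecture or code; every class and method name here is invented for illustration):

[CODE]
# Conceptual sketch only: NoA as a thin coordinator over the features it
# depends on. Not Tesla's code; all names are invented for illustration.

class TACC:              # traffic-aware cruise control: speed and following distance
    def hold_speed(self): ...

class Autopilot:         # lane keeping (Autosteer)
    def keep_lane(self): ...

class Maps:              # route and road-class data ("is this a highway?")
    def on_highway(self) -> bool: ...

class AutoLaneChange:    # executes a single lane change when asked
    def change_lane(self, direction: str): ...

class NavigateOnAutopilot:
    """Adds routing decisions, but delegates all of the actual driving."""
    def __init__(self, tacc, ap, maps, alc):
        self.tacc, self.ap, self.maps, self.alc = tacc, ap, maps, alc

    def step(self):
        # If any dependency is wrong (e.g. the map says "not a highway"),
        # NoA degrades even though NoA's own logic did nothing wrong.
        if not self.maps.on_highway():
            return "fall back to plain Autopilot"
        self.tacc.hold_speed()
        self.ap.keep_lane()
        # ...then decide whether a lane change serves the route and delegate,
        # e.g. self.alc.change_lane("left")
[/CODE]

Which is the point: a failure anywhere in that chain shows up to the driver as "NoA screwed up," regardless of which component actually did.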

As to "the majority," that was a flawed choice of words for both of us. We're probably just interpreting what we want from various sources. I posted that article and you immediately said it was FUD, while for me it confirmed suspicions from what I've been seeing lately in various threads.

Ultimately it doesn't matter what the majority thinks; what matters is whether it can do the job. If there is a significant number of owners for whom it's screwing up, then obviously it can't do the job.
 
Seems suspicious that Consumer Reports would jump to conclusions or make misleading statements like: "David and Sheila Brown, who had been married for 52 years, are killed in Saratoga, Calif., after their Tesla veers off a highway. Court documents show that Autopilot was active at the time of the crash."

Vs U.S. NTSB closes probe into fatal Tesla 2020 California crash
"The NTSB said the driver was operating a 2019 Tesla Model 3 with the driver assistance system Autopilot engaged but was manually pressing the accelerator pedal causing the vehicle to go into override mode when it struck the rear of a minivan."

The CR article was posted on the 11th, and the probe results were just released today, so I don't see anything suspicious about it.
 
Because the entire "majority of users" discussion came from what S4WRXTTCS wrote, specifically mentioning phantom braking.... even citing to a link where 2 guys complained about only that issue as "evidence" of said majority.

And you cited threads on TMC from within the last few years as your evidence that the majority of people are happy with NoA.

As to the "2 guys":

People are free to make their own judgement.
 
Because the entire "majority of users" discussion came from what S4WRXTTCS wrote, specifically mentioning phantom braking.... even citing to a link where 2 guys complained about only that issue as "evidence" of said majority.

Here is another source, and this time from a TMC poll. In this poll a grand total of one vote said they never had phantom braking.

If we also allow "does not feel aggressive" we get another 26 votes.

So that's 27 out of 72.

Hmmm.

Not exactly a majority of people saying phantom braking isn't a big deal. And to make it extra funny, I was one of the votes that said "does not feel aggressive," which was true at the time. So heck, I voted for you, and you still lost.

 
Here is another source, and this time from a TMC poll. In this poll a grand total of one vote said they never had phantom braking.



What's interesting there is that if you look only at results from radar cars, it's pretty close to even... while for vision-only cars it's OVERWHELMINGLY bad.

I've mentioned before some concern that getting vision-only to parity with radar+vision is going slowly, and this seems to reinforce that concern.


FWIW, as someone with a radar car who is now theoretically on vision-only as part of FSD Beta, I still haven't noticed any really problematic "phantom" braking at all on NoA/highway driving.


There's certainly lots of braking on city streets, though it's way better in recent versions than when I first got it. Now it's more like reasonably frequent 1-3 mph slowdowns, pretty reliably when you crest a hill, come around a corner, or anything else where visibility ahead isn't ideal, whereas before it was a much harder slowdown.
 
Because it's a completely different code base, and a not-even-remotely-comparable ODD?

A better question would be: why do you think they do apply?
I wonder what will happen when they go to single stack. Will NoA continue to be basically an L3 system, or will it be obviously L2 like FSD Beta?
I always figured FSD Beta was a superset of NoA.

NoA hasn't required me to perform any task at all over the last tens of thousands of miles I've used it, until it leaves its ODD (i.e., it prompts me "NoA ending in X feet" as it exits the highway).
I'm impressed. You are extraordinarily lucky. It's hard for me to imagine using Autopilot for that distance without having to disengage. Though it seems like an L3 system would have to go many multiples of that to achieve human level safety.
What/where is that? I've never seen anything like that on a highway before...
https://www.google.com/maps/place/33°39'49.1%22N+117°47'50.1%22W/@33.66364,-117.7977996,241m/data=!3m2!1e3!4b1!4m14!1m7!3m6!1s0x80dcde7313cf89a3:0xa594fd03d9382684!2sUniversity+Dr,+California!3b1!8m2!3d33.6574767!4d-117.8304621!3m5!1s0x0:0x3c4e8f92d8de506b!7e2!8m2!3d33.66364!4d-117.7972449
But you are trying to explain to us how it works and what problems you have with... a thing you just said you don't use?
I guess I assumed that when staying in a lane Autosteer and TACC work the same way as they do when using NoA.
 
I wonder what will happen when they go to single stack. Will NoA continue to be basically an L3 system, or will it be obviously L2 like FSD Beta?
I always figured FSD Beta was a superset of NoA.

I think that depends on where you think the shortcomings of city streets are and how much an issue they'd be in a different situation...

From my use so far it's been 2 main areas:

1) Perception issues seem to mainly stem from:
a) The location of the forward-facing side cams in the B-pillars limits visibility at some occluded intersections (and is why I think they eventually need more forward-mounted side cams to ever get city streets above L2)
b) Lack of persistent memory of objects, which is allegedly an issue they already have some code to address; we'll see.

2) Planning/policy, especially for turns. This could be related to 1a: it can't always see well enough where the future path of the turn is, so it's hesitant or just gives up partway through. This is one of those things (for city streets) that NN training SHOULD get much better at over time, but AFAIK a lot of this is only now starting to transition to that from traditional code.


So how does that relate to single stack?

1a is basically a non-issue for the NoA ODD on highways. You don't need to see around blind corners and check for cross traffic to make decisions: highways don't typically HAVE blind corners, and they don't [B]ever[/B] have cross traffic.

1b would probably help a LITTLE bit with NoA's lane-change consistency, but it'd be anywhere from a neutral to a tiny improvement.

2) Again, you don't really make turns like this on highways... I guess it might improve tight-turn ramp/interchange behavior a little?


But overall, the major issues city streets have that make it far less reliable than NoA are issues that simply don't happen in NoA's ODD... so moving to single stack should be overall a non-issue, or maybe a tiny improvement to what is (for me anyway) already a REALLY good system in real-life use.




I'm impressed. You are extraordinarily lucky. It's hard for me to imagine using Autopilot for that distance without having to disengage. Though it seems like an L3 system would have to go many multiples of that to achieve human level safety.


My daily drive (when I had one pre-COVID; I still do it one day a week now) was ~75 miles, ~70 of that on interstate highways.

On-ramp to off-ramp, just like it says on the tin. No issues ever, with one exception I can actually think of... and now that you bring it up, it's one way single stack might IMPROVE NoA :)

There was an overpass bridge on the route that, for a while, was missing the line separating the far-left lane from the shoulder.

So for the period that was true, AND if you happened to be in the far-left lane for that 100 feet or whatever of the route, it'd do that thing where it thinks your lane got SUPER wide and begins to drift a little to the left for a second before "seeing" the paint past the overpass and snapping back into the lane center.

It wasn't really a big safety concern because it was only for a second or two, but city streets seems to do FAR better at staying where it should for a second or two when markings vanish... so presumably that WOULD help in that weird situation.


Anyway, overall the route is fairly easy (mostly on one interstate, then an interchange onto a second for a few miles, then the reverse on the way back) and well marked... so the system works great.

I've also taken a number of road trips, though none more than maybe 300-400 miles in a day, and usually primarily interstates there too, where again it's been pretty flawless. All have been in the US Southeast (most driving in NC specifically, but some in SC, GA, VA, and TN).


I've seen plenty of human-driven wrecks in that time, but NoA has never come remotely close to being in an accident.



https://www.google.com/maps/place/33°39'49.1%22N+117°47'50.1%22W/@33.66364,-117.7977996,241m/data=!3m2!1e3!4b1!4m14!1m7!3m6!1s0x80dcde7313cf89a3:0xa594fd03d9382684!2sUniversity+Dr,+California!3b1!8m2!3d33.6574767!4d-117.8304621!3m5!1s0x0:0x3c4e8f92d8de506b!7e2!8m2!3d33.66364!4d-117.7972449

Oh, that's weird as hell... at a glance it looks like an existing lane STAYS an existing lane, with double yellows on either side.

Obviously it MEANS for you to go left or right depending on whether you want the HOV lane or not, but that's not obvious from JUST the lines. Great example of an edge case to train an NN on, though: once it understands that specific marking it should work perfectly from then on.



But to be fair, this is a marking error in violation of federal highway marking standards:

HOV lanes are supposed to use a double WHITE, not a double YELLOW.
 
None of this would prevent actually finding out about the lawsuit, right? IIRC, all lawsuits are public: not the content, but the existence of the lawsuit (unless it's specifically sealed by the judge?).

I would expect almost all of the disputes to be settled before a suit is filed, or in arbitration. Filing a suit has the extra hassle of contesting the binding arbitration clause.
 
I think people who are angry with Tesla/Elon assume Elon was lying when he said FSD would be ready by the end of the year, etc. They don't seem to be able to accept (what I think is) the fact that Elon was simply wrong. Multiple times.

It is ok to be wrong about being able to deliver if you offer to return the purchase price paid and apologize.

It is not ok to keep the money and not deliver the purchased feature.
 
I would expect almost all of the disputes to be settled before a suit is filed, or in arbitration. Filing a suit has the extra hassle of contesting the binding arbitration clause.


Naah.

Small claims court is explicitly exempted from the arbitration requirement, and AFAIK in nearly every state you'd be allowed to sue for the full price of FSD in small claims.
 
Are arbitrations public data too?
Nope. This is why it's (IMO) an affront to the Constitutional protection of a right to trial. (And yes, I recognize that right is in the criminal law context.) The only ones who win with forced arbitration are the big fish that force it on the little fish, and who also control who actually judges the case against them.
 
Nope. This is why it's (IMO) an affront to the Constitutional protection of a right to trial. (And yes, I recognize that right is in the criminal law context.) The only ones who win with forced arbitration are the big fish that force it on the little fish, and who also control who actually judges the case against them.
Oh well, that's not the only place where the big fish win over the little fish. It happens with every piece of legislation too.
 