Welcome to Tesla Motors Club

Another tragic fatality with a semi in Florida. This time a Model 3

Status
Not open for further replies.
To be clear, FWIW, Huang was the one driver who died on AP while actually using it somewhere it was supposed to work.

So it's the one case where the system demonstrably failed to operate somewhere it was meant to.

THAT said - based on the recorded data, he still had plenty of time to intervene and either brake or steer the car to safety had he been paying attention, as the system also requires. So I'm still going to call that driver error in the end (and I expect the lawsuits will work out that way too), but I certainly don't put him in the same class of feature misuse as someone who uses the system in a place Tesla repeatedly informs you it's explicitly not intended to be used at all.
This is technically true. I have been on test drives from the Tesla store where I was encouraged to turn on EAP on surface streets. This is human nature: when I had my EAP trial I did it all the time, and I suspect you do too, Knightshade. If Tesla REALLY wanted us to use EAP only on highways without crossing traffic, they would make the feature unavailable on other roads, as Cadillac's Super Cruise system does.
So my point is that even smart guys can make poor decisions based on previous experience with the EAP system. You get comfortable, and then when a failure case arrives you are not ready to take over in time.
Even Elon has said that most EAP accidents happen to experienced users.
 
If you were correct, RADAR would not work for literally any purpose. In nearly every situation you can think of, the vast majority of surfaces will NOT be positioned to reflect a signal directly back towards the receiver. The only reason RADAR works at all is that the signal scatters to some degree when it hits; if it only reflected straight or nearly straight, it would approximately never work.

I am talking specifically about the flat side of the trailer. Every object is a closed surface, so some part of that surface is tangential to the radar. The side of the trailer is not necessarily one of those parts: the lower curve is, and the chassis is, but the flat panel we tend to see is not a great target.

Think about an ATC tower, for example. It is aiming a signal at an aircraft, and nearly the entire surface of the plane is pointing at an angle that would reflect the signal away from the receiver. Yet we know that works despite only a fraction of a percent of the area being lined up to reflect in the right direction, if that.

Planes have surfaces tangential to the radar, and there are no other objects in the sky to generate returns, so picking them up is not difficult. Commercial and many VFR-rated private aircraft also carry transponders that reply when an ATC pulse interrogates them; that is what provides the data on the ATC screen. However, that runs on a separate secondary surveillance radar system.


Similarly, when police use RADAR to clock a car, there are approximately NEVER any surfaces that aren't slanted in a direction that would deflect the signal away from them. Yet because a car is mostly metal, the skin of the car acts like a sort of isotropic radiator, reflecting a portion of the signal in every direction.
The front/rear of the car has tangential surfaces, especially the license plate (part of the reason for front plates). Speed radar also uses gain control that works to pick out returns in the expected speed band.


Per Wikipedia, the things that affect how visible an object is to RADAR include:
  • Material
  • Size (both absolute and relative)
  • Incident angle (emphasis mine)
  • Reflected angle, which depends on the incident angle (emphasis mine)
  • Polarization relative to the target's orientation
You're right that a truck presents a much weaker signal than if it were at ground level, but that doesn't mean you won't be able to see it at all with RADAR. If you can't see it at all — particularly at a large distance, where the angle of incidence is small — something is very, very wrong.

The further away you are, the less angular spread you need from the return for some of it to get back to the radar, sure. However, the return signal strength also falls off as the fourth power of range (per the radar equation), so every doubling of distance cuts the return by a factor of 16.
If the surface is smooth (relative to the wavelength) and not tangential, there will not be much, if any, return.

That is part of why corner reflectors are used to mark low return objects. Radar Basics - corner reflectors
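The 1/R^4 point above can be made concrete with the standard monostatic radar equation. A minimal sketch - every number here (transmit power, antenna gain, RCS, wavelength) is a made-up placeholder, not any real automotive sensor's parameters:

```python
import math

def radar_return(pt_w, gain, rcs_m2, wavelength_m, range_m):
    """Received power (W) from the classic monostatic radar equation:
    Pr = Pt * G^2 * lambda^2 * sigma / ((4*pi)^3 * R^4)."""
    return (pt_w * gain**2 * wavelength_m**2 * rcs_m2) / ((4 * math.pi) ** 3 * range_m**4)

# Same illustrative target at 50 m and at 100 m:
near = radar_return(pt_w=1.0, gain=100.0, rcs_m2=10.0, wavelength_m=0.004, range_m=50.0)
far = radar_return(pt_w=1.0, gain=100.0, rcs_m2=10.0, wavelength_m=0.004, range_m=100.0)
print(near / far)  # ~16: doubling the range costs a factor of 2^4
```

The ratio is what matters here: the constants cancel, and the return drops sixteenfold per doubling of range, which is why a weak scatterer (a smooth, angled trailer side) can vanish into the noise at distance.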
 
suicidal.

Suicidal is right. The system was designed and behaves as intended. It's all assistive: YOU'RE the DRIVER of any current automobile on public roads anywhere on the planet, Tesla included, and more so.

If anything happens outside of the ordinary, it's your fault, 1000%. No court case, no money, and probably no life, or the lives of others.

IF the car were meant to stop for any object, it would never move. Your eyes are the eyes of the car.

Tell that to everyone buying a Tesla today and they would probably walk away from it. Most do not understand what is actually happening.
 
I bought the Tesla knowing:
That some features were BETA and not intended to work flawlessly
That if "Anything happens outside of the ordinary", it would be my fault as the DRIVER
That my "eyes are the eyes of the car"
That I should not turn on Autopilot and then stop paying attention to grab something off the floor or watch a movie

Oh, and I DID read the manual.

Fords are unreliable.
Toyotas have runaway acceleration.
Lots of cars have bad airbags that may throw shrapnel into your neck.

We should all just live in a cave... until part of the rock overhead cracks and falls, crushing our heads.

People need to take responsibility for their actions. Nothing has changed except people's interpretation of what is going on, and that is the problem. You are getting into a two-ton missile: treat it properly, and respect the technology for what it is.
 
He also reportedly had complained about the car's behavior at that exact location.
Complained to whom? Tesla does not work for the DOT. The DOT does not respond to Tesla, or for that matter to the public at large, as far as I know.

This gentleman was like all the others: he is at fault. More so, in that he knew it was an area his assistive AP was not able to manage.

He reported/complained to his family and had experienced it multiple times. I am completely sorry, it's a tragedy, and it's exactly what these cars do: they lull you into feeling it's just OK today, for a minute, a few seconds, when it's not OK. Maybe that's how the family's attorneys are building a case. What a tragedy, like the one that started this thread.

Now, the attenuator had been damaged in a previous accident and for the most part did not exist. It may have saved him had it been intact.

SO there may be a case of sorts for the family against the CA DOT, or whoever is in charge of that roadway. It's a long, arduous, drawn-out process, and for all it may or may not bring, the life of that man is gone forever.
 
"Walter Huang's family tells Dan Noyes he took his Tesla to the dealer, complaining that -- on multiple occasions -- the Autopilot veered toward that same barrier -- the one his Model X hit on Friday when he died." I-TEAM EXCLUSIVE: Victim who died in Tesla crash had complained about Autopilot
 
My point was that the driver already knew it didn't work well there, despite it being a situation where it should. That seems the exact opposite of being lulled into feeling safe.
 
Some points to unpack here:

1. Autopilot's behavior changes from release to release. This means that under certain conditions it improves, and under others it regresses. I have one case open with the advanced Autopilot team right now to correct a regression. These changes can make it more difficult for the operator to predict the vehicle's behavior: the car looks and feels the same, but does not always operate identically.

By way of example, my neighbor, who drives a Model X, already got into a minor Autopilot-related accident because the vehicle moved in a way he did not anticipate. He reacted as quickly as he could, reducing the damage, and Tesla did not charge him for the labor to repair his Model X. He was perplexed by the Tesla's behavior and spent a month working with Tesla to identify what led up to the collision. He and I now operate Autopilot with the same level of discipline we had at the very beginning of our ownership, which is to expect the unexpected. When driving this way, it is unclear whether AP is our assistant, or whether we are its.

2. Some of us have experienced a complete shutdown of the Autopilot system while engaged. I have experienced this behavior nearly a dozen times and have documented it on video and posted about it here: AP disengaged while driving - Radar Failure (release 2019.8.3). These types of failures are examples of the system not operating "as intended," and they are Tesla's responsibility to correct.

What is surprising to me, given the number of users who've experienced this issue on this heavily trafficked route, is how long Tesla is taking to address it. We are currently at two months, and I have been in touch with Tesla engineering regularly.

It would not be accurate to say that Autopilot (or any system) works as intended 100% of the time. Tesla must get as close to that number as possible, but given the number of APE failures in this release, I don't think we're even hitting 98% uptime. I've had drives lasting over an hour where it was entirely unavailable, and when it fails, it stays down for 10-15 minutes. Tesla likely needs to get to 99.99 or 99.999 percent uptime.

If given a choice, I would prefer the system to fail gracefully over unanticipated actions.
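To put those uptime figures in perspective, here is a quick back-of-envelope sketch (assuming a 30-day month; the thresholds are the ones mentioned above):

```python
# Downtime implied by a given uptime fraction, per 30-day month.
MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes

for uptime in (0.98, 0.9999, 0.99999):
    downtime_min = (1 - uptime) * MINUTES_PER_MONTH
    print(f"{uptime:.3%} uptime -> {downtime_min:.2f} min down per month")
```

At 98% uptime that is roughly 14 hours of unavailability a month; at 99.99% it drops to about 4 minutes, and at 99.999% to under half a minute - which is why the gap between "not even 98%" and "four nines" is so large in practice.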

3. Recently, Tesla announced that Elon is getting more actively involved with the AP team, and there have already been additional staff changes made.

When Elon gets directly involved, it is an indication that he is not comfortable with the pace or quality of the work previously left to others. He has set an extraordinarily high bar - achieving FSD this year - and for that goal to be realized, the pace of development clearly must accelerate. Tesla is up against several significant constraints (putting aside external and financial ones):

- Hardware (Both with the limits of HW 2.5, as well as new challenges arising from HW 3)
- Software (Feature enhancement and stability)
- NN training (achieving performance on par with, or preferably better than, a human). I would imagine that streamlining the path from field-data incidents to NN training is a critical deliverable.
- Addressing what may be an unlimited number of corner cases

Tesla will also have to deal with technical debt and sustaining engineering for older hardware that can't get the updates Tesla has planned for its newer vehicles. Watching this development process unfold is perhaps the most interesting activity the industry has seen.
 
This is both sad, and perplexing.
There are a couple of locations around my neck of the woods where Tesla AP fails repeatedly and reliably, so I do my best to avoid driving on AP in those areas.
  1. One is a Y-intersection on my daily commute path, where AP either follows the car ahead and goes left or right, depending on where that car is going, or steers straight down the middle, towards a concrete island. Submitting numerous "bug reports" to Tesla from that intersection has achieved nothing, so I've trained myself to disengage AP near that location. Unless I forget.
    • *Suggestion to Tesla* : enable an "always disengage AP at this GPS location" feature!
  2. The second is an intersection cresting a hill, also on my daily commute path. Half the time AP tracks straight, as it should. The other 50% of the time it does something random and stupid: it phantom brakes, twitch-turns left/right/left until it picks a direction, disengages by itself, or, about 5% of the time, aims straight for a giant oak tree 45 degrees to the right and accelerates towards it. Again, repeated "bug reports" to Tesla from that intersection have gone unanswered, so the best I can do is grab hold of the steering wheel and prepare to counteract an AP meltdown.
    • *Suggestion to Tesla* : same as above.
I've also experienced two instances of AP and cruise control shutting down mid-drive on the current 2019.12.1.2 code base (see pic below).

Until AP reliability and quality improve significantly, some form of an "auto disengage here, always" geo-tag setting would be one way to minimize the probability of AP-induced accidents.

IMG_20190519_204721.jpg
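For what it's worth, the geo-tag idea above is cheap to sketch in principle. This is purely illustrative - the coordinates, the 150 m radius, and all function names are hypothetical, not anything in Tesla's software:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    earth_radius_m = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a))

# Driver-tagged trouble spots: (lat, lon, radius in meters) - made-up values.
no_ap_zones = [
    (37.3861, -122.0839, 150.0),  # e.g. the Y-intersection above
]

def should_disengage(lat, lon):
    """True when the car is inside any driver-tagged no-AP zone."""
    return any(haversine_m(lat, lon, zlat, zlon) <= radius
               for zlat, zlon, radius in no_ap_zones)
```

The car already knows its GPS position, so a check like this against a short driver-maintained list would cost essentially nothing per second of driving.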



Because this use case is someplace the feature is explicitly not intended to work with AP. Ever.

Not sure how that's still unclear to anyone.

I'm afraid that's not helpful.
If you read the AP-section warnings in the Tesla manual (page 64 and onward), their legal department has conveniently CYA-ed and disclaimed EAP functionality from working correctly in pretty much every possible situation. The following are direct quotes from the Tesla Model 3 manual:
  • Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death. (p. 68)
  • Never depend on Traffic-Aware Cruise Control to slow down the vehicle enough to prevent a collision.
  • Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model 3 to slow down unnecessarily or inappropriately.
  • Although Traffic-Aware Cruise Control is capable of detecting pedestrians and cyclists, never depend on Traffic-Aware Cruise Control to adequately slow Model 3 down for them.
  • Do not use Traffic-Aware Cruise Control on city streets or on roads where traffic conditions are constantly changing.
  • Do not use Traffic-Aware Cruise Control on winding roads with sharp curves
  • Traffic-Aware Cruise Control cannot detect all objects and, especially in situations when you are driving over 50 mph (80 km/h), may not brake/decelerate when a vehicle or object is only partially in the driving lane or when a vehicle you are following moves out of your driving path and a stationary or slow-moving vehicle or object is in front of you.
  • Never depend on Autosteer to determine an appropriate driving path. Always be prepared to take immediate action. Failure to follow these instructions could cause damage, serious injury or death.
  • Autosteer is not designed to, and will not, steer Model 3 around objects partially or completely in the driving lane.
  • Many unforeseen circumstances can impair the operation of Autosteer. Always keep this in mind and remember
    that as a result, Autosteer may not steer Model 3 appropriately.
  • Never depend on Auto Lane Change to determine an appropriate driving path.
  • Do not use Auto Lane Change on city streets or on roads where traffic conditions are constantly changing and where bicycles and pedestrians are present.
  • Navigate on Autopilot may not recognize or detect oncoming vehicles, stationary objects, and special-use lanes such as those used exclusively for bikes, carpools, emergency vehicles, etc. Remain alert at all times and be prepared to take immediate action. Failure to do so can cause damage, injury or death.

There is more where this came from, but I think you get the picture - Tesla has crafty lawyers.
To summarize: if you follow the word of the manual, you should pretty much never, ever, EVER use EAP:
  • Autosteer may not steer
  • Traffic-aware cruise control may not be traffic or obstacle aware
  • Auto Lane Change is not for changing into lanes
  • Navigate on Autopilot may navigate into things and kill you

If you die, or kill someone else, while using EAP, it's always your fault.
:D

You've all been warned.
:eek: :rolleyes:


a
 
So, just some comments about your specific scenarios (I don't care that you are using AP on side streets; it's good testing for the well-aware and prepared). About #1: yes, there are some places that are tricky, and work needs to be done. About #2, I have more of a problem with your complaint. Yes, I see the issue, but I would say a human driver maintaining the speed limit over the crest of a blind hill for the first time ever could be just as erratic: does the road curve, and which way? Is there a slow or stopped object I can't see? My only point is that I definitely understand the car behaving erratically in scenario #2, and it is an "edge case" (though not totally edgy) that needs a lot more figuring out. While I disagree with the whole HD-mapping concept, as Tesla does too, I do believe some kind of map data should be used in conjunction with the normal NN decision process, for more situational awareness.

As for your comment about the manual and lawyers disclaiming away all functionality... yep that's the way it works and Tesla isn't the only one doing it.
 
#2 I have more of a problem with your complaint...Yes I see the issue but I would say a human driver maintaining the speed limit going over the crest of a blind hill for the first time ever could be just as erratic. Does the road curve, which way, is there a slow/stopped object that I can't see?

Road goes straight over the hill.
There is usually a red light at that intersection, so I crest it at way below the speed limit, following a train of cars ahead of me.

Not really a complaint from me so much as sharing observations about two types of situations where AP tends to get predictably confused and requires manual override. There is a slight chance that a person new to Tesla ownership is perusing this thread, and this note might save them a few blood-pressure spikes.


As for your comment about the manual and lawyers disclaiming away all functionality... yep that's the way it works and Tesla isn't the only one doing it.

Indeed.
BTW, it is not at all unusual for a company doing business in the US to CYA itself with absurd legalese disclaimers (e.g., don't put your head in a plastic bag, etc.).

My comment was directed at the folks who insist that we use AP/EAP inappropriately, and if we only read the manual, we would know better.

Alas, if you read and follow the manual to the letter, you would NEVER ever use EAP.
;)

a
 
I'm afraid that's not helpful.

It is if you understand the difference between:

This feature may not be 100% reliable in this situation, so you need to pay attention in case it has an issue (e.g. a highway where the lane markings suddenly vanish)

and

This feature is explicitly not intended for use in this situation and shouldn't be expected to work properly there at all (using AP in places with cross-traffic, for example).



That's a pretty wide gap.



If you die, or kills someone else, while using EAP, it's always your fault.

Of course it is.

In a Level 2 driver-assist system, the driver is always responsible for what the car does.

That's kind of the point.

That doesn't change until a system is L3 or higher- which nothing Tesla currently sells to consumers is.
 
I think what some people on here take issue with is when people post specifically to COMPLAIN about some great big "problem," or claim "Autopilot caused me to..." or "almost caused...". I couldn't care less if someone uses the system in environments it wasn't designed for; go ahead, and please do bring it up for a technical conversation about why AP may have acted a certain way. But don't come in with an inflammatory title about how messed up AP is and how YOU almost crashed your car because YOU weren't paying attention. And if you do come with an inflammatory title and a woe-is-me statement, don't expect to be able to just turn it around into a civilized discussion, because you already ruined it.

"you" in my comments was not directed at you afadeez. :)
 
  • Like
Reactions: afadeev
There is more where this came from, but I think you get the picture - Tesla's has crafty lawyers.
To summarize - if you follow the word of the manual, you should pretty much never ever EVER use EAP:

Not really. It's an ADAS system. You are supposed to be 100% driving and steering the car (without overriding the torque sensor, of course). You are also supposed to be constantly monitoring traffic in front, behind, and to the side, and monitoring the following distance (though AP will actually servo it). It is only there for assistance. The intent is that sometimes Autopilot may catch things that you miss. That is one way safety is improved when AP is in use - when it is used properly, without over-reliance, of course.

It's a tricky balance, I know. But that is the current capability of AP. It is not intended to do any driving for you; it is only assisting. You are still in charge, so you can catch any errors it makes.

Used this way, on long trips and in traffic, it reduces driver workload and fatigue from constant micro-adjustments. Used properly, that is the second way it can improve safety. But the driver is the neural net responsible for constantly avoiding obstacles and avoiding hitting other vehicles.

It is a difficult ask from Tesla to expect users to fully understand and to not rely on it too heavily, but that’s my impression of the capability. As you say, the known limitations (which are 100% real - it is not CYA as far as I can tell) are listed in the manual.
 
It would be, if the distinction were real.
Can you please refer me (and the rest of us on this forum) to any Tesla links that document that distinction?
Not what you think, believe, or would like to see, but what Tesla declares as 100% appropriate situations for using EAP?

I certainly did not see any such references after re-reading the Tesla Model 3 manual earlier today, but perhaps there is another EAP-specific document that clarifies this.

The Tesla TM3 manual is written to CYA for all possible EAP issues under all conceivable scenarios, and it makes no distinction of the kind you are alluding to. Cross-traffic or not, traffic in lanes (partially or fully) or an empty road, highway speeds or not, straight roads or curvy - there are disclaimers declaring that EAP may not work under any and all of the above scenarios.

Which is a right thing to document if you are a Tesla lawyer.
But it is an extremely unhelpful set of instructions to read if you are a Tesla owner.

Per TM3 manual, nothing in EAP is expected to work right.
If it does - consider yourself lucky. But expect a fail, and be ready to take over. At all times.

Which is fine with me.
But I think it's grossly unreasonable to blame another driver for having used EAP under wrong circumstances and gotten into an accident.

There are no right circumstances to rely on EAP, per TM3 manual.
Caveat emptor.
 
It would be, if the distinction were real.

It is. It's right there in the manual.

You quoted it, but appear to have not understood it?



Can you please refer me (and the rest of us on this forum) to any Tesla links that document that distinction?

You already quoted the manual on it. Go back and re-read it if you don't remember what it said.


Not what you think, believe, or would like to see, but what Tesla declares as 100% appropriate situations for using EAP?

Ibid.



I certainly did not see any such references after re-reading the Tesla Model 3 manual earlier today, but perhaps, there is another AEP-specific document that clarifies this.

No, that's it. Spelled out pretty clearly.


Tesla TM3 manual is written to CYA for all possible EAP issues under all conceivable scenarios, and it makes no distinction to which you are alluding.

I mean, it explicitly does.

It tells you where AP might not be 100% reliable. That's the "pay attention" part.

Then it tells you in which situations the system is, or is not, intended to be used at all (the oncoming-traffic/cross-traffic type situations).

It's pretty trivial to tell which things are which.



Per TM3 manual, nothing in EAP is expected to work right.

It says no such thing. You appear to have a lot of trouble parsing the manual though.

But I think it's grossly unreasonable to blame another driver for having used EAP under wrong circumstances and gotten into an accident.


Then you continue to be factually incorrect.

In fact, the first time this happened, the NHTSA did exactly that and blamed the driver for using AP under the wrong circumstances - explicitly stating that the car worked as designed and as Tesla had clearly indicated to the owner in multiple ways ahead of time.


There are no right circumstances to rely on EAP, per TM3 manual.

This, again, is grossly and factually false.
 