
Another tragic fatality with a semi in Florida. This time a Model 3

The article that surfaced yesterday of the Tesla beginning a lane-change into oncoming traffic was frightening.

This does not seem very frightening. I would kind of expect it to fail like this.

There is a general consensus that it was released too soon; they should consider adding confirmation back until it's had a chance to improve.

This option already exists.

To me, I just ask myself whether I would be using ULC NoA (or even NoA, or AP!) in the pictured situation (an undivided highway with one lane each direction, the most dangerous type of road), and the answer is "of course not". And then I move on. It's obviously outside the ODD.

I don't happen to like ULC NoA currently, but I'm also glad that it is available for those who wish to use it as intended. For sure it's a tough balance for Tesla to determine what level of safety (and quality) is acceptable before a release. But in the end it's a driver assistance tool, and that gives them a lot of leeway. As someone else mentioned, in other cars you can't turn on standard cruise control and just expect it to stop if there is an obstacle. It's just not intended to work that way! As a level 2 system, AP is really no different from cruise control (well, it's level 1 vs. level 2 I suppose, but the same in the sense that the driver is 100% in control), though it's probably a bit safer when used correctly.
 
When it comes to the loss of life, this is a dire situation. The article that surfaced yesterday of the Tesla beginning a lane-change into oncoming traffic was frightening.

[Attachment 410832]

If Tesla wants to show ownership of the situation, and there is a consensus that it was released too soon, they should consider adding confirmation back until it has a chance to improve. To me, this will take real leadership and a commitment to user safety.

I didn't know NoA worked on undivided highways?

Is that Photoshop or something?
 
When it comes to the loss of life, this is a dire situation. The article that surfaced yesterday of the Tesla beginning a lane-change into oncoming traffic was frightening.


So... yet another example of someone using AP on a road it is explicitly not supposed to be used on.

User. Error.

This ain't hard to understand.
 
To me, I just ask myself whether I would be using ULC NoA (or even NoA, or AP!) in the pictured situation (an undivided highway with one lane each direction, the most dangerous type of road), and the answer is "of course not". And then I move on. It's obviously outside the ODD.
Actually NoA is not supposed to activate on unsupported streets, which means either it was malfunctioning or the street is considered OK for NoA. Here's what the manual says:

"Navigate on Autopilot activates and deactivates as appropriate, based on the type of road you are driving on. For example, if Navigate on Autopilot is turned on and Autosteer is active when you reach a supported controlled-access road, Navigate on Autopilot automatically becomes active."
 
Actually NoA is not supposed to activate on unsupported streets, which means either it was malfunctioning or the street is considered OK for NoA. Here's what the manual says:

"Navigate on Autopilot activates and deactivates as appropriate, based on the type of road you are driving on. For example, if Navigate on Autopilot is turned on and Autosteer is active when you reach a supported controlled-access road, Navigate on Autopilot automatically becomes active."

Sure. But again, the driver is always 100% responsible for correct operation. That's all that matters to me. I'm not silly enough to think that Tesla would get all the maps right! That wouldn't make any sense. There isn't any company out there that could do that 100% correctly and have it continue to be correct 100% of the time.
 
Sure. But again, the driver is always 100% responsible for correct operation. That's all that matters to me. I'm not silly enough to think that Tesla would get all the maps right! That wouldn't make any sense. There isn't any company out there that could do that.
Honestly, I'm getting sick and tired of all the "you're holding it wrong" excuses. In this case the user used it exactly as instructed in the manual: you turn NoA on when you start your route, and it activates and deactivates automatically depending on the current road. If Tesla can't get the maps right, they shouldn't promise that it functions that way.
 
Honestly, I'm getting sick and tired of all the "you're holding it wrong" excuses. In this case the user used it exactly as instructed in the manual: you turn NoA on when you start your route, and it activates and deactivates automatically depending on the current road. If Tesla can't get the maps right, they shouldn't promise that it functions that way.

Ok. Again, I don't really trust anyone and I have low expectations of everything, so maybe that's why I am ok with it.
 
Sure. But again, the driver is always 100% responsible for correct operation. That's all that matters to me. I'm not silly enough to think that Tesla would get all the maps right! That wouldn't make any sense. There isn't any company out there that could do that 100% correctly and have it continue to be correct 100% of the time.

Last I heard FSD is not available yet.
 
Honestly, I'm getting sick and tired of all the "you're holding it wrong" excuses. In this case the user used it exactly as instructed in the manual:

No, he didn't.

He was using autopilot on a 2-way, undivided road.

Something the manual explicitly tells you it is not intended to be used for.

In this case the user used it exactly the opposite of how the manual instructs.

So personally I'm sick and tired of people who don't bother reading the manual.
 
No, he didn't.

He was using autopilot on a 2-way, undivided road.

Something the manual explicitly tells you it is not intended to be used for.

In this case the user used it exactly the opposite of how the manual instructs.

So personally I'm sick and tired of people who don't bother reading the manual.

Quoting what others posted from the user manual:

"Navigate on Autopilot activates and deactivates as appropriate, based on the type of road you are driving on. For example, if Navigate on Autopilot is turned on and Autosteer is active when you reach a supported controlled-access road, Navigate on Autopilot automatically becomes active."

So should the system deactivate Navigate (as appropriate) when it sees a 2-way undivided road?

If not, then how does the driver know when the system sees the road as appropriate, versus when the system doesn't see it at all and needs the driver to go back to the user manual to find out which cases it is not intended for?

This kind of monitoring will lead to (or might already have caused) accidents.
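
To make that monitoring problem concrete, here's a toy sketch (hypothetical names, nothing from Tesla) of why the driver can't tell "the system judged this road appropriate" apart from "the map data is simply wrong about this road":

Code:
# Toy illustration: the only signal the driver gets is engaged / not engaged.
def system_thinks_supported(map_road_class: str) -> bool:
    # In this sketch the car's view comes entirely from map data.
    return map_road_class == "controlled_access"

def what_driver_sees(engaged: bool) -> str:
    return "NoA active" if engaged else "NoA inactive"

# Reality: a 2-way undivided highway. Map data: mislabelled as controlled access.
engaged = system_thinks_supported("controlled_access")
print(what_driver_sees(engaged))  # "NoA active" -- nothing flags that the map is wrong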
 
Quoting what others posted from the user manual:

"Navigate on Autopilot activates and deactivates as appropriate, based on the type of road you are driving on. For example, if Navigate on Autopilot is turned on and Autosteer is active when you reach a supported controlled-access road, Navigate on Autopilot automatically becomes active."

That's great. But Autosteer shouldn't have been active on that road in the first place, thus NoA should've never had the opportunity to engage.

ALSO from the manual:

Owner's manual said:
Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver... Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present... Autosteer is intended for use only by a fully attentive driver on freeways and highways where access is limited by entry and exit ramps.

(above from pages 73-74 of current online version)


So should the system deactivate Navigate (as appropriate) when it sees a 2-way undivided road?

If not, then how does the driver know when the system sees the road as appropriate, versus when the system doesn't see it at all and needs the driver to go back to the user manual to find out which cases it is not intended for?

This kind of monitoring will lead to (or might already have caused) accidents.

I agree that people who don't bother to read the manual and learn where AP is and isn't intended for use cause accidents.

That's the fault of the drivers though, not the car.

Which is exactly what the NHTSA found the first time an idiot used it in a place with two-way and cross traffic and ran underneath a tractor trailer on AP... and the same thing they'll find again this time, since the cause was the same.

User. Error.




Now, all of the above said: when Tesla actually rolls out the advanced (post-EAP) features of FSD, where it actually supports use with 2-way traffic, it will obviously need to be able to recognize such roads with vision and radar rather than relying on mapping to judge what kind of road it's on... because of exactly this kind of scenario with map data.

This seems like one of those things the much larger NN and greater compute power of HW3 would solve pretty handily though... (and the fact that HW2.x obviously can't would be among the reasons they explicitly tell you the system isn't intended for such roads today).
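
Just to sketch what I mean (hypothetical Python of my own, not Tesla's actual stack): the road class would have to come from perception, with map data demoted to a fallback hint.

Code:
from typing import Optional

def perceived_road_class(frame: dict) -> Optional[str]:
    # Stand-in for an NN classifier looking at lane markings, medians,
    # oncoming traffic, etc. Returns None when it isn't confident.
    return frame.get("looks_like")

def effective_road_class(frame: dict, map_road_class: str) -> str:
    seen = perceived_road_class(frame)
    # Perception wins whenever it has an answer; map data is only the fallback.
    return seen if seen is not None else map_road_class

# Map says controlled access, but the cameras see an undivided 2-way road:
print(effective_road_class({"looks_like": "two_way_undivided"}, "controlled_access"))
# -> two_way_undivided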
 