Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Is this a "Houston, we have a problem" moment for Tesla?

I wrote this statement:

TrevRex said:
The ability to correct the Kia's line without disconnecting self-steering is great, e.g. if you see a pothole ahead that you will hit by staying in the centre of the lane, you just steer around it, head back towards the centre, and self-steering reapplies. It was even possible to leave the system on all the time and have it apply itself any time you wanted, e.g. around town. Taking a racing line around sharper corners was especially nice to achieve, I think, as was moving over pre-emptively to overtake large trucks and then returning to self-steering in the centre of the lane. A great and confidence-inspiring system, I think.

Over in this thread I started:


Now I noticed someone had exported my statement into another thread via the alerts:


As I was curious to find out why my statement was exported, I read the thread. The part of that thread that first got my attention was a quote from this NHTSA document:


The quote (see the bottom of the document) that really got my attention was:

"Peer Comparison Data gathered from peer IR letters helped ODI document the state of the L2 market in the United States, as well as each manufacturer’s approach to the development, design choices, deployment, and improvement of its systems. A comparison of Tesla’s design choices to those of L2 peers identified Tesla as an industry outlier in its approach to L2 technology by mismatching a weak driver engagement system with Autopilot’s permissive operating capabilities. Unlike peer L2 systems tested by ODI, Autopilot presented resistance when drivers attempted to provide manual steering inputs. Attempts by the human driver to adjust steering manually resulted in Autosteer deactivating. This design can discourage drivers’ involvement in the driving task. Other systems tested during the PE and EA investigation accommodated drivers’ steering by suspending lane centering assistance and then reactivating it without additional action by the driver. Notably, the term “Autopilot” does not imply an L2 assistance feature, but rather elicits the idea of drivers not being in control. This terminology may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation. Peer vehicles generally use more conservative terminology like “assist,” “sense,” or “team” to imply that the driver and automation are intended to work together, with the driver supervising the automation."

Now, in case others don't know:

NHTSA: National Highway Traffic Safety Administration, a U.S. federal government agency
ODI: Office of Defects Investigation
L2: Level 2 autonomy
L2 peers: car manufacturers other than Tesla, like Kia, that offer Level 2 autonomy, i.e. Tesla's peers
PE: Preliminary Evaluation
EA: Engineering Analysis
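The design contrast the NHTSA quote describes can be sketched as a tiny state machine. This is purely illustrative, not anyone's actual implementation: the policy names, states, and functions below are my own invented labels for the two behaviours the report contrasts (peer systems suspending lane centering and resuming automatically, versus Autosteer deactivating on manual steering input until the driver re-engages it).

```python
from enum import Enum

class LaneAssistState(Enum):
    ACTIVE = "active"        # lane centering is steering the car
    SUSPENDED = "suspended"  # driver is steering; system will resume on its own
    OFF = "off"              # driver must re-engage the feature manually

def on_manual_steering(policy: str) -> LaneAssistState:
    """Model the two L2 design choices described in the NHTSA quote.

    'suspend_resume' (peer systems, per the report): lane centering
    yields to the driver and re-activates once manual input ends.
    'deactivate' (Autosteer, per the report): manual steering input
    switches the feature off until the driver re-engages it.
    """
    if policy == "suspend_resume":
        return LaneAssistState.SUSPENDED
    if policy == "deactivate":
        return LaneAssistState.OFF
    raise ValueError(f"unknown policy: {policy}")

def on_manual_input_ended(state: LaneAssistState) -> LaneAssistState:
    # Only a suspended system resumes without further driver action;
    # an OFF system stays OFF until explicitly re-engaged.
    if state is LaneAssistState.SUSPENDED:
        return LaneAssistState.ACTIVE
    return state
```

Under this toy model, the pothole scenario from my Kia quote is simply `suspend_resume` behaviour: steer around the hazard, release, and lane centering comes back without any button press, whereas the `deactivate` policy leaves the system off.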

Now I will put on my Design Engineer's hat (yes, I am a Design Engineer, one who has to consider government regulations for the equipment I design and produce) and try to keep my bias as a Tesla owner out of any further commentary I make on this subject.

I hope others can try to do the same, i.e. keep your Tesla bias out of this thread and evaluate this scientifically.

I have to go and help some fellow workers that I employ now, but will try to get back later.
 
Interesting question
(And not having driven a Kia)

While the disengagement in a Tesla can be annoying, personally I think actively having to give a computer control is safer, in a similar way to how the crews of two-pilot aircraft call out to each other who has control.
 
I had and drove a Kia Sportage for two months and had a number of issues. The main one was that "Lane Assist", the mechanism that autosteers, engaged every time I started the car, and I could find no way of turning it off permanently. On top of that, it did not drive like I do: it wandered from side to side in the lane. It never quite went over the lane marking, but following drivers must have thought I was drunk. It was quite bad and unnerving. Taking control of the steering wheel involved a reasonable amount of force and it did not disengage; I just had to override it. I far prefer the performance of Tesla Autosteer. Yes, I do occasionally wish there were a way to miss a pothole and not have it disengage, but I'm used to it and think "no big deal". I do not know if Kia has improved, but I certainly rated it as poor. Indeed, there were a lot of other things I found poor, but they are not relevant to L2 autonomy.
 
It wandered from side to side in the lane.
Yeah, this seems to be quite common with almost all other lane-keep functions. Even in the recent video of the Mercedes EQS, I almost felt seasick just watching it attempt to maintain a lane. Seeing how often they drift out of the lane even on the slightest bends, you would be constantly disengaging if they didn't let you help it steer.
 
Ok, and onward I go with my analysis of this report: https://static.nhtsa.gov/odi/inv/2022/INCR-EA22002-14496.pdf
which I think may be important to me as an owner of a Model Y LR. Given that I let my family drive it, I wish to protect them as much as possible from any nuances that may be applicable to Tesla's Autopilot/TACC.

Now, I will firstly clean up the hurried copy and paste I did yesterday:



NHTSA: National Highway Traffic Safety Administration, a U.S. federal government agency
ODI: Office of Defects Investigation
L2: Level 2 autonomy
L2 peers: car manufacturers other than Tesla, like Kia, that offer Level 2 autonomy, i.e. Tesla's peers
PE: Preliminary Evaluation
EA: Engineering Analysis
IR: Information Request

I will not make any comments on this until I have checked the rest of the report.
 
Tesla does it differently. Yup, makes sense; that's what they do.

Is it non-compliant with some rule? Not yet. If the rules change? They'll change it to be compliant. It's just software.

I prefer the non-Tesla way, btw.
 
Ok, the important points here to me are:
1. Most incidents took place after dark, and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones.
2. The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.
 
Now, from the above, Engineering Analysis EA22002 is where I need to go next to assess all the data chronologically.
See: https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF


Ok, the Engineering Analysis is opened.



The crash attenuator truck mentioned here is interesting, I think.

IMO the 2nd paragraph, about sending Information Request letters to Tesla and 12 other manufacturers, and the 3rd paragraph make important points:

On October 12, 2021, NHTSA sent two additional sets of requests to Tesla: (1) an IR letter to obtain information on the company’s changes to subject vehicles’ functionality through software updates intended to improve the detection of emergency vehicle lights in low light conditions; and (2) a Special Order (SO) to request information concerning Tesla’s use of nondisclosure agreements with consumers whose vehicles were included in a Full Self-Driving (FSD) “beta” release program.

I have to go as I am needed elsewhere.