
Lane Departure Avoidance (LDA) will save lives.

It is too early to say whether, on average, it is helpful or harmful. Good ideas are sometimes far more complex to implement than to conceive of, and poor implementations can feel like a good deed getting punished.

I have a different question here, which is whether or not Tesla can be held liable for steering 'corrections' that contribute to an accident.

The case is different from many we've seen and heard about where an operator may have removed their hands from the wheel while Autopilot was engaged. In this case, we have an active steering change that may conflict with the operator's intention or direction while their hands are on the wheel. In my experience, it's possible to override the steering, but given a situation where an operator pulls the car to the left to avoid an obstacle, they may not anticipate needing to try a second time, or having to fight lane-assist features that attempt to reposition the car into the original lane.
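For what it's worth, the override behavior in these systems generally reduces to comparing measured steering-column torque against a threshold. Here's a minimal sketch in Python; every signal name and number below is a made-up illustration, not Tesla's implementation:

```python
# Hypothetical lane-assist override check. The threshold and gain are
# illustrative assumptions, not values from any real system.
DRIVER_OVERRIDE_TORQUE_NM = 3.0  # assumed torque at which the driver "wins"
K_P = 1.5                        # assumed corrective gain, Nm per meter

def assist_torque(lane_offset_m: float, driver_torque_nm: float) -> float:
    """Corrective steering torque that yields to a resisting driver."""
    if abs(driver_torque_nm) >= DRIVER_OVERRIDE_TORQUE_NM:
        return 0.0  # driver is actively steering, so suppress the correction
    # Otherwise, nudge proportionally back toward the lane center.
    return -K_P * lane_offset_m
```

The grey area lives in that threshold: set it too high and a driver swerving around an obstacle has to fight the correction; set it too low and the feature never helps.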

To me, this feels a bit like a grey area, and I'm curious to know if this has ever been tested in court with any of the other manufacturers. I understood that here in the United States in 2004, "Toyota added a Lane Keeping Assist feature to the Crown Majesta which can apply a small counter-steering force to aid in keeping the vehicle in its lane" and that in Japan, Nissan introduced this feature as early as 2001. (Wikipedia) See: Lane departure warning system - Wikipedia
 
Yes, "This ain't no 'Boeing 737 Max' type of situation." :p

Why do you say that? I think this is exactly the same sort of thing. A device that is intended to assist and make driving a Tesla safer has the potential to cause accidents when there is a problem with either the sensors or just plain errors in the software.

I have had my car brake on the highway because it saw a highway overpass and its shadow, or because it saw something I never even recognized. Whatever it saw wasn't a car on a collision path, because there were no cars in front of me. Nothing in the roadway. Nothing along the side of the road. My worry is about the cars behind me in those situations.

Maybe the car saw those three dots that the car thinks are lane markings... I'm just sayin'...
 
I found the following study on the effectiveness of LDW on PubMed; I am copying the abstract here:

OBJECTIVE:
To evaluate the effects of lane departure warning (LDW) on single-vehicle, sideswipe, and head-on crashes.

METHOD:
Police-reported data for the relevant crash types were obtained from 25 U.S. states for the years 2009-2015. Observed counts of crashes with fatalities, injuries, and of all severities for vehicles with LDW were compared with expected counts based on crash involvement rates for the same passenger vehicles without LDW, with exposure by vehicle series, model year, and lighting system standardized between groups. For relevant crashes of all severities and those with injuries, Poisson regression was used to estimate the benefits of LDW while also controlling for demographic variables; fatal crashes were too infrequent to be modeled.

RESULTS:
Without accounting for driver demographics, vehicles with LDW had significantly lower involvement rates in crashes of all severities (18%), in those with injuries (24%), and in those with fatalities (86%). Adding controls for driver demographics in the Poisson regression reduced the estimated benefit of LDW only modestly in crashes of all severities (11%, p < 0.05) and in crashes with injuries (21%, p < 0.07).

CONCLUSIONS:
Lane departure warning is preventing the crash types it is designed to address, even after controlling for driver demographics. Results suggest that thousands of lives each year could be saved if every passenger vehicle in the United States were equipped with a lane departure warning system that performed like the study systems.

PRACTICAL APPLICATIONS:
Purchase of LDW should be encouraged, and, because drivers do not always keep the systems turned on, future efforts should focus on designing systems to encourage greater use and educating consumers about the benefits of using the systems.​
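For anyone curious what that Poisson regression step looks like in practice, here is a rough sketch in Python on synthetic data. The variable names and numbers are mine, not the study's; the point is just the shape of the analysis (crash counts, a log-exposure offset, and a demographic control):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic stand-in for the study's data: crash counts per vehicle,
# with observed exposure (vehicle years) and one demographic control.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "ldw": rng.integers(0, 2, n),          # 1 = vehicle equipped with LDW
    "driver_age": rng.integers(18, 80, n),
    "exposure": rng.uniform(0.5, 5.0, n),  # vehicle years observed
})
# Build a ~20% true rate reduction for LDW into the toy data.
rate = 0.1 * np.exp(-0.22 * df["ldw"] + 0.005 * (df["driver_age"] - 40))
df["crashes"] = rng.poisson(rate * df["exposure"])

# Poisson regression with log(exposure) as the offset, controlling
# for a demographic variable, as the study describes.
model = smf.glm(
    "crashes ~ ldw + driver_age",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["exposure"]),
).fit()

# exp(coefficient) is the crash rate ratio for LDW-equipped vehicles;
# 1 minus that ratio is the estimated percent reduction.
print(f"LDW crash rate ratio: {np.exp(model.params['ldw']):.2f}")
```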


Interestingly, of the roughly 50 studies I could find on PubMed, one that stood out stated:

Designers of active safety systems that provide autonomous lateral control should consider that a substantial proportion of drivers at risk of lane drift crashes are incapacitated. Systems that provide only transient corrective action may not ultimately prevent lane departure crashes for these drivers, and drivers who do avoid lane drift crashes because of these systems may be at high risk of other types of crashes when they attempt to regain control. Active lane-keeping assist systems may need to be combined with in-vehicle driver monitoring to identify incapacitated drivers and safely remove them from the roadway if the systems are to reach their maximum potential benefit.


 
Lane departure warning is preventing the crash types it is designed to address, even after controlling for driver demographics.

While this study looked at the improvement in crash and injury data overall for cars with LDW, what we are discussing here is Tesla's LDW software specifically, and the demonstrated bugs in that system. Has anyone done a study on the moving target that is the Tesla LDW system?
 
Use that turn signal and you'll notice all those gripes about the system will go away ;-)
It is not clear this is true for ELDA (it is for LDA).

I believe I had ELDA fire off while my signal light was on, or at the very least soon after it came on. Also, unlike with LDA, the manual's description doesn't mention that the signal light will cause it not to trigger. And if you look at what the manual says it is doing and what it is trying to prevent, suppressing it on a signal wouldn't actually make sense, as it would stop working for a large portion of its potential use cases.
 
I'm going to chime in and say I actually really love this feature. I've had it on a number of my past vehicles and I think it's great! Especially for those moments when you happen to veer a little too close to the line. Use that turn signal and you'll notice all those gripes about the system will go away ;-)
+1

I'm getting used to it; it's amazing how many times I get corrected. Here's a video I took of how aggressively it takes over the wheel; it shifted my body and phone to the side. Note the blue lines on the screen.

[Attached screenshot]

Fred
 
Ok, are we talking about LDA or ELDA? I get that an LDA warning is helpful (I had that in my previous car as well). But this ELDA feature doesn't seem ready. Fine, test it out, but it should be easy to disable: the driver should have the option to permanently disable it once in Drive, not have to turn it off before each drive (if that's what they prefer).
 
Has anyone done a study on the moving target that is the Tesla LDW system?

There are fewer than ten studies on PubMed that mention Tesla. You can read them here:

tesla driving - PubMed - NCBI. You'll need to skip over the non-relevant hits.

My sense is that we have not had enough time for more granular studies of Tesla's implementation to surface. Most of the studies I've found so far that cite Tesla specifically have been very small in scope, often involving fewer than 20 drivers.

The Effect of Partial Automation on Driver Attention: A Naturalistic Driving Study. - PubMed - NCBI - 10 Participants
Self Driving and Self Diagnosing: With Emerging Technology, Your Car May Soon Serve Not Only as Personal Chauffeur and Entertainment Center but as ... - PubMed - NCBI - Based on a single individual
An interview study exploring Tesla drivers' behavioural adaptation. - PubMed - NCBI - 20 Participants
Is partially automated driving a bad idea? Observations from an on-road study. - PubMed - NCBI - Single Vehicle
Machine learning, social learning and the governance of self-driving cars. - PubMed - NCBI - Based on a single incident

(You get the idea here.)

It would seem that LDW systems have a significant impact on reducing the number of accidents, and that accidents could be reduced further by taking compromised drivers safely off the road. Tesla is slowly adding capabilities here, but so far I don't know of a feature that will safely remove a driver and vehicle from the road through emergency intervention (for example, when automation features are not enabled). One may conclude that Emergency Lane Departure Avoidance is best suited for drivers who have had a brief lapse in attention and are not under the influence.
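To make "safely remove them from the roadway" a little more concrete, the missing piece would be an escalation policy sitting on top of driver monitoring. Here is a purely hypothetical sketch in Python; as far as I know nothing like this ships in any Tesla today, and the thresholds are invented:

```python
from enum import Enum, auto

class Escalation(Enum):
    """Hypothetical escalation ladder for a suspected-incapacitated driver."""
    NONE = auto()
    VISUAL_WARNING = auto()
    AUDIBLE_ALARM = auto()
    SLOW_WITH_HAZARDS = auto()
    PULL_OVER_AND_STOP = auto()

def escalate(unanswered_prompts: int, recent_lane_corrections: int) -> Escalation:
    """Step up intervention as evidence of incapacitation accumulates."""
    if unanswered_prompts == 0 and recent_lane_corrections < 2:
        return Escalation.NONE
    if unanswered_prompts <= 1:
        return Escalation.VISUAL_WARNING
    if unanswered_prompts <= 2:
        return Escalation.AUDIBLE_ALARM
    if unanswered_prompts <= 3:
        return Escalation.SLOW_WITH_HAZARDS
    return Escalation.PULL_OVER_AND_STOP  # pull to the shoulder, hazards on
```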

It may be that at some point in the future, full automation will enable itself when the vehicle believes the driver has been compromised. I get the sense that this is pretty far off and likely would not account for individuals who are intentionally trying to cause harm (road rage, terrorism, insured joy rides).
 
To help sort out the LDA vs. ELDA confusion:

Tesla Software 2019.16.3

I see a number of people mentioning and talking about what is likely LDA, and this is getting conflated with ELDA (in part via the magic of Tesla's bone-headed naming choices, no doubt :p ).

LDA is kinda annoying at times, screeching at you for not being close enough to the center of where it thinks the lane is after you have disengaged AP because you are entering a construction zone where the lines are weird and the lanes very tight. Yes, that's an actual event that happened to me the first time it fired. Poor first impression, to say the least.

So, after a few more incidents, I turned it off indefinitely and spoke not a word about it here on this board. It is just one more in a long line of "lane assist" features I've encountered over the years that I switched off. It is probably an okay feature, maybe even an outright good thing, in some places. It does, for example, provide a positive answer to the question "what happens if I doze off and accidentally disengage AP while doing so?" More importantly, I can change it to audio-only or even turn it off indefinitely.

ELDA is something different. It isn't linked to AP [deactivation], as LDA appears to be. Toggling it off reverts back to on as soon as you go into Park (whether you lift your butt out of the seat or not), requiring you to drill in and scroll down to toggle it off again.

More importantly, it doesn't require the existence of lane lines. It appears that activating the signal light or applying pressure on the steering wheel doesn't stop it from triggering. There's also no audio-only level of reaction. It is a different level of intrusive.
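Put as code, here is how I read the difference in trigger conditions. To be clear, this is my interpretation of the manual plus observed behavior, not anything Tesla has published:

```python
# My reading of the two features (assumptions, not Tesla's spec):
# LDA needs visible lane lines and defers to the turn signal;
# ELDA, as far as I can tell, does neither.
def lda_triggers(drifting_over_line: bool, lane_lines_visible: bool,
                 turn_signal_on: bool) -> bool:
    return drifting_over_line and lane_lines_visible and not turn_signal_on

def elda_triggers(departing_toward_hazard: bool) -> bool:
    # Observed to fire even with the signal on or hands on the wheel,
    # and it doesn't appear to require painted lane lines.
    return departing_toward_hazard
```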
 
Because it is very easy for the pilot here to override, and it doesn't go into an extended cycle of messing with the controls and impeding operation. It is sub-optimal when it triggers incorrectly, but Tesla's design has a reasonable failsafe here.

Easy to override? The problem with auto accidents is that they happen in a fraction of a second. Yes, that's not identical to the technical issues of the 737 MAX. The similarity is that the systems are intended to improve the safety of operating the vehicle but have design flaws that, I will simply say, greatly reduce the improvement in safety.

If Tesla didn't think this had issues, it wouldn't continue to label its software "beta".
 