Welcome to Tesla Motors Club

Accident while on EAP...

;)

Playing devil's advocate here for a moment, there were one or two occasions where I had to apply more energy to avoid an accident. The one I remember most clearly was shortly after I learned to drive my Audi S4 (manual). There was a pickup truck that began drifting out of the left lane - I became aware of it because of the sound of his tires on the pitted surface of the roadway. I prepared for an accident by turning down the music system and becoming more alert. The next thing I knew, his left rear tire blew, and it caused him to spin sideways down the freeway. His rate of deceleration was very high, and he also began heading towards my lane. I had only seconds to downshift and accelerate to get in between him and the retaining wall on my right. As soon as I passed him, he entered my lane, and miraculously the truck righted itself without hitting anything.

The only thing I've observed so far is that every challenge we've faced has been considerably more difficult to solve than we originally estimated. Whether this applies to the ethics or just to the technology remains to be seen.

Yup. Accidents are often complicated, unpredictable events. An autonomous car has a better understanding of the physics/geometry and faster reactions but less imagination than a human - I'm not sure how that'll work out when it comes to avoiding weird accidents.

But accidents where the car has to decide between killing one person, or five, or the driver, and has time to make that choice are more imagined than real IMHO.
 
I've experienced AP moving forward even when the car ahead is a foot away in bumper-to-bumper traffic - I was lucky to have felt the movement and stepped on the brakes. It's as if the dead/blind zone registers as open road in bumper-to-bumper traffic. That's a logic error in the AP code - camera data should override as a backup. Tesla needs to check their code.
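The "camera should override as a backup" idea amounts to a conservative fusion rule: when one sensor reports open road and another still sees an obstacle, trust the closer estimate before creeping forward. A minimal sketch, with assumed sensor inputs (the function name and values are illustrative, not Tesla's actual logic):

```python
# Conservative distance fusion: take the smallest credible reading.
# A sketch of the idea only - sensor names and values are assumptions.

def safe_gap(radar_m, camera_m):
    """Smallest credible distance to the car ahead; 0.0 if nothing valid."""
    readings = [r for r in (radar_m, camera_m) if r is not None]
    return min(readings) if readings else 0.0

# Radar's near-field blind zone reports "open road" (None); the camera
# still sees a bumper 0.3 m away, so the car should not creep forward.
print(safe_gap(None, 0.3))  # 0.3
```

With this rule, a radar blind spot alone can never make the car believe the lane is clear while the camera disagrees.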
 
A pickup truck sliding sideways cannot decelerate faster than an S4, so a self-driving car could simply stop to avoid the accident. Sliding sideways results in considerably longer stopping distances than braking with ABS.
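The physics behind that claim: stopping distance scales with the inverse of the friction coefficient, and a sideways slide runs on kinetic friction while ABS holds the tires near peak (static) grip. A rough sketch - the speed and friction coefficients are illustrative assumptions, not measured values for any particular vehicle or surface:

```python
# Rough stopping-distance comparison: ABS braking vs. a sideways slide.
# Friction coefficients below are assumptions for illustration only.
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance(speed_mps, mu):
    """Distance (m) to stop from speed_mps at constant deceleration mu * g."""
    return speed_mps ** 2 / (2 * mu * G)

v = 30.0  # ~108 km/h, an assumed freeway speed
abs_braking = stopping_distance(v, mu=0.9)     # ABS holds near peak grip
sideways_slide = stopping_distance(v, mu=0.7)  # kinetic friction while sliding

print(f"ABS:   {abs_braking:.0f} m")    # ~51 m
print(f"Slide: {sideways_slide:.0f} m") # ~66 m
```

Whatever the exact coefficients, the sliding truck always needs more road than a car braking under ABS from the same speed.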
 
Looking at the front camera video, autosteer is in play for sure. You can see all the other cars hugging the right line, because that's what a normal human does on a right-turning curve when there's plenty of service road open to the right of the line. OP's car was dead center the whole time, even on impact... it was on autosteer. I can't tell from the video whether OP's foot slipped and lightly pressed the accelerator pedal, or the sensor malfunctioned at the last moment. The sensor was working fine earlier in the video, because I'm sure OP didn't set the max speed that low. That slight curve is nothing and shouldn't cause a misread. I use AP daily, and some of my curves are much sharper at much higher speeds.
 

I don't recall whether that thought occurred to me at the time, but I do know that slamming on the brakes has implications on freeways, and is not always the best course of action. The stopping distance of my vehicle is not the same as the stopping distance of the person behind me, etc. and sometimes the best course of action is merely to get out of the way. Some of this behavior is already implemented in AP (swerving into an adjacent lane vs. applying brakes.)

I've already been in one situation where AP moved me over, and one where emergency braking stopped me. In the case where I was moved over, it was not because of the presence of a car, but the position of a movable cement barrier (the kind that forms long chains). The move would not have been necessary, and I corrected the vehicle in the process. In that case, no harm was done. It got me thinking about the opposite situation, where AP attempts a corrective action that would have been advisable, I override it, and get into an accident anyway.


This is a good reason why having more than one forward-facing radar is a good idea. It's difficult to fall back on camera data because, as far as I know, cameras struggle to predict distance accurately.
 
Do we know whether Tesla's T&Cs are robust enough that, no matter what AP does, the driver remains responsible for the operation of the vehicle? I ask because a close friend of ours who drives a Model X was recently driving in the middle lane of a three-lane freeway. Almost immediately after he passed a car in the left lane, his Model X moved into and struck that very same vehicle. The impact occurred between his rear left wheel and the other vehicle's front right bumper. He had AP engaged - the expected behavior was to continue in the center lane. No turn signal was engaged, and no lane change was initiated by the driver or suggested by NoA. I'll see if he'll be willing to release the front-facing video. It happened just prior to the three-camera recording feature.
 
Why not just stop the car if the radar fails? Adding redundant radar seems like a waste of money.

We have to be very careful about eliminating (or failing to leverage) secondary sensor systems on automated platforms. I think we can establish that visual data is not a good replacement for radar data when computing distance. Sometimes when components fail, they don't fail completely - they can provide unreliable sensor data, phantom data, or be temporarily blinded. Placing two radars at opposite ends of the front bumper might provide enough separation to raise reliability to five nines (99.999%). We can't get near this level of reliability without redundancy. Hopefully, Tesla is paying attention to what the Boeing investigation has shown so far. If the target is full automation, it is rarely cheaper in the long run to cut corners. HWv3 specs show connectors for a secondary backup radar. I don't believe a second radar from Continental AG would drastically change the cost of the vehicle. In Boeing's case, they produced aircraft with secondary angle-of-attack sensors but offered the integration of that sensor data as a $50,000 add-on. On a $100,000,000 plane, in retrospect, they probably should have just provided that as part of the basic package.
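The reliability argument rests on the two radars failing independently (common-mode problems like interference would hit both at once). A quick sketch of the arithmetic, where the per-sensor availability figure is an assumption for illustration, not a spec for any real radar unit:

```python
# Back-of-the-envelope: how redundancy multiplies availability,
# assuming independent failures. The per-sensor figure is illustrative.

def combined_availability(per_sensor, n):
    """Availability of n independent sensors where any one suffices."""
    return 1 - (1 - per_sensor) ** n

single = 0.999  # assume one radar alone is "three nines"
dual = combined_availability(single, 2)
print(f"single radar: {single:.6f}")  # 0.999000
print(f"dual radar:   {dual:.6f}")    # 0.999999
```

Under the independence assumption, each added sensor multiplies the residual failure probability by the per-sensor failure rate, which is why a second radar buys so many extra nines.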
 
This isn't an airplane - the only thing that's redundant on most cars is the gas pedal position sensor. I'm having trouble envisioning a scenario where a radar system could fail in an undetectable way (but I'm no radar expert!). If it did, you would actually need triple redundancy to be able to detect which radar was bad.
I think the idea that autonomous vehicles need a high level of redundancy is misguided. Cars break down all the time; it's not that big of a deal. You just need the vehicle to use the last data it got from the broken sensor to safely stop. I think the number of scenarios where this wouldn't work is extremely small.
P.S. The Boeing scandal is horrible and some people should probably go to jail.
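The triple-redundancy point can be made concrete: with three readings, the median is robust to any single faulty sensor, and whichever reading disagrees with the median can be flagged as the suspect. A minimal voting sketch - the tolerance and the sample readings are assumptions, not any vendor's actual fault-detection logic:

```python
# 2-of-3 voting: the median of three readings survives one bad sensor,
# and the outlier identifies itself. Threshold value is an assumption.

def vote(readings, tolerance=2.0):
    """Return (median value, indexes of readings that disagree with it)."""
    ordered = sorted(readings)
    median = ordered[1]  # middle of three
    suspects = [i for i, r in enumerate(readings) if abs(r - median) > tolerance]
    return median, suspects

# Two radars agree the car ahead is ~25 m away; the third has gone bad.
distance, bad = vote([25.1, 24.8, 112.0])
print(distance, bad)  # 25.1 [2]
```

With only two sensors you can detect a disagreement but not tell which unit is lying; the third vote is what turns detection into identification.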
 

We could put it this way: Tesla claims to make the safest cars on the road. For that claim to hold true once full automation becomes available (even Level 4), they're going to have to implement redundancy - the two are not merely correlated but deeply tied to one another. If I had the choice of a car with backup systems, you can bet I'd buy it, even if it only meant the difference between 99.9% and 99.999% safe. In translation: a car driving 24 hours a day would have 1 minute 26 seconds of outage time per day versus 0.9 seconds. Over a year, or the life of the vehicle, that adds up to real differences. I just wouldn't risk the lives of my family and loved ones, let alone anyone else around me. To your point, there undoubtedly IS a point of diminishing returns, especially when it comes to cost and complexity, but to me, two well-positioned radars make good sense.
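Those downtime figures check out: converting an availability percentage into seconds of outage over a 24-hour day is just the complement times 86,400. A quick sanity check:

```python
# Converting availability percentages into seconds of daily "outage".

SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def daily_downtime(availability):
    """Seconds per day the system is unavailable at a given availability."""
    return (1 - availability) * SECONDS_PER_DAY

print(f"99.9%   -> {daily_downtime(0.999):.1f} s/day")   # 86.4 s (1 min 26 s)
print(f"99.999% -> {daily_downtime(0.99999):.1f} s/day") # 0.9 s
```

Each extra nine cuts the outage budget by a factor of ten, which is why the jump from three nines to five nines is a ninety-six-fold difference in exposure time.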

There are two situations that came to my attention recently. The first was my own, in which either the AP system crashed or the radar failed to provide accurate data while AP was enabled, and it suddenly disengaged while the vehicle was turning (we aren't entirely sure, but the behavior suggests it was the AP system that crashed).

The second is a situation occurring around airports where there is sudden radar interference. Proper shielding, grounding, and a secondary radar would help alleviate this, and I believe one user on this forum is in the process of having his Tesla modified to solve for it. It is not my understanding that Tesla is adding a secondary radar, but we will hear the full details soon.
 
I just don't think very much redundancy is required to achieve safety "far in excess of human drivers," which is Tesla's stated goal. The steering-motor redundancy in the Model 3 makes sense, but as long as sensor failures can be detected, there should already be enough redundancy between the three forward-facing cameras and the radar.
 
Imagine you're driving a car with a windshield that could be turned black at any time. Now imagine driving your normal commute with the knowledge that it could happen at any moment. If it does happen, what are the chances you could stop the vehicle without running into anything? I'd bet you could do it nearly 100% of the time - and a computer has the advantage of perfect concentration and perfect memory.
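It's worth putting numbers on the blacked-out-windshield thought experiment: how long, and over what distance, does a controlled blind stop play out? A sketch under assumed values - the speed and the comfortable-braking deceleration are illustrative, not measured:

```python
# How long and how far a controlled stop takes once sensors go dark.
# Speed and deceleration values below are assumptions for illustration.

G = 9.81  # m/s^2

def blind_stop(speed_mps, decel_g=0.3):
    """Return (seconds to stop, metres travelled) at constant deceleration."""
    a = decel_g * G
    t = speed_mps / a
    d = speed_mps ** 2 / (2 * a)
    return t, d

t, d = blind_stop(30.0)  # ~108 km/h
print(f"stops in {t:.1f} s over {d:.0f} m")  # ~10.2 s over ~153 m
```

Ten-odd seconds and 150 m of travel on memory alone is the scale of the bet: fine if the lane stays as it was, much less fine if traffic ahead changes during the stop.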
 
Of course hardware failures and crashed/looping software must be detected, and it's not difficult to do. Full redundancy, given the laughably low cost of entire systems-on-a-chip, is easily done - N-fold. But it's terribly complicated by the simple fact that for a moving car in traffic there is no "failsafe exit".

On machines I've designed, like process control, materials handling, rail or rope guided people transporters, there's a 99.99% applicable common exit path, a simple failsafe: just STOP! Cars in traffic? Not so. Sometimes that's the worst action. Still, in low speed situations like the OP's, a bonehead Flash Lights & STOP might go a long way in mitigating damages. It's a bit hard to understand this accident, if the AP was fully functional.

Daniel, forgive me, but "People (at Boeing) should probably go to jail" is IMHO a toxic idea, unless the solution is to put enough people in chains to bring society to a complete halt. It worries me to hear such thoughts - that and all contemporary finger-pointing, because it reflects a growing disinterest in seeing situations from enough "cameras" to understand them.

Clearly something went wrong with the systems on those two aircraft, and people died. It's extremely unlikely that it was malicious - or that the answer is to pack every plane with even more layers of sensors & systems that pilots haven't mastered. It's a bit like that with our Teslas.
 
Hello all. I read somewhere in this thread a question about what happens when you don't respond to the autosteer warning to apply slight pressure to the steering wheel, and there was no definite answer, at least in the replies I could follow. I have always wondered what it would do - whether it's smart enough to pull over, maybe - so I decided to try it, unfortunately on a one-hour drive! This is what happened (mind you, I have EAP and I'm on 2019.5.15 software): the first couple of warnings were gentle - a blue band on top of the autosteer visual and a light beep. After that came a couple more warnings with more beeps, and finally the screen showed a red steering wheel with continuous beeps, and the car slowed down (on the freeway). That's when I took over, only to find that AP was disabled for the remainder of the drive. I hope that in the future, the action AP takes against an inattentive driver won't be to abandon the driver and force him to take control. What if the driver is unconscious due to a stroke or something? In that case the safest action would be to pull over, and if the driver is still not responsive, maybe auto-dial an emergency contact. All the capabilities to do that already exist, and it would also be a good response for drivers who are simply ignoring the warnings.
 
I've watched the video (didn't bother reading through the comments - there are too many). IMG_6295 Were you on Navigate on AP? Around 15 sec is where it seems AP turned off, but what bothers me is that emergency braking didn't activate.

On a side note: when my car was only 4 weeks old, it was involved in a parking lot accident. I got the quote today - the body shop is saying $5,000 for this repair. Yes, I had to wait 2 months for the repair (had to look for a Tesla-certified body shop). Of course insurance is paying for it, but really, 5 grand for this? There's no paint damage, btw.
IMG_0945.JPG
 
I said probably. I’m sure we’ll see what kind of emails were going around after the first accident. The fact that they were already working on a software fix before the second accident makes me think that they did know the full scale of the problem. We’ll see.

Stopping in traffic is fine. Cars break down in traffic all the time. As long as it’s not a common occurrence it seems like it won’t be a big enough problem to justify the extra cost of redundancy.
 
@Finkleson "but really 5 grand for this? there no pain(t) damage btw". That's NUTS. Even for replacing the entire trunk lid! Who's the bandit here? It's how our hospitals charge $500 for a bowel movement. Maybe get a roving undocumented to "good enough" it for $100, pocket the $5k, spend it on FSD (you know, so you can send the Tesla to go pick up the kids in school downtown and bring them home).