The other guy was in a lighter car, which would have had an effect on his survival, but by the same token I'm guessing the Tesla Model X has a sturdier cabin and a better safety rating than the Prius, given all the head-on collisions and accidents I've seen the Model X survive. Might be wrong, but I'd be surprised if not. I'm glad the guy survived his accident, but I also have to wonder whether driving under the influence didn't factor into it.
Glad Dan is on the reporting. I hope Dan investigates how many other cars have crashed here and what the drivers' outcomes were.
Do we know how the Prius crashed? In other words, how was the kinetic energy of the vehicle (and driver) dissipated during the crash?
Coming to an abrupt stop in a head-on collision (especially when hitting an immobile object like the concrete barrier) puts enormous G-forces on the driver's body. And the heavier the car, the more kinetic energy has to be dissipated when it stops abruptly.
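For a rough sense of scale, here's a quick back-of-the-envelope calculation in Python. The curb weights below are ballpark assumptions on my part, not official figures:

```python
# Back-of-the-envelope kinetic energy comparison (KE = 1/2 * m * v^2).
# Curb weights are rough assumptions, not official manufacturer figures.
prius_kg = 1400          # assumed Toyota Prius curb weight
model_x_kg = 2450        # assumed Tesla Model X curb weight
speed_ms = 65 * 0.44704  # 65 mph in meters per second

def kinetic_energy_kj(mass_kg: float, v_ms: float) -> float:
    """Kinetic energy in kilojoules."""
    return 0.5 * mass_kg * v_ms ** 2 / 1000.0

print(f"Prius:   {kinetic_energy_kj(prius_kg, speed_ms):.0f} kJ")    # ~590 kJ
print(f"Model X: {kinetic_energy_kj(model_x_kg, speed_ms):.0f} kJ")  # ~1030 kJ
print(f"Ratio:   {model_x_kg / prius_kg:.2f}x")                      # scales linearly with mass
```

At the same speed, the heavier vehicle carries proportionally more energy that the crash structure (and the attenuator, when intact) has to absorb.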
Good observation, @mongo. The original CHP tweet stating "driving at freeway speeds on the gore point dividing the SR-85 carpool flyover and the carpool lane on US-101 southbound collided with the attenuator barrier and caught fire" seems to indicate that they initially believed the vehicle was driving in the gore point before colliding with the barrier. Whether it was AP or driver error that caused the vehicle to be in that non-lane, we don't know for sure. But based on this info, the front-end damage severity, and the trajectory of the wreckage, I strongly believe that the car was not swerving or making avoidance maneuvers before impact.
The media tends to embellish things. I would not be surprised at all if the "driver lost control" statement was added by a reporter or news editor. Isn't that what they always say when a car collides with a stationary object?
I mentioned in a previous post that Walter (the driver) was a personal friend of mine. Now that his prior complaints about AP are out in the media, I feel comfortable sharing that I heard the same thing from a friend. Apparently he discussed with his wife and at least one other close friend, as recently as the week prior to the accident, that AP was drifting left at this exact junction on previous commutes at around the same time of day. I learned of this allegation a day prior to the I-TEAM news story. Frightening.
If he did indeed experience this issue with AP, it's definitely shocking to me that he would continue to rely on it here. I can only assume he was distracted or otherwise not paying close enough attention. This whole thing is very tragic.
First, my condolences to you and Walter's family. I've spent a lot of time thinking and reading about this accident, and I simply can't imagine what his friends and family are going through.
Second, I have a theory about why someone would intentionally continue to use Autopilot if they thought it had a serious safety issue in a particular section of road.
NOTE: I am in no way suggesting that this is what happened during this crash, but I didn't see this presented as a reason why someone would intentionally do something that they know could recreate a serious safety issue, so I wanted to share this in case it's in some way helpful.
First, a couple of stories about two times I've repeatedly tried to reproduce (less serious) Autopilot behavior.
#1. When we first got my wife's Model X (AP1) in late March 2016, we nearly got in an accident in the first month by assuming that the vehicle would notice a stopped car (at a stoplight) ahead of us in city traffic, specifically on a 45 MPH "expressway" with the car maybe 10-20 car lengths in front of us around a bend in the road. Boy, were we wrong. Fortunately, I hit the brakes hard enough to prevent an accident. From then on, after most software updates, I would re-test this scenario (sometimes with my wife in the passenger seat, usually to her visible discomfort) to see if it had improved. After getting my own Model X (AP2) in February 2017, I continued to test this scenario. I don't recall reporting this specific scenario to my Tesla Service Center (except maybe in passing), and I never asked them to investigate it. I assumed it was a limitation of the way Autopilot worked at that point in the beta. (BTW, as of 10.4, Autopilot on MX AP2 actually does recognize a stopped vehicle at expressway speeds, although it typically recognizes it "later" than I'm comfortable with and the MX brakes moderately hard when stopping, so I don't rely on it to "always" stop in time. I'll either disengage Autopilot or simply start reducing the TACC speed to "hint" that it should start slowing down sooner.)
#2. About 5-6 months ago, I noticed that Autopilot on my MX AP2 would recognize shoulders as lanes (both left and right shoulders in certain sections of specific highways), and that initiating a lane change would actually start the car moving into that "shoulder lane". This concerned me greatly because these "shoulder lanes" frequently narrow, especially leading up to a bridge abutment. I was so concerned that I reported this issue to Tesla's NA Service email address (and my home service center) a few times, including taking a video with my iPhone showing the lane being detected, along with the route, the location, and the time of day when this happened. (I never tried to change lanes while recording video.) Again, after most software updates, I would re-test the "shoulder lane" detection issue in the places where I previously could reproduce it, to see if it had been fixed. Fortunately, testing simply meant driving by that section of road and glancing at the driver's console to see if a "shoulder lane" was detected, nothing more. (Note: I didn't realize others had noticed this specific issue until I read this thread; I just don't have time to keep up with so many threads on the forums.)
--
So hypothetically speaking, let's suppose I found a really serious bug that I thought was a serious safety concern. Further, let's say I told my Tesla Service Center about it, and they tried to reproduce it but couldn't, or they investigated it but found no actions to take.
What would I do?
If I thought it was a serious enough safety concern, and I had seen it multiple times, one thing I would consider doing (prior to this accident) is trying to reproduce the same conditions, in the interest of improving the safety for everyone driving a Tesla vehicle, and to prove to the Service Center that there really was an issue that needed to be addressed.
Why?
As software engineers (I am one), we're either trained or we learn (rather quickly) that the fastest way to fix a software issue is to figure out how to reproduce it, and to capture information while trying to reproduce it, in case we do. Once a software engineer knows how to reproduce an issue, they can fix it. And if one has evidence of it happening (such as a video), it's much harder to refute than a verbal description of the issue.
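To make that concrete, here's a minimal sketch of what I mean by "capturing information": log every reproduction attempt so evidence accumulates over time. Everything here (the file name, the fields, the example values) is hypothetical and of my own invention, not anything Tesla provides:

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

LOG = Path("repro_attempts.csv")  # hypothetical local log file

def log_attempt(software_version: str, location: str,
                reproduced: bool, notes: str = "") -> None:
    """Append one timestamped reproduction attempt to the CSV log."""
    is_new = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["utc_time", "software_version",
                             "location", "reproduced", "notes"])
        writer.writerow([datetime.now(timezone.utc).isoformat(),
                         software_version, location, reproduced, notes])

# Example: re-testing the "shoulder lane" issue after an update.
log_attempt("2018.10.4", "US-101 S near SR-85 flyover", True,
            "shoulder detected as lane; took iPhone video")
```

A dated log like this, paired with videos, is the kind of evidence a service center (or an engineering team) can actually act on, and it's much harder to dismiss than "it happened to me a few times."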
--
Again, I am in no way suggesting that this is what happened during this crash; I'm only sharing it in case it's in some way helpful.
Let's wait for the results of the investigation before jumping to conclusions about what actually happened. (I suspect that it might be over one year before we have preliminary results from the NTSB, though.)