Blog Tesla: Autopilot Was Activated During Fatal Model X Crash

Autopilot was activated when a Model X crashed into a concrete barrier killing the driver last week near Mountain View, Calif., according to a release from Tesla.

“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum,” the company said. “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Damage to the Model X was severe; in fact, Tesla said it has “never seen this level of damage to a Model X in any other crash.” The company blames the severity of the crash on the absence of a crash attenuator, a barrier designed to reduce the impact into a concrete lane divider. The crash attenuator was reportedly destroyed in a separate accident 11 days before the Model X crash and had yet to be replaced.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in an earlier statement. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”

The U.S. National Transportation Safety Board is investigating the crash.

Here’s Tesla’s update in full:

Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.

The safety of our customers is our top priority, which is why we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future. After the logs from the computer inside the vehicle were recovered, we have more information about what may have happened.

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.

In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
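As an aside on the quoted figures, the "3.7 times" and "about 900,000 lives" numbers follow directly from the per-mile rates Tesla cites. A rough check of the arithmetic (my own back-of-the-envelope math, not part of the statement; it assumes the US per-mile rate is representative of the worldwide fleet):

```python
# Back-of-the-envelope check of the figures quoted above (my own arithmetic,
# not part of Tesla's statement; it assumes the US per-mile fatality rate is
# representative of the worldwide fleet).

all_vehicles_miles_per_fatality = 86e6    # "one automotive fatality every 86 million miles"
tesla_ap_hw_miles_per_fatality = 320e6    # "one fatality ... every 320 million miles"

ratio = tesla_ap_hw_miles_per_fatality / all_vehicles_miles_per_fatality
print(f"{ratio:.1f}x lower fatality rate per mile")   # ~3.7x, matching the statement

worldwide_deaths_per_year = 1.25e6        # "about 1.25 million automotive deaths" worldwide
lives_saved = worldwide_deaths_per_year * (1 - 1 / ratio)
print(f"~{lives_saved:,.0f} lives saved per year")    # ~914,000, i.e. "about 900,000"
```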

Photo: @DeanCSmith/Twitter

 
Given my history of calling Tesla out for things in the past, it may seem unusual that I've been somewhat defending them on this particular accident with friends and others when the conversation has come up, essentially noting that the driver just should have been paying attention and that's the end of the discussion.

While that particular sentiment still mostly stands (pay attention, people)... on a recent 1000+ mile trip, my longest using AP2, I noticed some key differences between how AP1 and AP2 handle certain situations. One in particular I'm reasonably certain was a point of note in this crash.

Some notes:

AP1 senses cars in multiple lanes simultaneously, and also senses multiple cars in each lane when possible. AP2 seems to do a variant of this internally, but doesn't display the adjacent lanes.

Additionally, AP1 has a well-defined follow-the-leader state. Autosteer on AP1 can follow a lead car almost indefinitely, regardless of lane markings. AP2 appears to be able to do this to a certain extent, but nowhere near the level AP1 is capable of. (As an example, I can lock AP1 onto a car entering my housing development, which has no lane markings at all, and it will follow them until they reach my house or turn off the road without issue. I've yet to get AP2 to even engage in this situation, let alone actually follow another vehicle.)

Finally, AP1 takes all lanes of traffic into consideration when making decisions about the driving path. If, for example, you're following a vehicle in the left lane of a highway, alongside a lane to the right, and there are indistinct markings that would diverge to the left (as was the case with this fatal crash), AP1 will weigh the position of the lead car and adjacent cars against the suspected lane markings. In this case it will always continue on the path the lead car is taking (or cars detected in an adjacent lane, if there is no lead car) unless the lead car is also diverging from the other lane. AP2, however, doesn't seem to weigh the lead car's position or the position of vehicles in other lanes as heavily when making these decisions, and will continue to follow the lane marking it thinks is correct despite the position of other cars. (In the complete absence of lane markings, AP2 will tend to follow a lead car, but not nearly as accurately as AP1.)
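To make that difference concrete, here's a minimal sketch of the preference I'm describing (my own illustration in Python; the Path type, choose_path function, and 0.5 m divergence threshold are made-up placeholders, not Tesla's actual code or architecture):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Path:
    lateral_offset_m: float  # where this cue says the car should head, relative to its current line

def choose_path(lane_markings: Optional[Path],
                lead_car: Optional[Path],
                adjacent_traffic: Optional[Path],
                divergence_threshold_m: float = 0.5) -> Optional[Path]:
    """AP1-style preference as described above: when the perceived lane markings
    and surrounding traffic disagree, stay with the traffic; fall back to the
    markings only when there is no traffic to reference."""
    traffic = lead_car if lead_car is not None else adjacent_traffic
    if traffic is None:
        return lane_markings      # nothing to follow: trust the paint, however indistinct
    if lane_markings is None:
        return traffic            # no usable markings: follow the traffic
    if abs(traffic.lateral_offset_m - lane_markings.lateral_offset_m) > divergence_threshold_m:
        return traffic            # cues diverge (e.g. faded gore-point lines): stay with the traffic
    return lane_markings          # cues agree: markings are the cleaner signal

# The AP2 behavior described above is effectively the opposite preference:
# follow lane_markings whenever they are detected, regardless of where the traffic is going.
```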

This was VERY obvious to me on my long trip through sections I've driven with AP1 dozens of times, including long-term construction areas with terrible lane markings (old markings, poor markings, no markings, etc.), where AP1 would do commendably well most of the time, but AP2 didn't know its head from its tail.

All of that said... here's where I'm probably going to catch a lot of flak.

I think Tesla should probably be held accountable for this accident.
I firmly believe, given my extensive experience with the systems both as a user and hacker, that if the Model X involved in this accident had been equipped with AP1 and not AP2... this accident would not have occurred.

And herein lies the issue. Tesla knowingly released a system that was, and still is, inferior to its predecessor in almost every conceivable function. This is a system that has potentially fatal safety deficiencies compared to the already existing system. Instead of continuing with the existing system, they decided to throw all of that out the window and let new customers operate a less capable system in order to move their autonomous development in-house for greater control, and thus greater profit, despite the system being provably less functional than the system that already existed years prior. They let their need to get away from Mobileye trump customer satisfaction and ultimately customer safety.

In summary, I feel like Tesla should be held fully responsible for this accident and any other accident that could have been prevented had the vehicle been equipped with AP1 instead of AP2. They released a system that was incomplete and incapable compared to the system that was already available and sold, with the main point being that the older system would have been extremely unlikely to have encountered the same issue that led to this fatal accident. Instead they rushed out hardware with incomplete software to new customers, knowing full well that it was not as capable as the original.

As an enthusiast and overall supporter, I'm sure I and most others could forgive Tesla for a temporary lack of feature parity between AP1 and AP2... but it's been well over a year (18 months now?) without AP2 even coming close to AP1 parity, let alone "EAP". And now that lack of parity has led to a death, and Tesla should be ashamed of themselves.

Edit: clarified some points, fixed some typos and punctuation errors.
But didn’t MobilEye dump Tesla? So it was not possible to continue with AP1 hardware?
 

There were similar comments made after the last CES presentation:

Hey guys,

So, I was watching the Mobileye CES 2017 presentation, and I'm starting to get a little worried about the state of Tesla's Level 3 & 4 autopilot progress.


Maybe it is time to start working on an AP3 version, combining AI capability for predicting optimal driving control using GPS navigation and real-time visual observation.

[Attached image: AP combining GPS prediction.jpg]
 
@wk057, I appreciate your posting this detailed analysis. I'm glad my car is AP1, but aren't there some realistic scenarios where the version of AP biased towards car-following (AP1) produces an accident that the other version would not? Specifically, what if the car immediately ahead of you was moving into the gore (the non-lane) trying to get onto 101 at the last minute? This kind of maneuver is not that uncommon. It could draw your car after it into the gore, if yours had AP1. Then if the car ahead swerved right sharply onto 101 at the last minute, your AP1 car could be heading straight at the barrier. If you had AP2, your car might have stayed in the lane to the left of the gore, avoiding an accident. In other words, follow-the-leader may not always be the desirable choice.

Again, thanks for the analysis.
 
Who do you think the demographic of the Volvo buyer is? They are known, and always have been, for safety over looks, and have always attracted those most interested in safety. Your exact argument for why you apparently think Teslas aren't safe (even though crash test data proves you dead wrong) is the same argument for Volvos... except that their cars are also slower, unsexy, not progressive, and not a temptation to push to the limits, unlike Teslas.

The BMW 5 Series is a faster, sexier car and, as far as deaths go, is every bit as safe as the XC90. From 2012 through 2015, no deaths occurred in a 5 Series.
The common factor is that these cars are driven by the same demographic: 35-55 year olds who make $250k+ per year. Very safe drivers.
 

Good argument. I would hesitate to say full responsibility, but definitely some responsibility. At some point, you can't push everything onto the driver and claim beta testing. Maybe Takata should have labelled their airbags as being in beta testing to shirk any responsibility. I know that is ridiculous, but we are closing in on such a thing happening.

Everyone can internet argue all they like, but what it comes down to is how this is portrayed in front of a jury.

Ladies and Gentlemen... is it reasonable to believe that one would assume their car wouldn't drive into a stationary object? Would you assume that Automatic Emergency Braking, that is marketed as detecting soft and hard stationary objects and stopping the car... would have worked and saved this man's life? We have seen others recreating the same error in both of these systems. These drivers were paying full attention and barely stopped in time. It is easy to suggest the driver was distracted, but we do not know for sure. Perhaps he was checking his side and rearview mirrors. He took his eyes off the road for 3 or 4 seconds, and that was enough time for this disaster to occur. No distraction there, simply a part of driving.

Tesla will look terrible tearing apart the driver.

This day is coming and they need to own it.
 
Excellent, intelligent analysis. Best so far on this site, and thanks for cross-posting to this better thread.

But

I don't think you have sufficient data to conclude that because AP2 is worse in this scenario, it is overall statistically worse than AP1, especially at its current level.

It is highly likely that AP2 is better than AP1 in some different scenarios --- a semi crosswise blocking the road perhaps....

But I agree that following the car in front should probably be more heavily weighted.

And Caltrans should do much, much better to mark lane lines, mark gore zones, and reset attenuators promptly.

Caltrans is still way more blameworthy than Tesla for this fatality.

 

Sometimes the lane markings are wrong (the construction crash, and 101).
Sometimes the car in front is wrong (fire truck crash last second swerve, sort of).
No matter which you trust, you will be wrong sometimes, so picking one to weigh more seems unreliable.
 
Ladies and Gentlemen... is it reasonable to believe that one would assume their car wouldn't drive into a stationary object?
No.

Would you assume that Automatic Emergency Braking, that is marketed as detecting soft and hard stationary objects and stopping the car... would have worked and saved this man's life?
AEB is not marketed that way.
Warning: Automatic Emergency Braking is not designed to prevent a collision. At best, it can minimize the impact of a frontal collision by attempting to reduce your driving speed. Depending on Automatic Emergency Braking to avoid a collision can result in serious injury or death.

The forward looking camera(s) and the radar sensor are designed to determine the distance from an object (vehicle, motorcycle, bicycle, or pedestrian) traveling in front of Model S. When a frontal collision is considered unavoidable, Automatic Emergency Braking is designed to apply the brakes to reduce the severity of the impact.
Page from the Model S manual attached (aeb.PNG).
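For illustration, a generic sketch of the physics (my own, not Tesla's implementation; the 8 m/s² deceleration and 25 m trigger distance are made-up placeholders): a system that only brakes once a collision is already unavoidable can scrub off speed, but it cannot avoid the impact.

```python
# Generic sketch (not Tesla's implementation; the deceleration and trigger
# distance are made-up numbers) of why braking only once a collision is
# "considered unavoidable" reduces severity rather than preventing the impact.

def stopping_distance_m(speed_mps: float, decel_mps2: float = 8.0) -> float:
    """Distance needed to brake from speed_mps to a full stop at constant deceleration."""
    return speed_mps ** 2 / (2.0 * decel_mps2)

def impact_speed_mps(speed_mps: float, trigger_distance_m: float,
                     decel_mps2: float = 8.0) -> float:
    """Speed remaining at impact if full braking starts trigger_distance_m
    from a stationary obstacle (v^2 = v0^2 - 2*a*d)."""
    remaining = speed_mps ** 2 - 2.0 * decel_mps2 * trigger_distance_m
    return max(remaining, 0.0) ** 0.5

highway_speed = 31.0  # roughly 70 mph, in m/s
print(stopping_distance_m(highway_speed))      # ~60 m needed to stop completely
print(impact_speed_mps(highway_speed, 25.0))   # braking at 25 m out still leaves ~24 m/s (~53 mph) at impact
```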


We have seen others recreating the same error in both of these systems. These drivers were paying full attention and barely stopped in time. It is easy to suggest the driver was distracted, but we do not know for sure. Perhaps he was checking his side and rearview mirrors. He took his eyes off the road for 3 or 4 seconds, and that was enough time for this disaster to occur. No distraction there, simply a part of driving.

Then they also were not paying attention. A driver in control of the vehicle would not have allowed it to follow the path toward the barrier.

Tesla will look terrible tearing apart the driver.
Why would they tear apart the driver?
Was he driving?
Did the system prevent him from taking action?

Is that tearing him apart?
 
Let me clarify my earlier statements a bit. (@bonnie and others)

Don't get me wrong. I never said the driver wasn't culpable in this. I fully and wholeheartedly believe this is an excellent "Darwin Award" situation since the driver just let their vehicle drive off into a barrier without paying attention. I personally have little sympathy for that, same with pedal misapplication stuff. *That* is not Tesla's fault. In the eyes of the law and common sense, I think this is entirely the driver's fault and it should be left at that. Driver error.

I also never said the *driver* needed any AP1 experience for this to matter. *Tesla* knows full well the disparity between AP1 and AP2, and must know the safety considerations and drawbacks of AP2 vs AP1.

The main point is that I think Tesla loses the right to the moral high ground on the matter, because of their rush to implement AP2 and ditch AP1 entirely. I'm 99.9% sure that AP1 would not have suffered from this particular error in the same situation, especially if there were any cars in front of the X in its or any other lane, in which case we wouldn't be having this discussion. (Edit: A long-time contact at Tesla, who wishes to remain anonymous, obviously, confirmed the same to me as I was writing this.) Tesla above all should be well aware of the disparity between their original system and their new one. I feel pretty badly for the developers who worked on these systems and know this, too, and may even feel somewhat responsible. It's Tesla's arrogance, IMO, that bears responsibility for this crash. Had they not burned their bridges with Mobileye and continued developing a sensible system, we'd likely already be pretty close to FSD capability by now. Instead, Tesla decided to set back functionality and safety by years in developing their own system.

I have enough technical data to know exactly how AP1 handles these situations. See some of my older posts about my augmented AP1 car. For AP2, I have the firmware for this as well, and have done analysis on many aspects of it in my work to see if a retrofit is viable and if it can be augmented with my AP1-type modifications. I know enough about the system and how it handles particular inputs from both an observational perspective as a user and from a technical perspective. I qualify many of my statements related to such reverse engineering efforts as "seems to" because there is always the chance that my analysis isn't perfect. In this case, however, I'd put the odds heavily in my favor given the data.

I had an issue with Tesla releasing AP2 cars from the start. They took huge steps backwards, and were delivering cars to people that lacked basic functionality previously available to all owners, without any warning to anyone. It's taken them 18 months to get the system even close to AP1 parity, and it's still terrible in comparison in many aspects. There are relatively few areas where it meets or maybe slightly exceeds AP1's capabilities thus far, all of which are highly subjective. From a purely objective and quantifiable view, the system is still far inferior to AP1 by almost every metric. From day one AP2 cars lacked key safety features already available on AP1 cars, such as AEB, side collision protection, etc. They finally started rolling these things out, and thankfully there were no accounts of injuries or deaths in that period.

My biggest issue is that Tesla just threw the baby out with the bath water in their spat with Mobileye, and now an owner has paid the ultimate price as a direct result of that decision. Yes, the driver should have been paying attention... but the entire system should have been 2 years more advanced than original AP1 by now, also, not a year or more behind what already existed... not even counting that AP1 wouldn't have even had this particular failing in its stale development state from 2 years ago already.

Are they liable for any kind of damages as a result? Probably not. Should we as owners and enthusiasts let them continue to get away with the practices they keep following: massive misinformation about products, deadlines missed by huge margins, large steps backwards in tech, etc.? Absolutely not.

Everyone at Tesla involved in ditching AP1 should take a step back and admit they screwed up at this point. This tragedy was preventable by them. And that's an issue to me.
 

You are saying dumping AP1/ME was Tesla's decision. How do you reconcile that with Elon Musk Says Mobileye Forced Tesla Vision "Across the Rubicon" | Inverse?

“The original plan was to have a migration strategy where we have Mobileye and Tesla Vision operating at the same time, to have kind of a smooth process. But Mobileye refused to do that,” Musk said during an earnings call with investors on Wednesday. “So that forced us to re-spin the board and kind of cross the Rubicon on Tesla Vision.”

Was Mobileye willing to continue working with Tesla? WSJ seems to think not
Mobileye Ends Partnership With Tesla
 
Sometimes the lane markings are wrong (the construction crash, and 101).
Sometimes the car in front is wrong (fire truck crash last second swerve, sort of).
No matter which you trust, you will be wrong sometimes, so picking one to weigh more seems unreliable.

I would modify that to focus on whether the perceived car in front is wrong RELATIVE TO the perceived lane markings.

The main way the car in front would be wrong, relative to the lane markings, is if the car deviated from the lane and drove off a cliff. Otherwise, generally, if the car in front is going there, at least there is a path of travel, perhaps blazed by the car in front, and perhaps the car in front will at least serve as an attenuator that CalTrans cannot deny you.

In the case of the fire truck, the car in front deviating from the lane is actually correct, to avoid hitting truck, but it is a tough call to follow a car making a drastic lane change.

At least in California where CalTrans negligently maintains the lane markings, I'd prefer to trust the car in front and follow the car in front even if they deviate from the perceived lane markings (which CalTrans maintains to direct vehicles right into medieval impaling devices).
 
For those of you that think the accident would not have happened had autopilot not been engaged, remember it was just a week before that a Prius did the same thing (and the driver walked away.)


@wk057, since you are here, I have to ask: since Tesla was able to determine various parameters from the wreck, does that mean they can fully reconstruct what happened so they can learn enough to prevent future accidents of this type?
 

In favor of what you say, I remember in a quarterly call or release after the MobilEye relationship fell apart that Elon said that the original plan was to have both vision systems in their new cars and to cut over from MobilEye's once the Tesla system was ready for prime time. But then MobilEye refused to allow this, so AP2 cars shipped with the Tesla hardware and firmware only. The fact that Tesla felt a need to continue to include the Mobileye system, with its extra cost, in their cars shows that they realized that their own system was not adequate at that time. When they made that decision I'm sure they thought it would have been adequate fairly soon, but things have a way of slipping.
 

I have the opposite interpretation. Tesla wanted to keep using ME while developing AP2, but ME wouldn't allow it (likely since ME realized it would be replaced eventually). So Tesla had to either stay bound to ME forever, or split off, even though they had nothing in place to do so with.
 

If we are talking cliff edges and such, I agree. I was thinking of the cases where a car is going in the wrong place (like the gore point) and then ducks back to the correct spot. For the fire truck, the lead vehicle seems like it was too fast and too close to the fire truck, thus putting the Tesla in a bad situation. Basically, situations where the Tesla ends up in an incorrect spot that seems valid.

I'm probably experiencing software engineer paranoia with rare edge cases...
 
Also, I recall AP1 when it was just rolled out in 2015. There were plenty of times where I had to disengage it and make major driver input decisions. But going through that exercise in 2015 likely inoculated me from the much higher expectations that new buyers now have.
 
If we accept as gospel everything Elon has said about the split with MobileEye, there is a third choice: not to offer a new system until it is ready. Tesla perhaps was "forced" not to use ME any longer, but that is not the same as Tesla being "forced" to do anything they did with AP2.

At any rate, taking what Elon says as gospel seems like a shaky standpoint on which to build a case.

3 months. 6 months. Etc.
 

Sure: his side, their side, the uninterpreted truth.
I dig.
But the do-nothing option would still run afoul of wk's "it doesn't have the AP1 feature set" complaint.

From day one AP2 cars lacked key safety features already available on AP1 cars, such as AEB, side collision protection, etc. They finally started rolling these things out, and thankfully there were no accounts of injuries or deaths in that period.

Along with the assertion that they should have kept ME code (not sure how they were supposed to do that):
Everyone at Tesla involved in ditching AP1 should take a step back and admit they screwed up at this point. This tragedy was preventable by them

Those are the points I'm pushing back on.