
Model X Crash on US-101 (Mountain View, CA)

"The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator."

"the adaptive cruise control follow-distance set to minimum"

This is a cherry-picked detail that really has nothing to do with the cause of the accident. Follow-distance is irrelevant when no car is being followed. Also, is Tesla advising that minimum follow distance is inappropriate for normal freeway use?

Follow-distance is also tied into response distance from objects. Yes, a minimum follow distance of 1 is inappropriate for any speed above 15-25 mph, IMHO.
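
To put rough numbers on it, here's a back-of-the-envelope sketch. The assumption that setting 1 corresponds to roughly a one-second gap is mine; Tesla doesn't publish the actual mapping:

```python
# Distance covered during a ~1 s following gap at various speeds.
# Assumption (mine, not Tesla's): follow-distance setting 1 maps to
# roughly a one-second time gap; the real mapping isn't published.
MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

for mph in (25, 45, 65):
    gap_m = mph * MPH_TO_MPS * 1.0  # meters traveled in one second
    print(f"{mph} mph -> ~{gap_m:.0f} m of buffer at setting 1")
```

At 65 mph that's only about 29 m of buffer, which is why a setting of 1 feels fine in stop-and-go traffic and reckless on the freeway.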

"
Then, later, while it was still under the party-silence agreement, Tesla made the following statement to the press:

"The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so."

Well, on a clear day, with 150 meters of unobstructed visibility, an attentive driver would have avoided the collision. So this detail is relevant as well.
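
The NTSB numbers are easy to sanity-check. A minimal sketch; the 0.7 g braking figure and the 1.5 s reaction time are generic assumptions of mine, not from the report:

```python
# Sanity check of the NTSB figures: 150 m of clear view covered in ~5 s
# implies roughly 30 m/s (~67 mph), a normal freeway speed.
v = 150 / 5                # m/s
decel = 0.7 * 9.81         # assumed hard-braking deceleration, ~0.7 g
reaction = 1.5             # assumed driver reaction time, s
stop_m = v * reaction + v ** 2 / (2 * decel)
print(f"~{v:.0f} m/s ({v / 0.44704:.0f} mph); full stop in ~{stop_m:.0f} m < 150 m")
```

Even with a generous reaction time, that leaves roughly 40 m to spare for a full stop, let alone a lane change.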
 
  • Like
Reactions: T34ME
Are there more or fewer collisions (and more or fewer serious injuries, etc.) with or without AP, even in its current state of development?

The answer depends on how you define AP. If you define AP as the AP1 or AP2 hardware suite (and all software/functionality dependent on it), then the answer is almost certainly that "AP" increases safety.

However, that definition of AP includes a whole bunch of passive safety features that are undeniably increasing safety: things like AEB, FCW, and BSM.

When most people talk about AP, they aren't using the term that broadly. Usually "AP" refers to the mode where both traffic-aware cruise control (TACC) and lane keeping (AS) are active at the same time.

I have no idea whether the simultaneous TACC/AS function has a statistically significant effect on safety. As far as I know, no one outside of Tesla does. The data just isn't out there.

So we don't know if this functionality adds to safety when compared with a car that has all the other safety features but lacks the ability to activate AS.
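
For anyone wondering what settling this would even take, here's a minimal sketch of the rate comparison. Every number below is hypothetical; the point is that only Tesla has the real counts and mileage:

```python
# Sketch of the comparison no one outside Tesla can run: a two-sample
# z-test on crash rates per mile. All inputs below are hypothetical.
from math import sqrt

crashes_as, miles_as = 80, 100e6   # hypothetical: TACC + Autosteer active
crashes_no, miles_no = 130, 100e6  # hypothetical: same car, Autosteer unused

r_as, r_no = crashes_as / miles_as, crashes_no / miles_no
# For Poisson counts, Var(count/exposure) ~= count / exposure^2
se = sqrt(crashes_as / miles_as**2 + crashes_no / miles_no**2)
z = (r_no - r_as) / se
print(f"{r_as * 1e6:.2f} vs {r_no * 1e6:.2f} crashes per million miles; z = {z:.2f}")
```

With these made-up numbers the difference would be significant; with the real ones, nobody outside Tesla can say.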
 
  • Disagree
Reactions: bhzmark
In reply to Economite's reference to professionals doing the testing: is he referring to the Uber "safety driver" who was clearly texting or surfing the web at the time of impact?

I don't argue that Uber was doing a good job of implementing its testing program. Uber strikes me as being far more reckless in its autonomous testing/development than other players.

However, it makes no sense to me to argue that, because Uber hired an inattentive/incompetent tester, it is fine to just use the general Tesla ownership population as testers, without giving them any guidelines for how to use the product and without creating a testing protocol of which they are aware.

Effective testing requires that testers have a specific set of behaviors to test and also that they record actual data.
 
Probably not a good idea to rest your legal expertise on LegalZoom.

Defaming The Dead | FindLaw

Says the guy who quotes FindLaw... :)

[Not actually being critical of you here; I just found the irony irresistible. And I haven't read the links and have no idea what the actual law is on this type of defamation claim, so I am neither saying that I think your legal interpretation is right, nor that it's wrong.]
 
CBS did an interview with Elon Musk while he drove a Model 3 last week. Tellingly, even he took his hands off the wheel while using AP (check at 2:27). Kind of makes you wonder whether he really expects that drivers will keep their hands on the wheel when they use AP, or if Tesla is basically using the "keep your hands on the wheel" mantra so that it can blame users when they "misuse" AP in exactly the same manner as Elon does. Also... note the exchange between the two reporters starting at 3:28 about Musk being unconcerned about having his hands off the wheel.

Tesla CEO Elon Musk addresses autopilot system safety concerns: "We'll never be perfect"
 
  • Informative
Reactions: alcibiades
CBS did an interview with Elon Musk while he drove a Model 3 last week. Tellingly, even he took his hands off the wheel while using AP (check at 2:27). Kind of makes you wonder whether he really expects that drivers will keep their hands on the wheel when they use AP, or if Tesla is basically using the "keep your hands on the wheel" mantra so that it can blame users when they "misuse" AP in exactly the same manner as Elon does. Also... note the exchange between the two reporters starting at 3:28 about Musk being unconcerned about having his hands off the wheel.

Tesla CEO Elon Musk addresses autopilot system safety concerns: "We'll never be perfect"

Or perhaps Musk has conditioned himself to know when to trust the system and when to take over; he remains attentive to the vehicle's path at all times during the interview. I believe we all do the same, relying on our own understanding and driving experience to determine what the car can and cannot do. Best to treat it as a very inexperienced driver until it has proven itself reliable for several years, just as we do with our children when teaching them the rules of the road.
 
http://www.epcgroup.com/wp-content/uploads/H472_Sept2015.pdf

...
BIG SAVINGS WITH SMART CUSHION
In an earlier submission (for the California DOT nomination to the AASHTO Technology Implementation Group on 9 September 2011) Caltrans gave the following statistics based on their 5-year experience from November 2006 to August 2011 (when there were approximately 140 units installed on California roads):
✔ Estimated saving on frontal impacts is $2.7M. Additional side impact savings are estimated at $1.4M+.
✔ An estimated 370 crew dispatches were not required because of no damage on side impacts.
✔ For estimated repairs, there are savings on frontal impacts and side impacts when compared to alternate attenuators.
✔ Savings can be significant due to the low cost of repair parts (approximately $40), decreased repair time (usually under 30 minutes) and reduced worker exposure.
✔ It is possible to repair the attenuator during incident management thereby eliminating a future site visit and lane closure.
At the time of writing this article, there are currently more than 300 SMART CUSHION SCI100 units in use in California.
...

When they say "It is possible to repair the attenuator during incident management thereby eliminating a future site visit and lane closure," it sounds like they are suggesting it would be good to reset the smart cushion when coming out to clean up debris from the most recent accident. (Instead, what we see is some orange cones left behind and crews coming out weeks later, which leaves the gore point extra dangerous during that time...)

...The Nevada DOT Engineer stated that:
1. The DOT’s main consideration is getting crews in and out as quickly and as safely as possible when resetting or repairing a system; and
2. To select the type of system to be installed, the main factors are survivability and lifetime costs, with life-cycle costs the key consideration.
...
• The tool kit is in supervisors’ trucks;
• Ability to repair or reset during initial accident call;
• Reduced system downtime;
• Minimal out-of-service time to the travelling public equals a safer highway system
...

Hopefully Caltrans trains more people in resetting these Smart Cushions so that resets can happen more quickly. Resetting them "during initial accident call" sounds prudent to me.

...“Our goal is to reset the damaged SMART CUSHION prior to the accident being cleaned up. This is accomplished 90% of the time if crews are on duty and if we are notified (of the accident).” “We average at least three resets or repairs a week in Las Vegas. SMART CUSHION units can be reset numerous times with proper inspections and maintenance. We have SMART CUSHIONs that have been reset 20 times without any major repairs before they show signs of needing to be replaced.”...
 
Or perhaps Musk has conditioned himself to know when to trust the system and when to take over; he remains attentive to the vehicle's path at all times during the interview. I believe we all do the same, relying on our own understanding and driving experience to determine what the car can and cannot do. Best to treat it as a very inexperienced driver until it has proven itself reliable for several years, just as we do with our children when teaching them the rules of the road.

Here's a kind of interesting quote from Musk, stating something that I forgot he had admitted:

"One of the ironies that we’ve seen is counter intuitive and a lot of people on the consumer watchdog sites and in some cases on regulatory sites have assumed that Autopilot accidents are more likely for new users. In fact, it is the opposite. Autopilot accidents are far more likely for expert users. It is not the neophytes. It’s the experts.


They get very comfortable with it and repeatedly ignore the car’s warnings. It’s like a reflex. The car will beep at them, they tug the wheel, the car will beep at them, they tug the wheel, and it becomes an unconscious reflex action. So we will see half a dozen or more, sometimes as many as 10 warning in one hour continuously ignored by the driver. We really want to avoid that situation."

Transcript: Elon Musk’s press conference about Tesla Autopilot under v8.0 update [Part 2]
 
The recording I saw of the Uber accident was split screen, one showing the video out front of the car and the other showing the driver. The driver was CLEARLY reading something on his phone, and had a completely surprised (and horrified) reaction at the split-second of the accident.
Just an FYI, but the driver in the Uber incident in AZ was female.

Here's a kind of interesting quote from Musk, stating something that I forgot he had admitted:

"One of the ironies that we’ve seen is counter intuitive and a lot of people on the consumer watchdog sites and in some cases on regulatory sites have assumed that Autopilot accidents are more likely for new users. In fact, it is the opposite. Autopilot accidents are far more likely for expert users. It is not the neophytes. It’s the experts.

They get very comfortable with it and repeatedly ignore the car’s warnings. It’s like a reflex. The car will beep at them, they tug the wheel, the car will beep at them, they tug the wheel, and it becomes an unconscious reflex action. So we will see half a dozen or more, sometimes as many as 10 warning in one hour continuously ignored by the driver. We really want to avoid that situation."

Transcript: Elon Musk’s press conference about Tesla Autopilot under v8.0 update [Part 2]

Interesting. I've probably logged over 10,000 miles on Autopilot and have never had it beep at me. I guess I'm doing a good job at paying attention.
 
  • Like
Reactions: EVie'sDad
the data is not there . . . we don't know if this functionality adds to safety when compared with a car with all the other safety features, but without the ability to activate AS.

Read the NHTSA report. They specifically distinguish Autosteer from other AP functions. And review other data.

"Figure 11 shows the rates calculated by ODI for airbag deployment crashes in the subject Tesla vehicles before and after Autosteer installation. The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation."


[NHTSA Figure 11: airbag deployment crash rates in Tesla vehicles before and after Autosteer installation]


You specifically would really benefit from reading the NHTSA report.

It also says: "Autosteer is designed for use on highways that have a center divider and clear lane markings. The system does not prevent operation on any road types. The driver is responsible for deciding when the road type and other conditions are appropriate for system activation. The hands-on warnings occur more frequently as a function of vehicle speed, road class, and existence of heavy traffic."

"Some crashes occurred in environments that are not appropriate for semi-autonomous driving (e.g., city traffic, highway entrance/exit ramps, construction zones, in heavy rain, and road junctions/intersections)."

"Advanced Driver Assistance Systems, such as Tesla’s Autopilot, require the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes. Automated Emergency Braking systems have been developed to aid in avoiding or mitigating rear-end collisions. The systems have limitations and may not always detect threats or provide warnings or automatic braking early enough to avoid collisions. Although perhaps not as specific as it could be, Tesla has provided information about system limitations in the owner’s manuals, user interface and associated warnings/alerts, as well as a driver monitoring system that is intended to aid the driver in remaining engaged in the driving task at all times."

https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF
 
Here's a kind of interesting quote from Musk, stating something that I forgot he had admitted:

"One of the ironies that we’ve seen is counter intuitive and a lot of people on the consumer watchdog sites and in some cases on regulatory sites have assumed that Autopilot accidents are more likely for new users. In fact, it is the opposite. Autopilot accidents are far more likely for expert users. It is not the neophytes. It’s the experts.


They get very comfortable with it and repeatedly ignore the car’s warnings. It’s like a reflex. The car will beep at them, they tug the wheel, the car will beep at them, they tug the wheel, and it becomes an unconscious reflex action. So we will see half a dozen or more, sometimes as many as 10 warning in one hour continuously ignored by the driver. We really want to avoid that situation."

Transcript: Elon Musk’s press conference about Tesla Autopilot under v8.0 update [Part 2]

I believe he owned the vehicle for about four months. I definitely would not call him an expert. I have two and three-quarter years of experience and almost 30,000 miles, and I still remain attentive. To do otherwise would be irresponsible and dangerous.
 
Read the NHTSA report. They specifically distinguish Autosteer from other AP functions. And review other data.

"Figure 11 shows the rates calculated by ODI for airbag deployment crashes in the subject Tesla vehicles before and after Autosteer installation. The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation."

As the NHTSA report also states:

"Since it released Autopilot in October 2015, Tesla has made continuous updates to the system’s firmware that are made available to consumers as OTA updates. These updates have included changes to improve TACC, AEB and Autosteer performance, as well as adding new driver assistance safety features, such as In-Path Stationary Object (IPSO) braking and Pedal Misapplication Mitigation (PMM). In September 2016, Tesla released its 8.0 firmware update which included revisions in the driver monitoring strategy, as well as several enhancements to AEB, DBS, and TACC performance."

This means that the addition of Autosteer is not the only difference between the post-AS and pre-AS samples. The miles covered by the post-AS sample were in cars that not only added AS but also benefitted from software improvements (not present in the pre-AS sample) to TACC, AEB, IPSO and PMM. So there is no way to know whether the reduction in crashes per million miles from 1.3 (in the pre-AS sample) to 0.8 (in the post-AS sample) is attributable to the presence of AS or to the improvements to TACC, AEB, IPSO, PMM or some other features. Indeed, it is possible (but not provable from the information that has been made public) that AS actually increases crash rates, but that this effect was washed out by decreases in crash rate attributable to the other features improved OTA.
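
For what it's worth, the headline number itself checks out; it's the attribution that doesn't follow. A two-line check:

```python
# NHTSA's figures: 1.3 crashes per million miles pre-Autosteer, 0.8 after.
pre, post = 1.3, 0.8
print(f"reduction: {(pre - post) / pre:.1%}")  # ~38.5%, i.e. "almost 40 percent"
# The 0.5 drop is the combined effect of AS *plus* the OTA-improved
# TACC/AEB/IPSO/PMM; the report gives no way to split it between them.
```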

It also says: "Autosteer is designed for use on highways that have a center divider and clear lane markings. The system does not prevent operation on any road types. The driver is responsible for deciding when the road type and other conditions are appropriate for system activation. The hands-on warnings occur more frequently as a function of vehicle speed, road class, and existence of heavy traffic."

"Some crashes occurred in environments that are not appropriate for semi-autonomous driving (e.g., city traffic, highway entrance/exit ramps, construction zones, in heavy rain, and road junctions/intersections)."

Yes... NHTSA said this, and I do not dispute their observation. The problem is that whatever standard they are using for determining what is an "environment that [is] not appropriate for semi-autonomous driving (e.g., city traffic, highway entrance/exit ramps, construction zones, in heavy rain, and road junctions/intersections)" hasn't really been stated clearly by Tesla to Tesla users. If, aside from some broad language deep in the constantly changing user manual, Tesla won't give drivers specific instructions about what an "appropriate environment" is, how is a driver supposed to accurately fulfill their responsibility "for deciding when the road type and other conditions are appropriate for system activation"? Tesla can't both (i) avoid giving clear guidelines for appropriate use and then (ii) seek to avoid responsibility when a driver uses the vehicle in a manner that Tesla/NHTSA deems "inappropriate" under unpublished, difficult-to-locate, or vague standards.

Although perhaps not as specific as it could be, Tesla has provided information about system limitations in the owner’s manuals, user interface and associated warnings/alerts, as well as a driver monitoring system that is intended to aid the driver in remaining engaged in the driving task at all times.

NHTSA found that the information Tesla provides about system limitations is "perhaps not as specific as it could be." It is hard to understand, given this finding, why Tesla has chosen not to release more specific (and more visible) instructions on proper AP use. Why not develop the best possible user instructions, given that misuse can lead to death and injuries for the driver, passengers, and others on the road?

Furthermore, in NTSB's report on the Florida crash (https://www.ntsb.gov/investigations/AccidentReports/Reports/HAR1702.pdf), NTSB found:

"Today’s vehicle automation systems can assess the vehicle’s route and determine whether it is appropriate to the system’s ODD.61 But Tesla’s Autopilot remains available to the driver, even under some conditions that do not meet its ODD. This situation allows the driver to activate automated systems in locations and circumstances for which their use is not appropriate or safe. The NTSB concludes that if automated vehicle control systems do not automatically restrict their own operation to those conditions for which they were designed and are appropriate, the risk of driver misuse remains. Therefore, the NTSB recommends that manufacturers of vehicles equipped with Level 2 vehicle automation systems incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed. The NTSB further recommends that NHTSA develop a method to verify that manufacturers of vehicles equipped with Level 2 vehicle automation systems incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed. Finally, to ensure that vehicle manufacturers that do not currently produce vehicles equipped with Level 2 automation but may do so in the future are aware of the significance of this issue, the NTSB recommends that the Alliance of Automobile Manufacturers and the Association of Global Automakers notify their members of the importance of incorporating system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed."
 
  • Like
Reactions: NerdUno
Tesla did add a safeguard for when AP was used on undivided streets: limiting its speed.

Initially I believe it was set to the speed limit, but then a bunch of owners got mad, so Tesla raised it to 5 mph over. The problem with this fatal accident is that it happened in a location that Tesla's AP should have handled just fine. In fact, the Tesla blog tried to point out how many times AP had successfully handled that section.

There is absolutely nothing about that section that would have concerned me, unless I had previously had bad experiences there.
 
As the NHTSA report also states:

"Since it released Autopilot in October 2015, Tesla has made continuous updates to the system’s firmware that are made available to consumers as OTA updates. These updates have included changes to improve TACC, AEB and Autosteer performance, as well as adding new driver assistance safety features, such as In-Path Stationary Object (IPSO) braking and Pedal Misapplication Mitigation (PMM). In September 2016, Tesla released its 8.0 firmware update which included revisions in the driver monitoring strategy, as well as several enhancements to AEB, DBS, and TACC performance."

This means that the addition of Autosteer is not the only difference between the post-AS and pre-AS samples. The miles covered by the post-AS sample were in cars that not only added AS but also benefitted from software improvements (not present in the pre-AS sample) to TACC, AEB, IPSO and PMM. So there is no way to know whether the reduction in crashes per million miles from 1.3 (in the pre-AS sample) to 0.8 (in the post-AS sample) is attributable to the presence of AS or to the improvements to TACC, AEB, IPSO, PMM or some other features. Indeed, it is possible (but not provable from the information that has been made public) that AS actually increases crash rates, but that this effect was washed out by decreases in crash rate attributable to the other features improved OTA.

Yes... NHTSA said this, and I do not dispute their observation. The problem is that whatever standard they are using for determining what is an "environment that [is] not appropriate for semi-autonomous driving (e.g., city traffic, highway entrance/exit ramps, construction zones, in heavy rain, and road junctions/intersections)" hasn't really been stated clearly by Tesla to Tesla users. If, aside from some broad language deep in the constantly changing user manual, Tesla won't give drivers specific instructions about what an "appropriate environment" is, how is a driver supposed to accurately fulfill their responsibility "for deciding when the road type and other conditions are appropriate for system activation"? Tesla can't both (i) avoid giving clear guidelines for appropriate use and then (ii) seek to avoid responsibility when a driver uses the vehicle in a manner that Tesla/NHTSA deems "inappropriate" under unpublished, difficult-to-locate, or vague standards.

NHTSA found that the information Tesla provides about system limitations is "perhaps not as specific as it could be." It is hard to understand, given this finding, why Tesla has chosen not to release more specific (and more visible) instructions on proper AP use. Why not develop the best possible user instructions, given that misuse can lead to death and injuries for the driver, passengers, and others on the road?

Furthermore, in NTSB's report on the Florida crash (https://www.ntsb.gov/investigations/AccidentReports/Reports/HAR1702.pdf), NTSB found:

"Today’s vehicle automation systems can assess the vehicle’s route and determine whether it is appropriate to the system’s ODD.61 But Tesla’s Autopilot remains available to the driver, even under some conditions that do not meet its ODD. This situation allows the driver to activate automated systems in locations and circumstances for which their use is not appropriate or safe. The NTSB concludes that if automated vehicle control systems do not automatically restrict their own operation to those conditions for which they were designed and are appropriate, the risk of driver misuse remains. Therefore, the NTSB recommends that manufacturers of vehicles equipped with Level 2 vehicle automation systems incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed. The NTSB further recommends that NHTSA develop a method to verify that manufacturers of vehicles equipped with Level 2 vehicle automation systems incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed. Finally, to ensure that vehicle manufacturers that do not currently produce vehicles equipped with Level 2 automation but may do so in the future are aware of the significance of this issue, the NTSB recommends that the Alliance of Automobile Manufacturers and the Association of Global Automakers notify their members of the importance of incorporating system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed."

Neither TACC nor FCW was enabled until January 2015.

My understanding is AP1 was released around the Sep 2014 time frame.

So that's 4 months of neither, and both of those are major safety features.

AEB and blind-spot monitoring weren't released until March 2015. So that's 6 months without those.

So it's really only 6-7 months of data on AP1 cars with the full feature set before Autosteer was released in Oct 2015. Plus, those 6 months were warmer months when people drive faster, not the cold winter months.

Here is a link to where I got the data:
Model S software/firmware changelog
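
To make the overlapping windows concrete, here's the timeline above as a quick sketch (dates approximate, per the changelog):

```python
from datetime import date

# AP1 feature rollout, approximately, per the changelog linked above.
ap1_hw    = date(2014, 9, 1)   # AP1 hardware starts shipping
tacc_fcw  = date(2015, 1, 1)   # TACC and FCW enabled
aeb_bsm   = date(2015, 3, 1)   # AEB and blind-spot monitoring enabled
autosteer = date(2015, 10, 1)  # Autosteer released

def months(a, b):
    return (b.year - a.year) * 12 + (b.month - a.month)

print(f"AP1 hardware without TACC/FCW:    {months(ap1_hw, tacc_fcw)} months")
print(f"AP1 hardware without AEB/BSM:     {months(ap1_hw, aeb_bsm)} months")
print(f"Full passive suite pre-Autosteer: {months(aeb_bsm, autosteer)} months")
```

So the "pre-Autosteer" baseline mixes months with very different safety feature sets, which is exactly the confounding problem discussed above.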
 
  • Like
Reactions: bhzmark

That is a really strange magazine. The articles are unsigned and (based on the masthead) there doesn't seem to be a division between editorial (content) and publishing (business and advertising) staffs. The tone at the start of the article is kind of odd:

"SMART CUSHION proves it delivers value in more ways the one following initial impacts in Australia and New Zealand.
While in these days of tight budgetary constraints and ever-increasing demands to ‘do more with less’ it may be tempting to opt for a product or solution with a lower initial cost, when it comes to road safety barriers, ‘whole-of-life’ cost benefit analysis is a critical consideration. Simply, low initial cost does not always equate to getting a good return on the investment. This is particularly true for impact protection systems, which, by their very nature, are extremely likely to require repairs and/or replacement parts following a vehicular impact. Put simply, what may appear at the outset to be a ‘better value’ solution can, in fact, end up being an extremely expensive selection, with repair costs quickly adding up to multiples of the initial purchase price. If every impact results in a majority or even total replacement of the unit, perceived savings can soon disappear – and the costs will continue to escalate… year after year!"

This might just reflect a difference in the way magazines present things in Australia as opposed to here in the States. But I do kind of wonder whether that article (beginning on page 56) isn't actually more or less a paid advertisement.
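
Advertorial or not, the whole-of-life arithmetic is easy to sketch. The ~$40 repair-parts figure comes from the Caltrans excerpt upthread; every other number below is purely hypothetical, since unit prices and impact rates vary by site:

```python
# Back-of-the-envelope whole-of-life cost. The ~$40 repair-parts figure is
# from the Caltrans excerpt upthread; every other number is hypothetical.
impacts_per_year, years = 3, 10

def lifetime_cost(initial, repair_per_impact):
    return initial + impacts_per_year * years * repair_per_impact

reusable    = lifetime_cost(initial=25_000, repair_per_impact=40)     # assumed unit price
sacrificial = lifetime_cost(initial=10_000, repair_per_impact=5_000)  # assumed throughout

print(f"reusable attenuator:    ${reusable:>9,}")     # $26,200
print(f"sacrificial attenuator: ${sacrificial:>9,}")  # $160,000
```

With any plausible impact rate at a busy gore point, the repair cost dominates the purchase price, which is the article's point even if the article is an ad.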
 
If you had owned the car during this time, or had read the report carefully, you would be able to see that they distinguish between AS and other AP functions. I can't help you any further. You are blinded by confirmation bias.

there is no way to know whether the reduction in crashes per million miles from 1.3 (in the pre-AS sample) to 0.8 (in the post-AS sample) is attributable to the presence of AS or to the improvements to TACC, AEB, IPSO, PMM or some other features.
 
  • Like
Reactions: EinSV
It's worth noting that all of the NHTSA references to Teslas being safer with AutoPilot involved data collected from AP1 vehicles. AP2 is a completely different beast, and I'm using the word advisedly. For the first 12 months after its release, I'm not sure anyone could argue with a straight face that AP2 made their Teslas safer to drive. Only in the past month or so has the technology matured to the point of being even arguably comparable to AP1. It's also worth noting that Tesla has refused to reveal which version of AP2 software was in use in this California crash. Can't help but wonder if there wasn't a reason for their reluctance.
 
Interesting. I've probably logged over 10,000 miles on Autopilot and have never had it beep at me. I guess I'm doing a good job at paying attention.

And, it would seem, a healthy grip on the steering wheel :)

No doubt overall safety will increase as the system gets better.
But I wonder if the chance of whoppers like this MX Mountain View crash might also, oddly, increase.
If the system gets so good that more people are lulled into tuning out, there is always the slim chance of a catastrophic failure like this one.

AP2 is a completely different beast, and I'm using the word advisedly. For the first 12 months after its release, I'm not sure anyone could argue with a straight face that AP2 made their Teslas safer to drive.

The early AP2 software may have been highly unstable and unpredictable, but I bet the drivers using it were the most attentive (and anxious) on the road at the time :eek: