Model X Crash on US-101 (Mountain View, CA)

The barrier has a smaller cross section than even a small car, so the system may ignore it to avoid false positives.
Correct. Otherwise it would brake any time you drove by one of those barriers at highway speed. The margins are quite slim when you think about it.
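To make that tradeoff concrete, here is a minimal sketch of the kind of gating logic we're describing. Everything in it (names, the 1 m/s threshold) is my own assumption for illustration, not Tesla's actual code:

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float      # distance to target, meters
    range_rate: float   # closing speed toward us, m/s (positive = approaching)

STATIONARY_EPS = 1.0    # m/s; hypothetical threshold

def is_stationary(ret: RadarReturn, ego_speed: float) -> bool:
    # A target whose closing speed equals our own speed is standing still
    # in the ground frame: an overhead sign, a bridge, a barrier, or a
    # crashed car. Plain radar can't easily tell these apart in elevation,
    # so such returns are typically gated out to avoid constant phantom braking.
    return abs(ego_speed - ret.range_rate) < STATIONARY_EPS
```

The point of the sketch: the filter can't distinguish a harmless overhead sign from a barrier in your lane, because both look identical to this one test.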
Also, see here for what AP2's range is: Autopilot

Finally, and I cannot stress this enough: let's all pace ourselves, the NTSB is not expected to release the results of its investigation for another 6 months.
Also, shout-out to the Mods that work hard to allow us to have this discussion and keep it on track and civil. ❤️
 
I agree with most of what's been said. This accident, however, is alarming, and I for one have been the victim of relying on Autopilot too much. On numerous occasions I have looked around the vehicle for items and even run Autopilot well over the speed limit. I'm sure I am not the only one who has tested the limits of the system. Autopilot allowed me to travel at 150 km/h in 80 km/h zones without even turning off. Almost double the speed limit. The system definitely needs more safety features installed.
 
I agree with most of what's been said. This accident, however, is alarming, and I for one have been the victim of relying on Autopilot too much. On numerous occasions I have looked around the vehicle for items and even run Autopilot well over the speed limit. I'm sure I am not the only one who has tested the limits of the system. Autopilot allowed me to travel at 150 km/h in 80 km/h zones without even turning off. Almost double the speed limit. The system definitely needs more safety features installed.

You are a victim for buying a Tesla?
You are a victim for willingly engaging autopilot?
You are a victim for telling autopilot to run at the same speed you could have chosen without the use of autopilot?
You are a victim of forced and deliberate testing that violates laws and can kill innocent people?

Your testing should have resulted in jail time and an impounding of your car.

Tesla should be doing a recall on some owners. That’s the best way to prevent more autopilot “incidents”.
 
You are a victim for buying a Tesla?
You are a victim for willingly engaging autopilot?
You are a victim for telling autopilot to run at the same speed you could have chosen without the use of autopilot?
You are a victim of forced and deliberate testing that violates laws and can kill innocent people?

Anything else I missed?
The article didn't state the speed the vehicle was traveling at. How would Tesla explain that the vehicle was on autopilot and set at a maximum speed of 80 mph? Hypothetically speaking, the driver wasn't paying attention and was now in a 60 mph zone, and the vehicle, still on autopilot, was traveling well above the speed limit.

In other words: settle down, turbo.
 
This is a very good point. It is natural to start to "trust" AP when you drive the same route day after day, and it behaves exactly the same. It becomes very predictable...

until there is an update, and it is not.

Although who hasn't received notice of an update and been more cautious using it? We tend to get our updates after a number of people here have already received and installed theirs, and then we'll read the reviews to see what improvements people have noticed and what issues might have cropped up. Frequently there's some difference of opinion. This is just the way of any update, be it for a car, a phone, or a computer.
 
Although who hasn't received notice of an update and been more cautious using it?

Well, since Tesla doesn't mention that they sometimes make significant changes to AP in the release notes, I could see an owner who doesn't follow here not knowing that. My husband didn't know that they did that until I told him to be cautious with AP after ANY update. He assumed any AP changes would be documented.
 
The article didn't state the speed the vehicle was traveling at. How would Tesla explain that the vehicle was on autopilot and set at a maximum speed of 80 mph? Hypothetically speaking, the driver wasn't paying attention and was now in a 60 mph zone, and the vehicle, still on autopilot, was traveling well above the speed limit.

In other words: settle down, turbo.

It's a driver assistance package without the current ability to read and process road signs.

I drive in areas where the speed limit is higher than what Tesla thinks it is.

I as a driver need the discernment to make adjustments to use the system to obey laws and maintain safe operation of the vehicle.

What if conditions necessitated that I down-click the speed to below the speed limit?

Driving TOO slowly is also a danger and can result in citations. If I am down-clicking to 30 mph in a 60 mph zone due to heavy fog, should Tesla just speed me up?
 
No doubt AP1 and AP2/2.5 have successfully negotiated this stretch of road many times. But that doesn't mean it can't get it wrong on occasion, which is why the driver still must supervise. That's the biggest takeaway for anyone driving with AP at this stage... when we say it is a Level 2 system that requires human supervision, we really do mean that.

I suspect that a confluence of events was required to make this accident happen, even though AP2.5 has successfully driven this stretch of road many times. What we don't know is how close AP2.5 has come to failing to handle it time and time again. Throw in some sun glare at exactly the right angle, throw in some movement of vehicles ahead... my guess right now, which is completely speculation, is that the lead vehicle ahead of Mr. Huang changed lanes at just the right time, so that AP locked onto the wrong set of markers when the free space changed. Normally at this time of day there are plenty of lead vehicles, so it doesn't matter that the lines are poorly painted, that there are no chevrons, or that the pavement grooves lead towards the concrete barrier. And if the free space hadn't changed, AP2.5 might have negotiated it properly if it saw lines the whole time, given the demonstrations by a slew of Tesla drivers.

Again, this points to how dynamic the driving environment is on a per trip basis and how people still have to supervise, even in areas where AP has worked before. If you have a hard time seeing, likely AP is also having a harder time seeing. High definition mapping as a backup needs to come for level 3 driving and that should help a lot in this particular circumstance.
 
Just caught up with a couple days and hundreds of posts. I do not mean to disparage AP in this post - if the driver was truly engaged at the critical moment, or if the driver was in the habit of disengaging AP in challenging areas like this funky intersection, it appears the accident could have been avoided. The exit was also poorly designed and maintained and the barrier should have been reset. With that said ...

Tesla’s blog says “the driver’s hands were not detected on the wheel for six seconds prior to the collision.” I think by saying this Tesla is trying to imply that he didn’t have his hands on the wheel, but the comment is very misleading. I’m sure that at most points in time my hands wouldn’t have been detected for the prior six seconds simply because my hands, one of which is always on the wheel when I use AP, didn’t apply steering torque during the prior six seconds. I sometimes set off a nag by not applying torque, even though I believe it takes much more than six seconds of an absence of torque to produce a nag. So I don’t think we can infer anything about the driver’s state of attention from that fact. We can infer more from the simple fact that he hit the barrier.

It seems like all the incidents that set off long discussions here and in the press involve hitting stationary objects: semis, fire trucks, the collapsed barrier. Someone upthread correctly pointed out that radar can detect stationary objects but such detections are thrown out because a high false positive rate would result. Imagine the car slamming on the brakes at highway speeds when you are about to pass a building on the side of the road that is positioned in a certain way, or pass an overhead road sign. If this happened once it would put me off of AP for a while, and if it happened twice I would never use AP again. Such false positives that lead to hard braking, possibly with other cars close behind, are absolutely unacceptable, from which it follows that it is up to the driver to detect and react to hazardous stationary objects.
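To see why, here are some back-of-the-envelope numbers. Every figure below is invented purely to show the shape of the problem, not measured from any real system:

```python
# Illustrative arithmetic only; all rates are assumptions.
stationary_returns_per_mile = 50     # signs, bridges, barriers, roadside clutter (assumed)
miles_per_day = 40                   # assumed daily driving
false_positive_rate = 1e-4           # 1 in 10,000 returns misjudged as a threat (assumed)

phantom_brakes_per_day = stationary_returns_per_mile * miles_per_day * false_positive_rate
print(phantom_brakes_per_day)        # 0.2, i.e. a hard phantom brake roughly every 5 days
```

Even a one-in-ten-thousand error rate yields a terrifying braking event every few days, because the car passes so many stationary returns per mile.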

That is much of the reason why AP is not equal to Self Driving. @Reciprocity posted a great description of what AP really is back at 1171. But the problems are 1) FSD, which some owners have paid for, is impossible by definition without shifting the responsibility of dealing with stationary objects to the car, and 2) as incidents like this one accumulate, the absence of stationary object detection without false positives may become unacceptable to regulators and customers, even for AP. I’m personally fine with accepting stationary-object responsibility, so I hope (2) does not develop.

I suspect the current radar plus camera sensor combination used in AP1 through AP2.5 is unable to do reliable stationary object detection without unacceptable false positives. It’s a big enough deal that if it could, they would already have the car doing it. To really do it I think you need either a single forward looking LIDAR around hood level, or stereo cameras and lots of compute power, or something like cameras and a projected grid of infrared points that could be seen by the camera and interpreted by software. And then you would need sensor fusion with the radar to confirm there is no false positive. Of course none of this could be realistically retrofitted to the existing fleet, including those cars where FSD has been paid for.
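As a rough illustration of the stereo-camera option: depth falls straight out of disparity, which is why it could range a barrier that radar discards as stationary. The numbers in this sketch are hypothetical:

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    # Classic pinhole-stereo relation: depth = focal_length * baseline / disparity.
    # An in-path barrier that shows measurable disparity in both cameras can be
    # ranged directly, independent of whether radar gated it out as stationary.
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 1400 px focal length, 12 cm baseline, 4 px disparity
print(stereo_depth_m(1400, 0.12, 4.0))  # -> 42.0 m of warning distance
```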

So I have no great conclusion other than that this is a worry, and we should all be aware of AP's limitations when we use it.
 
I guess if Autopilot can't detect a HUGE red fire truck parked on the highway, we shouldn't expect Autopilot to detect the narrow concrete median in this Model X accident? I still don't understand why the radar system can't detect these stationary objects and at least try to slow down. Does the radar not have a fast enough sampling rate for high-speed driving? Not enough resolution to detect a small object? Is the Autopilot software not giving high enough priority to radar inputs? What's the deal here?

You keep saying "can't detect". It's not that the radar can't detect it; the issue is that radar detects it and a million other stationary objects. It has to be ignored by the portion of Autopilot that decides what to do, otherwise the car would be "paralyzed by fear" and stop almost immediately every time you turn Autopilot on.

So the operative phrase would be that Autopilot isn't able to properly decide the difference between a solid object in its path and one that is ahead of it but not in its path.

Basically, radar says there is a large return at position X, and the decision process is that the car doesn't know whether that's important or not, so it ignores that input.

Detect does not equal avoid.
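A tiny sketch of that detect-versus-decide gap, with an invented lane width and invented labels, just to illustrate the logic:

```python
LANE_HALF_WIDTH_M = 1.8  # assumed half-width of the travel corridor

def classify_return(lateral_offset_m: float, stationary: bool) -> str:
    # Detection is the easy half; deciding relevance is the hard half.
    if abs(lateral_offset_m) > LANE_HALF_WIDTH_M:
        return "ignore: off path"
    if stationary:
        # In-path AND stationary is exactly the ambiguous case: it could
        # be a barrier, or an overhead sign the radar cannot resolve in
        # elevation. This is the input that gets dropped.
        return "ignore: ambiguous stationary return"
    return "track: moving vehicle ahead"
```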
 
Tesla’s blog says “the driver’s hands were not detected on the wheel for six seconds prior to the collision.” I think by saying this Tesla is trying to imply that he didn’t have his hands on the wheel, but the comment is very misleading. I’m sure that at most points in time my hands wouldn’t have been detected for the prior six seconds simply because my hands, one of which is always on the wheel when I use AP, didn’t apply steering torque during the prior six seconds. I sometimes set off a nag by not applying torque, even though I believe it takes much more than six seconds of an absence of torque to produce a nag. So I don’t think we can infer anything about the driver’s state of attention from that fact. We can infer more from the simple fact that he hit the barrier.

I already defeated the above argument easily in another thread.

If you are driving on the edge of a cliff that is to your right, applying pressure without torque to your steering wheel between the 12 and 3 o'clock positions would prevent AP from violently steering you off the cliff.

This level of pressure and lack of torque would be insufficient to satisfy Tesla's AP nag.

So yes, one can be in control of the wheel yet fail to satisfy Tesla's driver alertness system.

Many failed Sherlocks are using torque as an argument for EAP failure.
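In pseudocode form, the distinction looks something like this; the threshold is made up, but the shape of the check is the point:

```python
TORQUE_THRESHOLD_NM = 0.3   # assumed minimum net torque the sensor registers

def hands_detected(net_torque_nm: float) -> bool:
    # The sensor measures torque, not grip. A firm, steady hold that simply
    # follows the wheel applies roughly zero net torque, so it reads as
    # "hands not detected" even though the driver is fully in control.
    return abs(net_torque_nm) >= TORQUE_THRESHOLD_NM

print(hands_detected(0.0))   # False: gripping firmly but not twisting
print(hands_detected(0.5))   # True: a deliberate nudge on the wheel
```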
 
Here the driver knew AP was problematic, but used it anyway, presumably because it is fun, or they paid a lot of money for it, or they thought it was fixed, etc.

If they were under 18, Autopilot could be legally classified as an "attractive nuisance" and Tesla held liable for lulling the driver into an involuntary zombie state, or for changing the behavior of the system silently, without warning.

Perhaps tort case law needs to be updated to extend the attractive nuisance doctrine to adults, especially with respect to new and untested technologies, where risk cannot be properly judged by a non-expert. Or when the behavior of a system like Autopilot can change drastically from release to release.
 
You keep saying "can't detect". It's not that the radar can't detect it; the issue is that radar detects it and a million other stationary objects. It has to be ignored by the portion of Autopilot that decides what to do, otherwise the car would be "paralyzed by fear" and stop almost immediately every time you turn Autopilot on.

So the operative phrase would be that Autopilot isn't able to properly decide the difference between a solid object in its path and one that is ahead of it but not in its path.

Basically, radar says there is a large return at position X, and the decision process is that the car doesn't know whether that's important or not, so it ignores that input.

Detect does not equal avoid.

The radars are integrated sensors; they are used in conjunction with the other sensors.

First case:

Radar gets a strong metal-level reflection, roughly the width and height of a truck. In the path of travel. Velocity zero.
Camera sees a large red object with tail lights and a license plate, roughly rectangular, skewed into the emergency lane but blocking the path.
It also passed a black SUV with a light rack and government plates in the emergency lane.

Warn the driver? Brake? Steer to avoid? Ignore, because no potential threat is derived from this event?

Second case:

Radar gets a medium reflection from 150 m away in the possible path, with zero velocity.
Camera sees the beginning of a triangular region defined by an outline of white.
In the center is a yellow-and-black reflective sign with no text, just chevrons.
Warn the driver? Brake? Steer to avoid? No possible threat?
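Here is a sketch of the kind of cross-checking I mean for the two cases above. All class names, labels, and rules are invented for illustration, not anyone's real fusion stack:

```python
from dataclasses import dataclass

@dataclass
class RadarHit:
    strength: str         # "strong" or "medium" reflection
    in_path: bool
    ground_speed: float   # m/s; 0 means stationary

@dataclass
class CameraObject:
    label: str            # e.g. "vehicle", "chevron_sign"
    blocking_path: bool

def decide(radar: RadarHit, cam: CameraObject) -> str:
    if radar.in_path and radar.ground_speed == 0:
        if cam.label == "vehicle" and cam.blocking_path:
            return "warn driver + brake"           # the fire-truck case
        if cam.label == "chevron_sign":
            return "warn + keep to painted lane"   # the gore-point case
    return "no action"

# First case: stopped truck, confirmed by camera
print(decide(RadarHit("strong", True, 0.0), CameraObject("vehicle", True)))
# Second case: chevron sign at the head of a gore point
print(decide(RadarHit("medium", True, 0.0), CameraObject("chevron_sign", False)))
```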
 
Here the driver knew AP was problematic, but used it anyway, presumably because it is fun, or they paid a lot of money for it, or they thought it was fixed, etc.

If they were under 18, Autopilot could be legally classified as an "attractive nuisance" and Tesla held liable.

Perhaps tort case law needs to be updated to extend the attractive nuisance doctrine to adults, especially with respect to new and untested technologies, where risk cannot be properly judged by a non-expert.

Please.

Tide Pods are a more attractive nuisance than AP, but you don't see them in court, do you?

Cute argument, but do you know how many contracts and signatures are needed by an adult to purchase a vehicle in the first place? Not to mention the licensing required to operate a motor vehicle at all.
 
I already defeated the above argument easily in another thread.

If you are driving on the edge of a cliff that is to your right, applying pressure without torque to your steering wheel between the 12 and 3 o'clock positions would prevent AP from violently steering you off the cliff.

This level of pressure and lack of torque would be insufficient to satisfy Tesla's AP nag.

So yes, one can be in control of the wheel yet fail to satisfy Tesla's driver alertness system.

Many failed Sherlocks are using torque as an argument for EAP failure.
But if I read you right I think we are in agreement! My point is that when I and others drive with AP on, we have hands on the wheel and are in control of the car, but not registering torque. "So yes, one can be in control of the wheel yet fail to satisfy Tesla's driver alertness system." My point is that Tesla's blog seemed to imply the driver was not in control because of the six seconds, and I'm saying the blog is misleading for that reason.
 
But if I read you right I think we are in agreement! My point is that when I and others drive with AP on, we have hands on the wheel and are in control of the car, but not registering torque. "So yes, one can be in control of the wheel yet fail to satisfy Tesla's driver alertness system." My point is that Tesla's blog seemed to imply the driver was not in control because of the six seconds, and I'm saying the blog is misleading for that reason.

If we held on to the wheel tightly enough (we have control), AP couldn't have "willed" itself into the barrier.

If we did not hold on to the wheel tightly enough (we do not have control), it's possible AP could have steered into the barrier.

That's how the statement gets reconciled. Both situations fail to satisfy the nag system due to the lack of torque application.

Another way to say it: if you hold the wheel tightly, any attempt by AP to turn it past a certain angle meets resistance, which disengages AP.

I think AP disengaging at the last second would have been telling, as it would mean the driver was fighting to take the car off Autosteer but failed in time.
 
You keep saying "can't detect". It's not that the radar can't detect it; the issue is that radar detects it and a million other stationary objects. It has to be ignored by the portion of Autopilot that decides what to do, otherwise the car would be "paralyzed by fear" and stop almost immediately every time you turn Autopilot on.

So the operative phrase would be that Autopilot isn't able to properly decide the difference between a solid object in its path and one that is ahead of it but not in its path.

Basically, radar says there is a large return at position X, and the decision process is that the car doesn't know whether that's important or not, so it ignores that input.

Detect does not equal avoid.

I understand how AP works. I know it's just Level 2, but I was speechless when I figured out, and confirmed here at TMC last year, that AP won't stop for a stationary object.
You would think that with today's technology the system should be able to recognize a stationary object in the vehicle's path, closing at a speed equal to the vehicle's own speed, determine that a collision will result, and act accordingly.
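The geometry really is that simple, which is what makes it so frustrating. A stationary object closes at exactly your own speed, so the time budget is just distance over speed (the numbers here are hypothetical):

```python
def time_to_collision_s(distance_m: float, ego_speed_mps: float) -> float:
    # Against a stationary object the closing speed is simply your own speed,
    # so time-to-collision is distance divided by ego speed.
    return distance_m / ego_speed_mps

print(time_to_collision_s(150, 31))  # ~4.8 s of reaction time at ~70 mph from 150 m out
```

The hard part, as posted above, isn't computing this number; it's deciding which of the many stationary returns actually deserves it.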
 