Model X Crash on US-101 (Mountain View, CA)

We don't know whether the car entered the gore at its apex, or from the side. If the car didn't enter the gore until well past where the gore began, there wouldn't have been 200 meters of warning to the driver.

Given the CHP report, I think we do:
Update on collision on US-101 southbound at SR-85: Blue Tesla, driving southbound on US-101 at freeway speeds on the gore point dividing the SR-85 carpool flyover and the carpool lane on US-101 southbound, collided with the attenuator barrier and caught fire

If it were following another car at distance setting 1, where else could it have had 150 meters of unobstructed view, given the barrier is visible from much farther away?

My point is that Tesla's blog seemed to imply the driver was not in control because of the six seconds, and I'm saying the blog is misleading for that reason.
All Tesla has the ability to report is what is recorded in the data stream: in this case, whether driver-induced torque was detected.
and the driver’s hands were not detected on the wheel for six seconds prior to the collision.
They do not say the driver's hands were not on the wheel. Extrapolation/interpretation is fully up to the reader.
 
  • Like
Reactions: SMAlset and Icer
You are wrong.

Back in 2015

Three Tide Pod Lawsuits Filed Against Procter & Gamble

There have been thousands of lawsuits since. 14 state AGs are seeking to ban them. Please keep posting, but recognize that you can be wrong.

Yes sir. Thank you.

Trying to be funny, but let's set the Tide Pod argument aside:

Adults in Teslas are past the age of majority and understand they are fully responsible for what happens in the car. This is backed up by insurance liability. The driver is the "insured" and NOT Tesla.
 
  • Like
Reactions: croman
If we held on to the wheel tight enough (we have control), AP couldn't have "willed" itself into the barrier.

If we did not hold on to the wheel tight enough (we do not have control), it's possible AP could have steered into the barrier.

That's how the statement gets reconciled. Neither situation satisfies the nag system, due to the lack of torque application.

Another way to say it: if you held on to the wheel tightly, any attempt by AP to turn the wheel past a certain angle would meet resistance, which would disengage AP.
I don't mean to offer a physics lesson but ...

"resistance which would disengage AP" IS torque. Torque is just rotary force. Motion of the wheel is not necessary for there to be torque on the wheel. Torque sufficient to disengage (or even nearly disengage) AP would certainly suffice to reset the nag system.

Sillydriver
MIT, Physics (Course 8), '81
 
  • Like
Reactions: croman
Yes sir. Thank you.

Trying to be funny, but let's set the Tide Pod argument aside:

Adults in Teslas are past the age of majority and understand they are fully responsible for what happens in the car. This is backed up by insurance liability. The driver is the "insured" and NOT Tesla.

True. Definitely the driver is responsible for the safe operation of his vehicle. Here, it appears that he failed that task. I do not think that Tesla is absolved by the fact the system is L2. Most states have comparative negligence and the law will look to assign levels of negligence to three known parties.

The driver. Tesla. Caltrans. Clearly the driver was negligent, but was Caltrans also negligent in failing to have chevrons (clearly marking it as non-drivable space), in failing to replace the crash attenuator promptly (a cheap device that would have saved at least one life and prevented damage to three cars), and perhaps in their choice of crash attenuator in the first place? Tesla will also be scrutinized, and I believe they share some fault for having a system that can potentially drive someone into a barrier at high speed.
 
I don't mean to offer a physics lesson but ...

"resistance which would disengage AP" IS torque. Torque is just rotary force. Motion of the wheel is not necessary for there to be torque on the wheel. Torque sufficient to disengage (or even nearly disengage) AP would certainly suffice to reset the nag system.

Sillydriver
MIT, Physics (Course 8), '81

You are driving straight on a road with no curves. There is a cliff to the left that you would drive into if the wheel were turned 90 degrees. The wheel is perfectly centered when your hands grip the 3 and 9 o'clock positions.

You are in control even if AP is engaged.

AP is incapable of forcing that 90-degree turn if your hands are gripping 3 and 9 o'clock securely.

Does that explain the point I am trying to make?

Thanks for the physics lesson, will review so I can be more accurate in my descriptions. :)
 
  • Like
Reactions: croman
I've definitely saved myself and my daughter from two AP mistakes by having a tight grip on the wheel, but I also had an incident recently during a heavy rainstorm at night where AP changed lanes on me and I was unable to stop it until it was about 5 feet into the neighboring lane. Luckily there was no one else on the road, so perhaps my guard was diminished as compared to heavy traffic.

It wasn't that I wasn't alert (I was, and always try to be); it was that AP acted so suddenly, and I didn't maintain a death grip, so the wheel slid, and at 68 mph it was able to move 15 feet laterally in a fraction of a second. So no matter how cautious someone is, AP can still cause issues. That's what I think people who 100% blame the driver are failing to appreciate as to why Tesla could still be liable despite it being an L2 system.
 
You keep saying "can't detect". It's not that the radar can't detect it; the issue is that radar detects it and a million other stationary objects. It has to be ignored by the portion of Autopilot that decides what to do, otherwise it would be "paralyzed by fear" and stop almost immediately every time you turn Autopilot on.

So the operative phrase would be that Autopilot isn't able to properly decide the difference between a solid object in its path and one that is ahead of it but not in its path.

Basically, radar says there is a large return at position X, and the decision process is that the car doesn't know whether that's important or not, so it ignores that input.

Detect does not equal avoid.
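
A rough sketch of that "detects it but ignores it" behavior; the names and thresholds below are invented for illustration, not taken from any real Tesla code.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float             # distance to the reflection
    closing_speed_mps: float   # how fast the gap is shrinking

def should_react(ret, ego_speed_mps):
    # A return closing at roughly our own speed is a stationary object.
    is_stationary = abs(ret.closing_speed_mps - ego_speed_mps) < 1.0
    if is_stationary:
        # With no reliable way to tell "in my lane" from "beside my lane",
        # stationary returns are dropped to avoid constant false braking.
        return False
    return True

barrier = RadarReturn(range_m=120.0, closing_speed_mps=31.0)
print(should_react(barrier, ego_speed_mps=31.0))   # False: detected, but ignored
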
I have to disagree to some extent. Remember that radar detects distance as well. When the radar sensor first receives the return from an object that is still far away, and it takes no action for fear of a false positive, that's fine. HOWEVER, when the radar keeps receiving returns from an object that it KNOWS is getting closer and closer and closer (over the span of many seconds), to the point where the object is literally a few feet in front of the sensor in the last second before the collision, and the braking system still takes no action to slow the car, that's just insanely dumb design.

If the radar ignores all or most of the signal for fear of false positives, emergency auto braking will basically never kick in in ANY situation involving stationary objects. Tesla should have put warning signs and pop-up messages on the screen everywhere saying that auto braking only works on moving objects and will not work on ANY stationary object, even a huge fire truck or a massive brick wall in front of you.
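
As a toy illustration of that point, tracking the same return over a few seconds already gives a clear time-to-collision trend; the speeds and the braking threshold below are made up for the example.

def time_to_collision_s(range_m, closing_speed_mps):
    return float("inf") if closing_speed_mps <= 0 else range_m / closing_speed_mps

ego_speed_mps = 31.0                    # roughly 70 mph
for t in range(5):                      # five consecutive seconds of returns
    gap_m = 150.0 - ego_speed_mps * t   # stationary object keeps getting closer
    ttc = time_to_collision_s(gap_m, ego_speed_mps)
    brake = ttc < 2.0                   # hypothetical emergency-braking threshold
    print(f"t={t}s  gap={gap_m:5.1f} m  TTC={ttc:4.1f} s  brake={brake}")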
 
The problem with frequent OTA updates is that AP's behavior constantly changes. What is more alarming is that those changes are not documented in the release notes.

This particular accident location had been traveled tens of thousands of times by AP Teslas previously without problems. Now the newest, praised version apparently has some specific glitch at this place. But there is no way a driver can know that, if the same car has previously handled that spot without problems.

Tesla is playing a very dangerous game by constantly changing AP's behavior without proper pre-testing and without documenting those changes for users.

+100.

Even with detailed release notes, however, it's very difficult for anyone to properly judge the risk of Autopilot:
  • Those who use Autopilot are generally not able to appreciate the risks. Because they use AP, they are, by definition, too dumb to properly assess them (until it's too late).
  • Those who are smart enough to assess the risks of Autopilot don't use Autopilot.
In terms of fatalities, the safest car on the road today is a Tesla with Autopilot disabled.
 
You keep saying "can't detect". It's not that the radar can't detect it; the issue is that radar detects it and a million other stationary objects. It has to be ignored by the portion of Autopilot that decides what to do, otherwise it would be "paralyzed by fear" and stop almost immediately every time you turn Autopilot on.

So the operative phrase would be that Autopilot isn't able to properly decide the difference between a solid object in its path and one that is ahead of it but not in its path.

Basically, radar says there is a large return at position X, and the decision process is that the car doesn't know whether that's important or not, so it ignores that input.

Detect does not equal avoid.
Precisely! That's why radar works so well in planes. There aren't stationary objects at 30,000 feet, and you don't need a modern camera with image recognition to tell you what that large object flying at 600 mph a mile ahead is.

Having to use a camera to vet an object with image recognition is a real problem. What if the collapsed attenuator is not in the list of objects the cameras were trained on?
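
A hedged sketch of that concern; this is not how Tesla's vision stack is actually structured, and the class list and confidence threshold are invented.

KNOWN_CLASSES = {"car", "truck", "motorcycle", "pedestrian", "cyclist"}   # invented list

def vet_detection(predicted_class, confidence):
    # Only act on detections that match a known class with decent confidence.
    return predicted_class in KNOWN_CLASSES and confidence >= 0.6

print(vet_detection("truck", 0.9))                   # True: acted on
print(vet_detection("collapsed_attenuator", 0.4))    # False: unfamiliar shape, ignored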
 
  • Helpful
Reactions: croman
I have to disagree to some extent. Remember that radar detects distance as well. When the radar sensor first receives the return from an object that is still far away, and it takes no action for fear of a false positive, that's fine. HOWEVER, when the radar keeps receiving returns from an object that it KNOWS is getting closer and closer and closer, to the point where it is literally a few feet in front of the sensor in the last second before the collision, and the braking system still takes no action to slow the car, that's just insanely dumb design.

If the radar ignores all or most of the signal for fear of false positives, what's the point of putting the radar there in the first place? Emergency auto braking will basically never kick in in ANY situation.

It is designed to reduce the severity of a collision when the traffic in front rapidly slows, not when the car is headed for a stopped object.

Unless it is steered, the radar beam covers the entire front path and can't differentiate something straight ahead from something offset to the side. Example: a barrier you will pass next to, vs one you are headed at.
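
Some back-of-the-envelope geometry for that point (illustrative numbers only):

import math

def slant_range_m(longitudinal_m, lateral_m):
    return math.hypot(longitudinal_m, lateral_m)

ahead  = slant_range_m(100.0, 0.0)    # barrier directly in our path
offset = slant_range_m(100.0, 3.5)    # barrier one lane width (~3.5 m) to the side
print(round(offset - ahead, 3))       # ~0.061 m: range alone can't tell them apart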

Anyone have a link for technical details of the Continental or Bosch radar system?
 
You are driving straight on a road with no curves. There is a cliff to the left that you would drive into if the wheel were turned 90 degrees. The wheel is perfectly centered when your hands grip the 3 and 9 o'clock positions.

You are in control even if AP is engaged.

AP is incapable of forcing that 90-degree turn if your hands are gripping 3 and 9 o'clock securely.

Does that explain the point I am trying to make?

I agree with this.
 
+100.

Even with detailed release notes, however, it's very difficult for anyone to properly judge the risk of Autopilot:
  • Those who use Autopilot are generally not able to appreciate the risks. Because they use AP, they are, by definition, too dumb to properly assess them (until it's too late).
  • Those who are smart enough to assess the risks of Autopilot don't use Autopilot.
In terms of fatalities, the safest car on the road today is a Tesla with Autopilot disabled.

Pretty arrogant statement, don't you think? I use AP, I am aware of the risks, and I think others are the same. AP is an assist program, not a driverless program, and Tesla makes you aware of that every time you engage AP.
 
Agreed with most of what's said. This accident, however, is alarming, and I for one have fallen into the trap of relying on Autopilot too much. I have, on numerous occasions, looked around the vehicle for items and even run Autopilot well over the speed limit. I'm sure I am not the only one who has tested the limits of the system. Autopilot allowed you to travel at 150 km/h in 80 km/h zones without even turning off. Almost double the speed limit. The system definitely needs more safety features.

Thanks in part to your attitude, Tesla will come out with more changes that punish the driver and make AP less usable, while providing no benefit to any user. The existing speed limit enforcement is a catastrophe, for example when it suddenly thinks you are in a 30 mph zone while cruising at 70 mph on the highway.
 
  • Like
Reactions: d21mike and MXWing
I understand how AP works, and I know it's just Level 2, but I was speechless when I figured out, and confirmed here at TMC last year, that AP won't stop for a stationary object.
You would think that with today's technology the system should be able to recognize a stationary object in the vehicle's path/trajectory, closing at a speed equal to your vehicle's speed, determine that this will result in a collision, and act accordingly.

*Won't always.

It's been in the AP manual from the first release. You would think that before you put your life in the hands of new technology that has really never been done before in the history of the earth, you'd at least read the f****** manual.
 
  • Like
  • Disagree
Reactions: Matutino and mongo
+100.

Even with detailed release notes, however, it's very difficult for anyone to properly judge the risk of Autopilot:
  • Those who use Autopilot are generally not able to appreciate the risks. Because they use AP, they are, by definition, too dumb to properly assess them (until it's too late).
  • Those who are smart enough to assess the risks of Autopilot don't use Autopilot.
In terms of fatalities, the safest car on the road today is a Tesla with Autopilot disabled.

No. The safest car on the road is a Tesla with a driver who understands that AP is a car-following and well-marked-lane-line-keeping tool, and who understands the design and the design limitations of that tool.

Such drivers know to let AP make the minor adjustments in steering and acceleration, but remain vigilant (all the more so for being relieved of the burden of those minor adjustments) for the need to make the other, major driving decisions and take the other, major driving actions themselves.
 
After countless swerves, where AP2 decides at the last second to swerve us onto an available alternate route or onto an off-ramp, has anyone considered the possibility that something like that happened to this driver?

Yes, I've been arguing that this is a possibility for days, but have only been met with "disagrees" left and right. Apparently the prevailing opinion is that even if AP slams you into a barrier at the last minute, it's the driver's fault (which is nuts).
 
One of the worst things I see in the video is how the groove comes across the lane right to left, exactly at the area where the left line's paint is faded/gone.

Just really a poorly maintained area of road. One that was bad for humans and bad for line following assistance systems, where all drivers need to pay attention.

[Screenshot attachment from the video]