
Model X Crash on US-101 (Mountain View, CA)

Needless to say, AP isn't designed to handle the kinds of roads shown in your videos. And these are going to be especially difficult for autonomous driving, whenever that comes. You don't see many roads like these in the US. Not sure how often your cross-into-the-wrong-side-of-the-road strategy would be appropriate here.

I think the simple things for autonomous vehicles will cause the most grief. I see self-driving vehicles causing impossible traffic jams. Left turns will be a nightmare.
Does the autonomous vehicle know to pull up into the intersection when cars are behind it so they can go around? Will it turn left on red or just sit there waiting, waiting, waiting?
Will it go over the line to get around someone else turning left? Will an autonomous vehicle know when someone is stopping short to leave a space to turn left, and will it be able to see into the next lane for oncoming cars the way a human would?

I think the only way to get there is if all cars on the road are communicating with each other. How far off is that? 10 years? 15?
 
There may be many reasons, but I think it's also due to a lack of understanding of how buying a beta product works.

When I bought Autopilot, it only allowed a maximum speed of 45 MPH on the freeway.

That's very dangerous when my local freeway, CA-99, has a 70 MPH speed limit and people drive much faster than that!

Suppose I got rear-ended and run over by a speeding 18-wheel tractor-trailer because I was driving too slowly with an Autopilot that's designed for a maximum speed of 45 MPH. Should I blame Tesla?

I could ask why Tesla would allow a system to drive as slowly as 45 MPH in a 70 MPH zone.

And now, Walter's family has the same kind of question: why would Tesla allow a system to drive into the gore point?

They might not understand that it takes many human-hours to write code, to debug, to add features... and that the system is not yet complete!

The "system is not completed" argument may actually put them in trouble if they argue that in court, but I could be wrong since I am not a lawyer.

I do think one scenario where Tesla could be found liable for a crash is if the car is rear-ended while traveling on a free-flowing freeway because it suddenly applied full braking after falsely detecting an obstacle and triggering AEB. Generally the car behind is found liable, but I am curious whether Tesla could be found liable if this scenario ever happened. It's also more difficult to recover quickly from a false AEB event because it takes a second or so to cancel AEB and then a bit more time to recover from the reduced speed.
 
The family have just hired lawyers and are preparing to file a lawsuit against Tesla for wrongful death. I find it interesting that they are not suing Caltrans or whoever was responsible for fixing the barrier, which would probably be an easier case to win.

They probably are. Most personal injury attorneys will sue everyone under the sun to increase the pot of money. There are lots of potentially liable parties for a variety of causes of action.

What most internet attorneys in this thread fail to realize is that Walter's family and their attorney keep repeating that Walter warned/notified Tesla about this particular concern not to support a negligence claim (which would place more negligence on Walter for knowing about an issue but proceeding anyway, i.e. assuming the risk), but to build a better product liability claim (which, let's face it, is a fairly easy case for a jury, because the jury can easily compare what Tesla deceptively describes as EAP/FSD with the reality of the system, including this particular defect (gore point lust)). I'm 100% sure the PI attorney will attempt to introduce all of the videos of AP2 trying to smash into gore points across the nation, including at that particular spot. Tesla can try the "hold your hands on the wheel" thing, but then I'm sure they'll show Elon showing off AP1 and insisting it's hands-free... it's just not a good look. Tesla should settle this ASAP.
 
One of the few things Uber did smartly was settling quickly. Tesla doesn't seem to realize that is also in their best interests.

Uber disabled a system that would have prevented the crash and replaced it with one that didn't, along with having an FSD system with a designated safety driver who also failed.

Tesla took a car without a collision system and added one. Imperfect? Sure, but with known limits and operational parameters.
Might as well hand out checks to anyone who thinks their adaptive cruise or lane keeping system means they don't have to drive.
 
I don't know how many times it needs to be pointed out:

Tesla detecting hands not on the wheel DOES NOT MEAN that the hands are not on the wheel.
My hands are always on the wheel when I am using AP2 (and I use it almost all the time), and my Model X gives a spurious warning every 1.5 minutes or so.

Many drivers hold the steering wheel with a light touch. Tesla is probably using this trick to skirt blame and fool the non-Tesla-owning public.
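
For what it's worth, "hands not detected" in the logs and "hands actually on the wheel" can both be true because the sensing is torque-based, as others in this thread have noted. Here is a minimal sketch of how a torque-threshold scheme misses a light grip; the function name, threshold, and sample values are invented for illustration, not Tesla's actual parameters:

```python
# Hypothetical torque-threshold hand detection (illustrative only; the
# threshold and samples are made up, not Tesla's real parameters).
# A light, steady grip applies almost no torque, so it can be logged as
# "hands not detected" even while the driver is holding the wheel, and the
# nag fires when no sample has crossed the threshold for a while.

DETECT_THRESHOLD_NM = 0.3  # invented detection threshold, in newton-metres

def hands_detected(torque_samples_nm):
    """Return True if any recent torque sample exceeds the threshold."""
    return any(abs(t) >= DETECT_THRESHOLD_NM for t in torque_samples_nm)

light_grip = [0.05, -0.02, 0.08, 0.01]  # relaxed hand resting on the wheel
firm_tug = [0.05, 0.6, 0.02]            # deliberate tug in response to a nag

print(hands_detected(light_grip))  # False -> logged as "hands not on wheel"
print(hands_detected(firm_tug))    # True  -> nag cleared
```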
No evasive action taken. "Light hands on wheel" BS. He took no evasive action. Was he unconscious or totally distracted?
 
...The "system is not completed" argument...

I don't see how Tesla can escape the fact that its engineers are still working all the bugs out and Autopilot is nowhere near complete!

As I pointed out before, I drove 200 miles and had to manually take over twice. The first was during an auto lane change: the system freaked out because the lane markers disappeared in front of its eyes:

[Image: the lane markers fading out ahead of the car]


As a human, I didn't see any problem and could figure out the lanes fine, but as a machine, Tesla needs to figure this scenario out! Maybe they can add a High Definition Mapping system to reassure the system that there are still 2 lanes ahead even though the lane markers are gone, or add Artificial Intelligence to recognize that humans keep manually correcting the system at this location, so it's time to drive like a human does...
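
To picture the HD-map backup idea above: the fusion rule is roughly "use the camera's lane estimate while it is confident, otherwise fall back to the lane geometry stored in a map for this stretch of road." A toy sketch; the function, confidence threshold, and offsets are all invented for illustration:

```python
# Toy camera/map fusion for lane keeping (all names and numbers invented).
# When the painted lines fade and camera confidence collapses, the stored
# map geometry keeps a lane estimate instead of the system "freaking out".

def estimate_lane_center(camera_lane_m, camera_confidence, map_lane_m, min_conf=0.5):
    """Prefer the camera's lane-center estimate; fall back to the map when unsure."""
    if camera_lane_m is not None and camera_confidence >= min_conf:
        return camera_lane_m, "camera"
    return map_lane_m, "map"

# Lane markers clearly visible: trust the camera.
print(estimate_lane_center(camera_lane_m=1.8, camera_confidence=0.9, map_lane_m=1.85))
# Markers repaved or faded: confidence collapses, the map fills the gap.
print(estimate_lane_center(camera_lane_m=None, camera_confidence=0.1, map_lane_m=1.85))
```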

The other scenario: someone told me that I didn't have to look before flipping the stalk for an auto lane change, and guess what? I got honked at by a car coming up from behind, and a collision was avoided only because I manually canceled the auto lane change:


This scenario showed that Autopilot is still in its very infancy. It couldn't even abort the lane change on its own for a car coming up from behind that was barely going faster than mine!
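
For context on why fast traffic closing from behind is the hard case: the go/no-go decision for a lane change is essentially a time-to-collision check on the target lane, which is exactly the judgment the manual leaves to the driver. A rough sketch with invented numbers:

```python
# Rough time-to-collision check for a lane change (invented numbers).
# A car closing fast from behind leaves very little margin, which is why
# short-range rear sensing struggles and the driver has to check first.

def lane_change_is_safe(rear_gap_m, closing_speed_mps, min_ttc_s=3.0):
    """Safe only if the rear gap gives at least min_ttc_s before it closes."""
    if closing_speed_mps <= 0:           # the car behind is not gaining on us
        return True
    return rear_gap_m / closing_speed_mps >= min_ttc_s

print(lane_change_is_safe(rear_gap_m=40, closing_speed_mps=2))   # True: ~20 s of margin
print(lane_change_is_safe(rear_gap_m=40, closing_speed_mps=20))  # False: only 2 s
```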

Certainly, selling an incomplete, imperfect Autopilot will be raised in court, and I don't see how Tesla can get around the fact that it is selling a beta product!
 
I don't see how Tesla can escape the fact that its engineers are still working all the bugs out and Autopilot is nowhere near complete!

No one said it is. It is driver assist, not driver replacement.

As I pointed out before, I drove 200 miles and had to manually take over twice. The first was during an auto lane change: the system freaked out because the lane markers disappeared in front of its eyes:

Freaked out, or followed its programming based on the painted lines?

As a human, I didn't see any problem and could figure out the lanes fine, but as a machine, Tesla needs to figure this scenario out!

Yes, yes they do, before the system can be hands-free.

The other scenario: someone told me that I didn't have to look before flipping the stalk for an auto lane change, and guess what? I got honked at by a car coming up from behind, and a collision was avoided only because I manually canceled the auto lane change:
Well, that was bad advice. Read the manual next time:
Warning: It is the driver's responsibility to determine whether a lane change is safe and appropriate. Auto Lane Change cannot detect oncoming traffic in the target lane, especially fast moving vehicles from the rear. Therefore, before initiating a lane change, always check blind spots, lane markings, and the surrounding roadway to confirm it is safe and appropriate to move into the target lane.

This scenario showed that Autopilot is still in its very infancy. It couldn't even abort the lane change on its own for a car coming up from behind that was barely going faster than mine!
It doesn't claim to be able to. It states that the driver needs to check if it is safe.

Certainly, selling an incomplete, imperfect Autopilot will be raised in court, and I don't see how Tesla can get around the fact that it is selling a beta product!
They do say it is not perfect, and that the driver needs to remain aware and have hands on wheel. (and RTFM!)
 
No evasive action taken. "Light hands on wheel" BS. He took no evasive action. Was he unconscious or totally distracted?
Agree. Evasive action would without a doubt have been detected if it had been attempted. It's immaterial whether or not the wheel senses torque from a light touch.

We may never know the reason he didn't take action, but I hope there is some closure for his family's sake.
 
This story has been the main thread here for a few weeks. But I keep scratching my head as to why this accident happened in the first place.

I read in the news that he told his wife and brother that Autopilot tried to collide with the concrete when he passed that exact spot. Although not confirmed by Tesla, his brother said that he complained to the Tesla Service Center that Autopilot was not working properly at that location. He also tried to demonstrate to his wife how Autopilot failed to work (or tried to kill him) at that exact location.

Yet he turned on Autopilot, ignored the warnings to put his hands on the wheel (with torque, of course), and failed to look out his windshield or put his foot on the brake when Autopilot acted the way he expected it to act. In the first place, why on earth would anyone turn on Autopilot knowing it is trying to kill him/her at that location? What am I missing here?
 
This story has been the main thread here for a few weeks. But I keep scratching my head as to why this accident happened in the first place.

I read in the news that he told his wife and brother that Autopilot tried to collide with the concrete when he passed that exact spot. Although not confirmed by Tesla, his brother said that he complained to the Tesla Service Center that Autopilot was not working properly at that location. He also tried to demonstrate to his wife how Autopilot failed to work (or tried to kill him) at that exact location.

Yet he turned on Autopilot, ignored the warnings to put his hands on the wheel (with torque, of course), and failed to look out his windshield or put his foot on the brake when Autopilot acted the way he expected it to act. In the first place, why on earth would anyone turn on Autopilot knowing it is trying to kill him/her at that location? What am I missing here?

That's mostly it. Although there is no data (that I have seen) showing that AP gave warnings in the time immediately before the impact (though it did previously), only that his hands were not detected on the wheel, which does not mean his hands were not on the wheel, only that there was no torque detected.

Regarding potential causes for your head-scratching:
Outside the social/human factors, the accident may have occurred because the faded/missing right line of the gore point caused the car to track the left line and remain centered in the gore point up to the pre-collapsed barrier.
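
To make that "track the left line" failure mode concrete, here is a toy lane-centering rule: with both lines visible the target is mid-lane, but when the right line is faded and the system latches onto the gore's left line, "centered" ends up aimed at the barrier. Purely illustrative, not Tesla's actual control logic:

```python
# Toy lane-centering rule illustrating the gore-point failure mode described
# above (purely illustrative, not Tesla's actual logic).

LANE_WIDTH_M = 3.7  # typical US freeway lane width

def lateral_target_m(left_line_m, right_line_m=None):
    """Aim midway between two lines, or half a lane width right of a lone left line."""
    if right_line_m is not None:
        return (left_line_m + right_line_m) / 2.0
    return left_line_m + LANE_WIDTH_M / 2.0

# Both lines painted: the target sits in the middle of the travel lane.
print(lateral_target_m(left_line_m=0.0, right_line_m=3.7))  # 1.85
# The gore's right line is faded: the car holds position off the gore's left
# line instead, and that "center" leads straight to the attenuator at the apex.
print(lateral_target_m(left_line_m=0.0))                    # 1.85, but off the wrong line
```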
 
Seems that way:

It now escalates into a public dispute in which Tesla accuses the NTSB of:

1) being more concerned with press headlines than actually promoting safety
2) repeatedly releasing partial bits of incomplete information to the media in violation of their own rules
3) doing so while at the same time trying to prevent Tesla from telling all the facts
4) focusing on the safest cars in America while ignoring the cars that are the least safe

Tesla will:

1) file an official complaint with Congress
2) issue a Freedom of Information Act request to understand why the NTSB is picking on Tesla
 
It now escalates into a public dispute in which Tesla accuses the NTSB of:

1) being more concerned with press headlines than actually promoting safety
2) repeatedly releasing partial bits of incomplete information to the media in violation of their own rules
3) doing so while at the same time trying to prevent Tesla from telling all the facts
4) focusing on the safest cars in America while ignoring the cars that are the least safe

Tesla will:

1) file an official complaint with Congress
2) issue a Freedom of Information Act request to understand why the NTSB is picking on Tesla

This is unheard of. No manufacturer/fleet operator EVER challenges the integrity of NTSB. This is because:

(a) NTSB really does have a reputation for conducting fair and complete investigations; and

(b) as long as a participant plays by NTSB’s rules, NTSB goes out of its way to (i) provide participants with an opportunity to comment on/influence investigation outcomes, (ii) give participants plenty of heads up before issuing reports, (iii) not publicly badmouth participants outside of the final reports, and (iv) spread blame around if at all possible.

NTSB isn't picking on Tesla. NTSB investigates novel transportation mishaps. Where a new technology is potentially involved in a car accident, that accident is the kind of thing NTSB investigates (here: the issues surrounding AP as well as the firefighter confusion about putting out the battery fire). The point is to identify all of the contributing factors to a complex accident, and make recommendations about how those factors could be removed/mitigated.

NHTSA is different. It tries to identify manufacturing and design defects, largely using statistical analysis of reported failures and then follow-up inspections of mechanical elements of a vehicle line.
 
This story has been the main thread here for a few weeks. But I keep scratching my head as to why this accident happened in the first place.

I read in the news that he told his wife and brother that Autopilot tried to collide with the concrete when he passed that exact spot. Although not confirmed by Tesla, his brother said that he complained to the Tesla Service Center that Autopilot was not working properly at that location. He also tried to demonstrate to his wife how Autopilot failed to work (or tried to kill him) at that exact location.

Yet he turned on Autopilot, ignored the warnings to put his hands on the wheel (with torque, of course), and failed to look out his windshield or put his foot on the brake when Autopilot acted the way he expected it to act. In the first place, why on earth would anyone turn on Autopilot knowing it is trying to kill him/her at that location? What am I missing here?
We are all scratching our heads on this one. I can only assume a medical condition prevented him from taking the wheel.