Welcome to Tesla Motors Club

HWY101 accident..

Sure, those were assumptions, but 65 isn't really much of a problem. It's 75+ where it gets tough. The safest years are actually 30-74.

Statistically, the cars you are going to have the highest rate of accidents in are your first and your last. Right now, neither will probably be a Model S or X.

But in general I just wanted to give some reasons why fatalities might be lower. Not all of them have to be true, but I think it's safe to say that even without AP these cars would have much lower fatality rates than the average car.

Sure, maybe somewhere out there there is a 16-year-old doing drunk street races in his Model X with totally worn-down brakes and tires, busted headlights, and half the airbags torn out, but probably not.

Fortunately, we have something close to an apples-to-apples comparison on this. There was about a one-year gap between when cars with AP1 started shipping and when the system was actually turned on. NHTSA found that after the system was activated, accident rates dropped by 40% for the same cars. That's the rate of accidents per mile driven, regardless of whether AP is even turned on (suggesting it's an underestimate of the true drop), and before the nags were instituted.
 
Well, numbers don't lie. Either they have fewer fatal accidents or they don't. I appreciate your long response, but there is no dispute about the 3.7-times-safer fact. It is what it is. I could go deeper into the statistics, but what I read out of your mail is that you are not very familiar with them. No offense, because most people are not. However, allow me to say that all the arguments you listed don't apply given the statistics. I learned this at university, and believe me, there is no point for you to make about this number.

Happy to hear that Honda had a safe car back in 2009-2012, but what does this have to do with Tesla and AP today?

Take it as that, and if you draw the right conclusions it may save your life one day as well... you never know.

I had statistics in university too, but apparently I remember that correlation isn't the same as causation. So AP might have nothing to do with the fatality rates at all.
Fortunately, we have something close to an apples-to-apples comparison on this. There was about a one-year gap between when cars with AP1 started shipping and when the system was actually turned on. NHTSA found that after the system was activated, accident rates dropped by 40% for the same cars. That's the rate of accidents per mile driven, regardless of whether AP is even turned on (suggesting it's an underestimate of the true drop), and before the nags were instituted.

Now that is a statistic that makes sense and sounds reasonable.
 
I don't care about how others use the system and will not let it affect how I use it. Seems like whenever there's one bad apple, everything gets spoiled.

This also applies to legislation for everything that exists in government. This will bring anxiety to the masses, which leads to future stupidity and waste. You can't fix stupidity.
 
How much does a Tesla with AP cost compared to the average car? Does the fact that only rich people have the Tesla with AP skew the numbers?
The numbers aren't skewed; the media just doesn't do the math. Over 3,600 other people died in auto accidents that day. Those others don't get massive clicks and views. Tesla is an easy target. Autopilot saving a person from an accident doesn't get reported on because it doesn't generate viewership in the same capacity. Lots of other vehicles that advertise active and passive safety features crash daily, and in a year we've got four or so high-profile Tesla accidents against the 40,100 fatal accidents that gained next to no media attention, minus the occasional celebrity or one-off "newsworthy" case.

The folks who knee-jerk react to stories like these home in on a single case that justifies their techno-panic and claim that the tech is killing us, while not batting an eye at all of the human-error, distracted, or drunk-driving deaths that occurred thousands of times that same day. Autopilot in its current state is designed to be used as an assistive feature. Accidents can and will happen with any product, but we should learn from them and not chastise the technology, but use the info to make the situation safer, from Tesla all the way down to the fact that the barrier section that was hit had not been properly repaired to absorb another impact after an earlier accident.
 
The numbers aren't skewed; the media just doesn't do the math. Over 3,600 other people died in auto accidents that day. Those others don't get massive clicks and views. Tesla is an easy target. Autopilot saving a person from an accident doesn't get reported on because it doesn't generate viewership in the same capacity. Lots of other vehicles that advertise active and passive safety features crash daily, and in a year we've got four or so high-profile Tesla accidents against the 40,100 fatal accidents that gained next to no media attention, minus the occasional celebrity or one-off "newsworthy" case.

The folks who knee-jerk react to stories like these home in on a single case that justifies their techno-panic and claim that the tech is killing us, while not batting an eye at all of the human-error, distracted, or drunk-driving deaths that occurred thousands of times that same day. Autopilot in its current state is designed to be used as an assistive feature. Accidents can and will happen with any product, but we should learn from them and not chastise the technology, but use the info to make the situation safer, from Tesla all the way down to the fact that the barrier section that was hit had not been properly repaired to absorb another impact after an earlier accident.

But that includes some crap car in southeast Asia transporting 8 people with no airbags, or even seatbelts.

Sure, that doesn't say Tesla AP is unsafe, but I don't think it really saves as many lives as some here think. A 40% reduction in accidents is a pretty good statistic, though. Not sure if that also includes AEB, or just AP, but still not bad.

And you have a point that Tesla gets a lot of media attention, but I wouldn't say that's a bad thing. Ask any random person what they think of Tesla and they will know the company and know the cars are electric, but probably also think the car can drive itself.

Ask them about Cadillac and they won't know about Super Cruise, which is a pretty comparable AP system. So it has its benefits that Tesla is well known, but it also has its downsides. Maybe Tesla could limit the downsides by educating the public more about AP, but I don't think that's necessary right now. The average Joe and Jane won't buy a Tesla for at least another year anyway, and at some point their friends with a Tesla will educate them about the limits of AP.

Headlines like these are only a problem for the people shorting or longing the stock. In a month nobody will talk about this. And this won't lead to different legislation; Uber has already done that.
 
One would hope so. It is probably training learning algorithms to handle every unpredictable, non-repeatable situation rather than learning location-based, pre-existing features. Might there be room to learn just a few locations, though, like these dividers?

There was a discussion quite some time (years) ago about this. Originally there were indications that fleet learning was in use. Then there was something to the effect that Tesla gets way too much data, and sifting through it was a huge challenge that they hadn't really tackled. I haven't dug through the forum to try and find the discussion - that would be something of an easter egg hunt.

From my own observation of the autopilot's behavior, I can see no indication whatsoever of any fleet learning. The local HOV lanes drive the autopilot nuts - at every transfer point the lanes widen and narrow again, and at each transition the autopilot jerks the wheel violently. It's never learned not to do this.

So I don't believe this "fleet learning" thing was ever actually implemented.
 
Any thoughts? How is Tesla's approach even supposed to work here? The previously smashed safety barrier is surely unrecognizable to the camera(s) and neural networks. If the radar does not detect the object, there is a single point of failure that can cause the accident.
What neural network? Isn't a three-legged dog still a dog? If Beta Pilot cannot recognize an obstruction in the road that is as wide as two small children, Elon Musk should stop touting it. I hope the NHTSA recalls this defective product.
 
Great discussion. Was the driver 100% at fault for not watching the road? Yes! Did the system fail and drive into a solid object? Yes! Very similar to the Uber AZ accident: yes, the system should have avoided the accident entirely, but had the driver been paying attention, he or she could easily have avoided it as well.

As for the hands-on-steering-wheel nag, I'm working on a similar system and I can say it certainly takes a level of software to get the detection correct. If the wheel torque detection is not sensitive enough, you get warnings even when your hands are on the wheel. But if you set the torque detection too sensitively, it can read "phantom" hands on the wheel when the AP system torques the wheel (while your hands are off it) and the mass of the wheel itself creates a false-positive hands-on-wheel signal. You need decent filtering to handle both cases.
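To illustrate the filtering problem, here is a minimal sketch of a torque-based hands-on-wheel detector using hysteresis and debouncing. Everything here is an illustrative assumption, not a detail of Tesla's or any production system: the class name, the thresholds, and the idea of subtracting the assist system's commanded torque to suppress the "phantom hands" false positive.

```python
# Hypothetical hands-on-wheel detector. All thresholds are illustrative
# assumptions, not values from any production system.

DETECT_NM = 0.8       # sustained torque above this suggests hands on (hysteresis high)
RELEASE_NM = 0.3      # sustained torque below this suggests hands off (hysteresis low)
DEBOUNCE_SAMPLES = 5  # consecutive samples required before changing state

class HandsOnDetector:
    def __init__(self):
        self.hands_on = False
        self._count = 0

    def update(self, measured_nm, commanded_nm):
        # Subtract the torque the assist system itself commanded, so the
        # wheel's own mass reacting to AP steering inputs doesn't register
        # as a phantom "hands on wheel" signal.
        driver_nm = abs(measured_nm - commanded_nm)
        if self.hands_on:
            # Require sustained low torque before declaring hands off.
            self._count = self._count + 1 if driver_nm < RELEASE_NM else 0
            if self._count >= DEBOUNCE_SAMPLES:
                self.hands_on = False
                self._count = 0
        else:
            # Require sustained high torque before declaring hands on.
            self._count = self._count + 1 if driver_nm > DETECT_NM else 0
            if self._count >= DEBOUNCE_SAMPLES:
                self.hands_on = True
                self._count = 0
        return self.hands_on
```

The hysteresis gap between the two thresholds and the debounce count are what keep a single noisy sample, or a brief AP steering correction, from flipping the detection state.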
 
I wonder what the warning on the MCU was for.

"Hold steering wheel"

That's flashing for just over 9 seconds, up until the point at which driver overrides and AP disengages (Bing-Bong just audible)

So the driver ignored the warning for 9 seconds AND was hand-holding the camera, so he was not properly in control in the first place. He failed to divert the car, which would have been easy with one hand on the wheel, and instead came to a halt in the median, which seems pretty dangerous to me (if this was a test rather than a case of not paying attention).
 
This video may highlight what happened in the Hwy 101 incident.

It doesn't highlight any such thing, IMHO. The lane markings are certainly not the same: in your case (assuming you shot the video?), the "left" lane line for the lane you were in vanished, and the Tesla did what it was designed to do, following the lane line to the left as the "lane" widened. Obviously it was not the left lane line for the lane you were in...

It's not clear to me why this is being compared to the 101 accident. It's also not clear to me why, in this case, the car path was not manually corrected immediately and directed to the proper lane. Was this something done intentionally to (incorrectly) compare to the 101 accident? If this was not done intentionally, the question is, why were you not paying attention to the road?
 
Just so there is no confusion on the part of Tesla or anyone else: I have had numerous occasions when Autopilot went into search mode in a lane and when changing lanes. This is particularly likely when using a diamond lane where the lane markings on the asphalt shoulder of the left side are obliterated or ratty. It then weaves as far as threatening to hit the divider wall and run over the Botts' dots on the other side. It requires wrestling back control of the steering or canceling Autopilot, so I am always on the alert. Traveling at 65+ mph leaves little time to react if caught off guard.
 
"Hold steering wheel"

That's flashing for just over 9 seconds, up until the point at which driver overrides and AP disengages (Bing-Bong just audible)

So the driver ignored the warning for 9 seconds AND was hand-holding the camera, so he was not properly in control in the first place. He failed to divert the car, which would have been easy with one hand on the wheel, and instead came to a halt in the median, which seems pretty dangerous to me (if this was a test rather than a case of not paying attention).
I have never seen AP put an error (!) on the MCU before.
 
I'd like to add something more general to this discussion that applies also to driving without any driver assistance features.

It seems nearly nobody gives a damn about posted speed limits anymore. Going 5 mph over the speed limit is common behaviour and socially accepted. And in this case you can clearly see a 45 mph limit posted well before that fatal divider! This tells me the authority who put that sign up thinks "that's a dangerous section"!

But in the now-posted video the AP was set to 59 mph, and the other cars didn't slow down to 45 mph either. Also, from what Tesla stated about the accident (5 seconds and 150 meters of unobstructed view of the divider), you can calculate that the crashed car was going around 67 mph. That's more than 20 mph over the speed limit.

What does that mean? Simple physics. First, the kinetic energy would have been roughly half if he had gone 45 mph, since kinetic energy scales with the square of speed. For comparison, hitting an immovable concrete divider at 45 mph is like hitting the ground after falling from the 7th floor; hitting it at 67 mph is like falling from the 15th floor. Second, at 45 mph there would have been an additional 2.5 seconds for the driver, EAP, or AEB to maybe recognise the divider and react.
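Those figures can be checked with quick back-of-the-envelope arithmetic. The unit conversions and the free-fall formula are standard; the 3 m per storey used for the "floor" comparison is my own assumption.

```python
# Back-of-the-envelope check of the speeds and energies discussed above.
MPH_TO_MS = 0.44704   # meters per second in one mph
G = 9.81              # gravitational acceleration, m/s^2
STOREY_M = 3.0        # assumed height of one building storey

# 150 m of unobstructed view covered in 5 s:
v_ms = 150 / 5                          # 30 m/s
print(round(v_ms / MPH_TO_MS), "mph")   # -> 67 mph

# Kinetic energy scales with the square of speed:
print(round((67 / 45) ** 2, 2))         # -> 2.22, so 45 mph carries ~half the energy

# Equivalent free-fall height: h = v^2 / (2g)
for mph in (45, 67):
    h = (mph * MPH_TO_MS) ** 2 / (2 * G)
    print(mph, "mph ~", round(h / STOREY_M), "floors")  # -> 7 and 15 floors

# Extra time to cover the same 150 m at 45 mph instead of 67 mph:
print(round(150 / (45 * MPH_TO_MS) - 5, 1), "s")        # -> 2.5 s
```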

So, in the end it's your life and the lives of others you risk just to gain a few seconds on your way!