
Tesla may have been on Autopilot in California crash which killed two

How many accidents happen because of speeding? Why are cars even able to go past 55 in New York? Do we blame the car manufacturer, or the car's capability, when people recklessly misuse it? They could easily limit a car's max speed. Pretty sure it says not to speed in the manual.

I'm sure people crash from normal cruise control.
I'm sure people crash from looking at nav screens.
The radio is a distraction for drivers. I rear-ended someone in my other car because I looked down at my radio. I didn't blame GM for giving me a Bose nav radio option.

My point is you can't blame everything. The driver has to be blamed.
 
Trains and subways are on tracks. I'm totally open to the possibility that cruise control is unsafe and should not be on cars. I think that any small increase in accidents due to driver incapacitation while using cruise control is probably more than offset by a decrease in accidents by people using cruise control to limit their speed. It seems like many people are not even open to the possibility that driver assist systems can reduce road safety.

Is your position that cruise control is more safe overall
Obviously, if traditional cruise control is making the roads less safe it should be banned.

Or that it is more safe in every instance?
Logically I can't see how cruise control could make a car less safe when used in the real world.
May sound nitpicky, but it goes to whether we are discussing AP needing to be safer fleet-wide or safer in every situation.
 
Is your position that cruise control is more safe overall


Or that it is more safe in every instance?

May sound nitpicky, but it goes to whether we are discussing AP needing to be safer fleet-wide or safer in every situation.
I think @daniel has a point in that TACC never enables the driver to leave steering to the car, whereas AP enables the driver to act imprudently for a substantial amount of time before it pulls over and shuts down the car.
 
I think @daniel has a point in that TACC never enables the driver to leave steering to the car, whereas AP enables the driver to act imprudently for a substantial amount of time before it pulls over and shuts down the car.
Ya mean @Daniel in SD, but I was speaking to whether classic cruise control, which has been on the road for decades, increases or decreases safety.
Obviously, if traditional cruise control is making the roads less safe it should be banned. I haven't seen any evidence of that and it's a pretty mature technology. Logically I can't see how cruise control could make a car less safe when used in the real world.
 
Product liability laws aren't kind to manufacturers. If the product is prone to abuse or the manufacturer can foresee abuse, then the manufacturer is culpable.
A Guide to California "Products Liability" Laws

Quoted for truth. Been saying it for years. Tesla knows this but allows local-road use/misuse even though they used to geofence it. Thus the user manual can't be used to excuse Tesla: they let their system be misused, the misuse is clearly foreseeable, and it's a question whether it's even misuse when Tesla allows it.
 
Is your position that cruise control is more safe overall


Or that it is more safe in every instance?

May sound nitpicky, but it goes to whether we are discussing AP needing to be safer fleet-wide or safer in every situation.
I don’t know if the existence of cruise control is safer overall or not.
If everyone used Autopilot properly, it would definitely be safer than if it didn’t exist. If everyone hung a sock of lead shot from the steering wheel and took naps on every drive, it would be thousands of times less safe. The reality is somewhere in between. One thing I believe is that publicizing the consequences of abuse will result in less abuse of the system.
 
Autosteer opens the possibility of abuse, and people being people, some people will misuse it and there will be accidents as a result. What we never see are the number of accidents that don't happen because a driver using autosteer is more mentally alert. I was more rested and alert after a 7-hour trip with EAP than I ever was on the 6-hour drive to the same location in an old-fashioned car. (Longer route to catch a supercharger.) By doing the steering for me, the car freed up a large part of my brain that would have been keeping the car in the lane, allowing me to be more attentive to situations that would require action. But all we will ever see are the reports of accidents involving the reckless people who were not paying attention, and who probably would have crashed a conventional car by texting while driving. Those accidents don't get reported because if they were there'd be no room for any other news.
 
Trains and subways are on tracks. I'm totally open to the possibility that cruise control is unsafe and should not be on cars. I think that any small increase in accidents due to driver incapacitation while using cruise control is probably more than offset by a decrease in accidents by people using cruise control to limit their speed. It seems like many people are not even open to the possibility that driver assist systems can reduce road safety.

I recall (though I admit I cannot find it anywhere now) a serious study (in Europe somewhere) that showed that accident rates did indeed go down when cruise control was used. It's from memory, but I think the study authors concluded that once someone used CC, they tended to kick back and let the car do more of the work, which cut down on lane changes, tailgating, and other behaviors that can precipitate an accident. I suspect this applies equally (perhaps more so) to more modern ACC systems.
 
What we never see are the number of accidents that don't happen because a driver using autosteer is more mentally alert.

EXACTLY. You need to put single incidents in perspective, and balance them with all those unreported/unnoticed times when the system does better than a human and maybe even saves lives. Like airbags, which can be dangerous. And seat belts, which can be dangerous. And every other safety system under exceptional circumstances.
 
This type of accident will always be the driver's fault in the most technical sense. However, I do see the possibility that the government will eventually limit the implementation of these systems. The majority of people driving cars on the road are not professional drivers, so they lack the discipline to be good drivers and do not typically know how to act when something goes wrong. This also means they have minimal understanding of the car's systems, including something like Autopilot. And just like anything, a good system can very easily lead to a very bad outcome if not used properly.

I do think this is why people out there think L3 systems are bad. And in my opinion, even the L2 systems found in many cars can lead to lapses of judgment that could result in very bad accidents. There is a chance that these bad accidents with Autopilot will occur more frequently as more Teslas get on the road. And I can see the government either putting in some strict monitoring process when these systems are in use, or severely limiting the capability of the system. But the bottom line is, something needs to be done before more people lose their lives.
 
Let’s take the word ‘Tesla’ out of this news story and what do we have?

Late at night on the weekend closest to New Year’s Eve, a speeding car exits a freeway, flies down an off-ramp without slowing down, blows through a red light, plows into a car and kills two people.

Raise your hand if the first thought to come to your mind is “oh, that must be a semi-autonomous driving equipment malfunction.”

OR, do you think, “another damn drunk driver!”

Sadly, the latter happens about 10,000 times more often than the former. Unless the drunk kills a school bus full of children (and school bus safety is an issue), no federal investigations are ever launched.

Tesla should insist that the local police publicly rule out impairment as a factor before dragging Tesla’s name through the mud because the odds are very high (pardon the pun) that driver impairment is the real cause of these tragic deaths.
 
In my experience, NOAP quits AP mode after entering the exit ramp, so running the red light seems doubtful... Nevertheless, AP was never advertised to work with traffic lights or outside highways, so I don't see how this is not the driver's fault.
You might want to go for another ride with NOA taking an exit for you. NOA disengages, but "cruise" stays on. So if you are not paying attention, your car will keep driving until either a car stops in front of it or it hits something. Yes, it is the driver's fault, but this has always seemed like a banana-pants decision by Tesla, inviting precisely what might have happened here.
 
Let’s take the word ‘Tesla’ out of this news story and what do we have?

Late at night on the weekend closest to New Year’s Eve, a speeding car exits a freeway, flies down an off-ramp without slowing down, blows through a red light, plows into a car and kills two people.

Raise your hand if the first thought to come to your mind is “oh, that must be a semi-autonomous driving equipment malfunction.”

OR, do you think, “another damn drunk driver!”

Sadly, the latter happens about 10,000 times more often than the former. Unless the drunk kills a school bus full of children (and school bus safety is an issue), no federal investigations are ever launched.

Tesla should insist that the local police publicly rule out impairment as a factor before dragging Tesla’s name through the mud because the odds are very high (pardon the pun) that driver impairment is the real cause of these tragic deaths.
It’s possible that advanced driver assistance systems encourage people to drive impaired when they would not normally do so. That’s the first thing that came to my mind.
Has anyone said it’s not the driver’s fault? Most people here seem to be arguing with a straw man.
 
... Tesla should insist that the local police publicly rule out impairment as a factor before dragging Tesla’s name through the mud because the odds are very high (pardon the pun) that driver impairment is the real cause of these tragic deaths.

Unfortunately, Tesla has no power to compel the police to leave their name out of the report, and has no power to compel the media to leave their name out of the news. The price of success is bad publicity: The very rareness of a Tesla crash assures that it will be reported as a Tesla crash.
 
Let’s take the word ‘Tesla’ out of this news story and what do we have?

Late at night on the weekend closest to New Year’s Eve, a speeding car exits a freeway, flies down an off-ramp without slowing down, blows through a red light, plows into a car and kills two people.

Raise your hand if the first thought to come to your mind is “oh, that must be a semi-autonomous driving equipment malfunction.”

OR, do you think, “another damn drunk driver!”

Sadly, the latter happens about 10,000 times more often than the former. Unless the drunk kills a school bus full of children (and school bus safety is an issue), no federal investigations are ever launched.

Tesla should insist that the local police publicly rule out impairment as a factor before dragging Tesla’s name through the mud because the odds are very high (pardon the pun) that driver impairment is the real cause of these tragic deaths.

None of the original reporting mentioned Autopilot. It stated that a Tesla hit a Honda Civic at high speed, that there was no word yet on whether drugs or alcohol were involved, and that no arrests had been made. All very standard local coverage of an auto fatality. It wasn’t until the NHTSA got involved that AP became the subject of the coverage.
 
I don't worry about the Federal Government banning Autopilot. They are advancing similar self-driving technology for war machines as fast as they can.
They are developing self-driving tanks, Humvees, troop transports, mine/IED detectors, boats, etc.
They want to take drivers out of the danger zones and keep them safely at keyboards. Drones are the future of battle.