Two reports of Teslas on AP hitting stopped vehicles in their lane on the freeway

I wonder how many people using cruise control have lost concentration for a few moments and plowed into the back of a car. I'll bet that happened more often after cruise control was invented (50 years ago?) than previously. But I've never heard anyone blame that on cruise control or suggest that we ban cruise control until everyone concentrates on the road 100% of the time. But if someone does something dumb using autopilot they blame autopilot instead of the driver.
 
Absolutely. Those are the challenges that new tech and a car company that bucks the system face.
 
Doesn't really matter. All he needs is events like this portrayed in the right light to get some type of legislation banning autonomous systems in cars. Next thing you know, we get an update that hobbles Autopilot until it is "proven" safe.
Notwithstanding that "perfect is the enemy of good," can laws really be passed that ignore Autopilot statistics which show that, in a Congresscritter's district, Autopilot on average prevents more slaughter than it causes? Any Autopilot ban would invite a legal challenge if the stats support one.
 
I know it's déclassé to reply to oneself, but insurance companies will ultimately become the arbiters of what is safer. And because insurers are largely driven by actuaries, Tesla should prevail.
 
When legislation banning "autopilot" comes down the pike, or agency regulators prohibit it, there will be lawsuits and lots of publicity demonstrating how the feature saves lives, prevents injuries and helps avoid accidents. Good luck selling the public on the idea it's dangerous when the stats will clearly prove otherwise.
 
Then why does it cost twice as much to insure a Tesla as any other sedan or compact? My Model 3 costs double the insurance premium of my 2018 Accord.

It seems that your situation isn’t typical. Our Model 3 is less expensive to insure than our 2012 Passat TDI was. There are several threads discussing Model 3 insurance rates and very few people report the high prices you’re seeing. Maybe it’s time to shop around for a better rate.
 
I wonder how many people using cruise control have lost concentration for a few moments and plowed into the back of a car. I'll bet that happened more often after cruise control was invented (50 years ago?) than previously. But I've never heard anyone blame that on cruise control or suggest that we ban cruise control until everyone concentrates on the road 100% of the time. But if someone does something dumb using autopilot they blame autopilot instead of the driver.

Normal cruise control doesn't offer much in the way of being able to mentally check out. Sure, it could potentially happen, but it's terribly basic. Adaptive cruise control, on the other hand, has probably caused quite a few accidents that we simply don't hear about unless someone actually admits to them.

Lane-steering + adaptive cruise control allows a driver to mentally check out a lot more. It also gives people a false sense of security that excuses actions they wouldn't otherwise take. I'd be extremely surprised if most drivers haven't used features like Autopilot as an excuse to send a text, look something up, check on a kid/dog, etc.

It's hardly surprising that this kind of accident would happen.

But we can't look at it by itself. We have to look at it from the perspective that people do dumb things even when not using these technologies. All that really matters is whether having it creates more danger than not having it.

Now that doesn't mean regulators won't make adjustments. The Europeans forced Tesla to nerf AP, and I wouldn't be surprised if measuring driver engagement with a torque sensor eventually gets banned by regulators for new vehicles capable of lane-steering.
 

Key word being “new” vehicles.

If they try to retroactively alter cars that have already been purchased with autonomy, the outcome would be to push people into illegally modding (hacking) their cars.

THAT would be the start of us seeing some real crazy crashes, because the mods would be all over the place and the consequences would be unpredictable.

I'd prefer to have Tesla gathering info from the entire fleet, testing, and then distributing updates, as opposed to John Smith modifying the code with his laptop in a garage to sell for a few bucks.
 
When I'm on AP, it starts braking a long way back when the car in front slows. Maybe if you have it at 1 car length... mine is set at 5:

Tesla on autopilot rear-ended Connecticut cop car as driver checked on dog: police

And... a fire truck:

NTSB report says California Tesla driver was using Autopilot when he hit a firetruck

This, to me, is the biggest problem with Autopilot discussions.

People assume one incident is like another.

You posted two examples that have entirely different ADAS systems behind them, but you posted them as if they were the same.

The one with a firetruck was an AP1 Tesla.
The one with the CT cop car was an AP2.5 Tesla (or possibly HW3).

What's interesting is that, relatively speaking, it doesn't happen all that much. Back with AP1 a couple of years ago there was a rash of them, but not a whole lot since then. Sure, it happens, but it certainly didn't seem to scale with the volume of sales.

So I think people are making a big deal about a problem where the opposite (false braking) is likely a bigger issue.
 
I very much doubt this is true. Coming up on stopped traffic is a known issue with every manufacturer's adaptive cruise system because they're designed to ignore most stationary objects they detect. You need to exercise due caution when using any of these systems as of now. I had a couple of incidents in my Volt where I had to manually slow down while on adaptive cruise as I approached stopped vehicles.

Not all adaptive cruise control systems are the same.

You're thinking of systems that rely predominantly on radar, but the Subaru system isn't radar-based. Instead it's a stereo-camera-based system that doesn't have any issue detecting stopped cars.

What I'd love to see is a shootout between Tesla HW2.5 vehicles and Subarus with EyeSight, to see which will stop for a suddenly revealed stopped car, like when the car in front suddenly changes lanes and reveals a stopped car ahead.

My money would be on the Subaru system.
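
To make the "designed to ignore stationary objects" point concrete, here is a minimal sketch, in Python, of the kind of filtering heuristic radar-based adaptive cruise systems are commonly described as using. It is hypothetical (the names, thresholds, and structure are illustrative assumptions, not any manufacturer's firmware): a return whose ground speed is roughly zero looks like a sign, bridge, or guardrail, so it is dropped unless the track was previously seen moving.

```python
# Hypothetical sketch of why radar-based ACC tends to ignore already-stopped cars.
# Names, thresholds, and structure are illustrative assumptions, not real firmware.

from dataclasses import dataclass
from typing import List

@dataclass
class RadarTarget:
    track_id: int
    range_m: float           # distance to the return, in meters
    rel_speed_mps: float     # speed relative to our car (negative = closing)
    previously_moving: bool  # was this track ever observed with nonzero ground speed?

def ground_speed(target: RadarTarget, ego_speed_mps: float) -> float:
    """Target's speed over the ground: our speed plus the relative speed."""
    return ego_speed_mps + target.rel_speed_mps

def select_follow_targets(targets: List[RadarTarget], ego_speed_mps: float,
                          stationary_threshold_mps: float = 1.0) -> List[RadarTarget]:
    """Keep targets worth following or braking for; drop 'stationary clutter'."""
    selected = []
    for t in targets:
        moving_now = abs(ground_speed(t, ego_speed_mps)) > stationary_threshold_mps
        # A car the radar watched come to a halt keeps its track history and is kept.
        # A car that was already stopped when first detected looks like roadside
        # clutter and is filtered out, which is exactly the firetruck scenario.
        if moving_now or t.previously_moving:
            selected.append(t)
    return selected

if __name__ == "__main__":
    ego = 30.0  # our speed in m/s, roughly 67 mph
    returns = [
        RadarTarget(1, 80.0, -5.0, previously_moving=True),    # slower lead car: kept
        RadarTarget(2, 60.0, -30.0, previously_moving=False),  # already-stopped truck: dropped
        RadarTarget(3, 40.0, -30.0, previously_moving=True),   # car that braked to a stop: kept
    ]
    for t in select_follow_targets(returns, ego):
        print(f"following track {t.track_id} at {t.range_m} m")
```

A stereo-camera system like EyeSight classifies the object directly rather than leaning on track history, which is part of why it can have an easier time with cars that were already stopped.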
 
Key word being “new” vehicles.

If they try to retroactively alter cars that have already been purchased with autonomy, the outcome would be to push people into illegally modding (hacking) their cars.

THAT would be the start of us seeing some real crazy crashes, because the mods would be all over the place and the consequences would be unpredictable.

I'd prefer to have Tesla gathering info from the entire fleet, testing, and then distributing updates, as opposed to John Smith modifying the code with his laptop in a garage to sell for a few bucks.

I doubt anything retroactive would happen in the States. That doesn't typically happen except in extreme cases, and I don't think this is an extreme case.

BUT, and there is a but.

It did happen with AP1 when Tesla added the timed nagging as a result of pressure from the NHTSA. It's been in both the AP1 and AP2 firmware for a while. As a result, we have seen some real crazy crashes caused by torque-sensor defeat devices. I'm pretty sure those "mods" really are all over the place, because we have videos of people sleeping with AP on, and we know that's not possible without the defeat devices.

I wouldn't rule out retroactive changes in Europe, since they have, after all, already restricted AP2 capabilities retroactively.
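
For anyone wondering how a weight on the wheel defeats the nag, here is a rough sketch of what a timed steering-torque check looks like. This is hypothetical logic with made-up interval and threshold values, not Tesla's actual firmware; it only illustrates why constant torque from any source satisfies the check.

```python
# Hypothetical sketch of a timed "torque nag." The interval, threshold, and
# escalation counts are made-up illustrations, not Tesla's actual values or logic.

import time

NAG_INTERVAL_S = 30.0       # assumed time allowed without detected steering torque
TORQUE_THRESHOLD_NM = 0.3   # assumed minimum torque counted as "hands on the wheel"

class TorqueNag:
    def __init__(self) -> None:
        self.last_torque_time = time.monotonic()
        self.warnings = 0

    def on_torque_sample(self, torque_nm: float) -> str:
        """Return 'ok', 'warn', or 'disengage' for each torque-sensor reading."""
        now = time.monotonic()
        if abs(torque_nm) >= TORQUE_THRESHOLD_NM:
            # Any torque above the threshold resets the timer, whether it comes
            # from a human hand or from a weight wedged against the wheel,
            # which is all a "defeat device" really does.
            self.last_torque_time = now
            self.warnings = 0
            return "ok"
        if now - self.last_torque_time > NAG_INTERVAL_S:
            self.warnings += 1
            self.last_torque_time = now  # restart the clock after each escalation
            return "disengage" if self.warnings >= 3 else "warn"
        return "ok"

if __name__ == "__main__":
    nag = TorqueNag()
    print(nag.on_torque_sample(0.5))  # 'ok': torque detected, timer resets
    print(nag.on_torque_sample(0.0))  # 'ok': no torque, but the interval hasn't expired yet
```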
 

Yet, you had to go all the way to Russia to find an example. :p

Actually, you could have found examples from verygreen's Twitter which clearly demonstrate that Tesla has issues detecting objects stopped at the side of the road.

Anyone who drives a Tesla with Autopilot needs to be extra vigilant when it comes to stopped objects, especially when one is off to the side of the lane.
 
It's especially the partial lane blocking that is hard to detect. Police and fire responders have a terrible habit of partially blocking a lane, with only half or less of their car sticking out into a highway lane, which makes it harder to detect from a distance for both humans and driver-assistance systems. They should stop that practice: either fully block the lane, or stay fully on the shoulder or in the adjacent lane.
 
Also, there is a significant difference between the automatic emergency braking and maneuvering capabilities of competing cars and their Autopilot-like driver assists. They are not the same thing.

The latter are often more rudimentary by design compared to Tesla's, while the former are often superior to Tesla's. Many car companies focus on safety features and are more conservative about convenience automation, so their lane keeping may be stupid while their emergency recognition and action are top notch. Many are also based on Mobileye, which is very mature vision technology for these kinds of situations.

So I would not confuse what a competing "Autopilot" does with what that car's emergency capabilities may be, as these are two very different things. Like @S4WRXTTCS suggests, my money would often be on the competitor rather than Tesla when it comes to emergency action (assuming a well-equipped, late-generation competitor, of course), and vice versa for convenience assists, where Tesla leads (assuming a well-equipped, late-generation Tesla, of course).

The companies simply focus on different things and have different technology stacks to work with.

(I am excluding autonomous cars like Waymo here. Talking about cars on the market.)
 
Don't mind the ramblings of Dick. He runs his mouth anytime there is a microphone nearby regardless of the topic, relevance, or facts.

The driver flat-out admitted that he had it on Autopilot and was not paying attention while attending to a dog in the back seat. There is no debate or question as to whether AP was on. So it's best to stop with the speculation trying to make it less of a problem.

This is, and always will be, a known limitation of radar. It's made very clear in the manual. The accident is no surprise, has happened before, and will happen again. The fault is simply driver inattention, made easier by Autopilot.
 
Yet, you had to go all the way to Russia to find an example. :p

Actually, you could have found examples from verygreen's Twitter which clearly demonstrate that Tesla has issues detecting objects stopped at the side of the road.

Anyone who drives a Tesla with Autopilot needs to be extra vigilant when it comes to stopped objects, especially when one is off to the side of the lane.

There are hundreds of examples, but Tesla fans just brush them off. So I usually go with the most egregious.