Oh my god! Thoughts people?

Looks like driver error not vehicle error to me.

Autopilot is all well and good but it’s not an excuse to take your eye off the road or feet away from the pedals.

I’d imagine similar incidents have occurred with other vehicles using standard cruise control.

Personally I’m a bit of a control freak when driving, so I use Autopilot rarely (only on motorways, obviously), and when I do I have so little trust in it that I probably pay even more attention. I use the traffic-aware cruise control more as I feel more in control. I didn’t buy my Model 3 for the Autopilot features, but for the battery tech and zero-emission goodness.
 
Firstly, next time please choose a more useful thread title, far too click-baity...! :rolleyes:

Secondly, this incident just confirms a number of things:

  • You need to pay attention!! This is 100% the driver's fault, not Autopilot's. As @CountvonC says, stationary objects and driver assistance systems just do not mix (however sophisticated Tesla's might seem). See this excellent article by Ars Technica: Why emergency braking systems sometimes hit parked cars and lane dividers
  • Autopilot should not be used on city streets or streets with traffic lights; only on dual carriageways and motorways, with the driver paying full attention and ready to take over, as I've said in other threads (and as Tesla's own manual says in the UK/EU). I know the above is a highway, but the point is that city streets are more likely to have stationary objects or a sudden stop in traffic... which driver assistance systems just aren't designed for!
 
Yep. Expected behaviour. And it's not just Autopilot.

Tesla's steering torque system is too easy to abuse. Pressure from safety bodies has been mounting for a long time. Could this investigation be the final nail in the coffin? A Safety Board Faults Tesla and Regulators in a Fatal 2018 Crash. Worth highlighting that the NTSB has no teeth, but quite a big bark.
Yeah, interesting.

While I have a lot of respect for everything Elon has done, and pushed forward in the industry, I do feel strongly that Tesla (and any other manufacturer) should be banned from calling anything "autopilot" or "full self-driving". It is totally misleading and is at least partly why so many people go about using their Tesla driver assistance in an unsafe way.
 
OFC we do not know all the facts, but here is one: just because a car has AP, there will be times when, no matter how good it is, it cannot avoid hitting something. If an object flies into your path and the time and distance are too short to brake to a stop, whether it's you braking or the car in AP mode, you are going to hit it whatever.

The whole point of AP in emergency cases is that it just may react quicker than you as a driver and save your life.
 
This looks similar to the famous American case where the guy was killed when his car ploughed into a white HGV trailer that was crossing his path.

As in that case, at first sight the driver had plenty of time to react. It just reinforces that you need to pay attention when driving, AP or not.

At least this person walked away. Hopefully the logs will show he wasn’t driving safely.
 
It's a poorly written article that then goes on to spur a load of "this is exactly why self driving cars are bad" comments.
The facts are that Autopilot and FSD are currently driver aids, and there's no way anyone sitting in a Tesla would assume otherwise after the torrent of warnings and information stating just that. They may opt to ignore this, but you know full well that you need to pay attention.
Crashes like this show yet again that human error is the biggest issue. People choose to not pay attention, to answer their phones, to drink/smoke and drive, to drive when they are too tired or visibility is too poor.
On the plus side, these people choosing to ignore the warnings and crash only help advance the technology, as Tesla can now look at the crash data and help the fleet avoid this sort of incident.
 
On the plus side, these people choosing to ignore the warnings and crash only help advance the technology, as Tesla can now look at the crash data and help the fleet avoid this sort of incident.

You would think that, but the inability of AP to tell the difference between a stopped truck in front of you and empty tarmac when doing 70mph has been a consistent feature since day 1 of AP's existence.

Yes, the driver should have seen it, but clearly the AP/FSD code is nowhere near ready for driverless cars.
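
For anyone wondering why this has always been the case: it's the same stationary-object problem the Ars Technica article linked further up describes. Radar mostly measures closing speed, and a stopped truck closes at exactly your own speed, just like bridges, gantries and parked cars, so classic radar cruise control filters those returns out to avoid constant false braking. Here's a purely illustrative Python sketch of that filtering logic; it is not Tesla's code, and the threshold, names and numbers are all made up for the example:

```python
# Toy sketch (NOT Tesla's code) of the classic radar-ACC clutter filter:
# returns whose absolute speed is ~zero are discarded, so a stopped truck
# looks no different from an overhead gantry or the road surface.

from dataclasses import dataclass

EGO_SPEED_MS = 31.3          # ~70 mph; assumed ego speed for the example
STATIONARY_THRESHOLD = 2.0   # m/s; below this a return is treated as clutter (made-up value)

@dataclass
class RadarReturn:
    label: str               # only for printing; a real tracker has no labels
    range_m: float
    closing_speed_ms: float  # Doppler-measured speed toward the ego car

def is_trackable(ret: RadarReturn, ego_speed: float) -> bool:
    """Keep only returns that are themselves moving.

    absolute_speed ~= ego_speed - closing_speed. A stopped truck closes at
    exactly ego_speed, so its absolute speed is ~0 and it gets filtered out
    along with signs, bridges and parked cars.
    """
    absolute_speed = ego_speed - ret.closing_speed_ms
    return abs(absolute_speed) > STATIONARY_THRESHOLD

returns = [
    RadarReturn("car ahead doing ~60 mph", 80.0, 4.5),
    RadarReturn("stopped truck in lane", 120.0, 31.3),
    RadarReturn("overhead gantry", 150.0, 31.3),
]

for r in returns:
    print(f"{r.label:25s} tracked={is_trackable(r, EGO_SPEED_MS)}")
# -> only the moving car is tracked; by speed alone the stopped truck is
#    indistinguishable from roadside clutter, so vision (or the driver)
#    has to catch it.
```

That's the general design trade-off the article talks about: loosen the filter and the car slams the brakes for every bridge and roadside sign; keep it and a stationary vehicle in your lane is invisible to the radar track.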
 
As someone pointed out on the other thread, looking closely at the front of the car, it looks like there is some front-end damage, and there was speculation that the radar was broken / rendered inoperative by collision damage. If so, I'd be surprised that FSD would engage if some of the necessary sensors aren't working. Having said that, a rapidly closing radar return would trigger emergency braking... if you're reliant on the front camera, maybe not so much.

Does anyone know which sensors (cameras, ultrasound, radar) need to be working to engage FSD? I wouldn't be surprised if this car has been modified, possibly to disable the hands-on steering nags. Even if FSD computer vision didn't see the truck, radar should be a cross-check, if it's working, albeit the truck body might be made from composites (a stealth truck).