This phantom braking issue is a genuinely dangerous problem. Hot off the press here in Norway: a three-car chain collision on the E6 highway attributed to Tesla Autopilot phantom braking.
(Original Norwegian headline: "Kjedekollisjon på E6 etter at Tesla på autopilot bråbremset" — "Chain collision on the E6 after Tesla on Autopilot braked hard")
"Drove on autopilot.
Police chief Anne Marie Dypdahl tells Adresseavisen (Ed: the newspaper) that it was a Tesla on autopilot that caused the collision.
'It looks like a truck was coming in the southbound direction, and that the Tesla, which was travelling northbound with Autopilot on, registered an obstacle in the road. This caused it to automatically apply full braking. As a result, the Tesla was hit by a car from behind, and that car was in turn hit by a truck from behind.'
'It looks like it was the Autopilot mode that made the Tesla slam on the brakes. The driver was unable to regain control.'"
In the UK, certainly, the principle is that you must drive so that you can maintain safe control regardless of what the traffic ahead might do.
In practice, though, we all assume that when the road ahead is clear, we don't have to allow for the car in front slamming on its brakes for no reason.
However, as with many news items, there may be information we don't yet have. Maybe the Tesla had some other failure? I wonder whether 'obstacle-aware acceleration' was active, and if so, whether stamping on the accelerator would override the automatic braking. This is essential information: can the driver ALWAYS override automatic control? If not, then Tesla must also be liable where it's proven that their automatic systems created an elevated risk or danger that directly caused an accident.
Anyway, hopefully more details will emerge eventually. My concern about AP is that it encourages drivers to believe they don't need to concentrate as hard on driving, when in reality nothing could be further from the truth.