
Police charge Tesla driver who was going 90 mph on AP while asleep


diplomat33
Canadian police have charged a 20-year-old man with dangerous driving after he was arrested for allegedly being asleep at the wheel of a 2019 Tesla Model S while it was operating in its semi-autonomous "Autopilot" mode.

The Royal Canadian Mounted Police (RCMP) said it received a complaint of a car speeding near Ponoka, Alberta at around 4 pm on July 9.
The vehicle was traveling at more than 140 km/h (87 mph), with both front seats "completely reclined and both occupants appearing to be asleep," the RCMP said.

When a police officer approached the vehicle with emergency lights, the Tesla "automatically began to accelerate" to 150 km/h (93 mph), the RCMP said.

After pulling him over, the officer charged the driver, who is from British Columbia, with speeding and suspended his license for 24 hours. After further investigation, police charged him with dangerous driving and he is summoned to appear in court in December.

Canadian police charge Tesla driver who was allegedly asleep at the wheel at 90 mph - CNN
 
If those RCMP were a bit more clever, they would have gone in front and slowed down. Or maybe that's what they mean by "pulled him over."

It looks like the blocking-in-front technique was not used; they chased from behind in the usual way.

Otherwise, how do you explain the report that "When a police officer approached the vehicle with emergency lights the Tesla 'automatically began to accelerate' to 150 km/h (93.2 mph)"?

Notice that Tesla's automation only works up to 90 mph, but here the car reached 93.2 mph, which means the automation was not the one doing the speeding once 90 mph was exceeded.
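As a quick sanity check on that point, here is a minimal conversion sketch; the 90 mph cap is the claim above, not something verified here:

```python
# Convert the reported speed and compare it to the claimed Autopilot cap.
# The 90 mph Autosteer maximum is the poster's claim, taken here as an assumption.
KM_PER_MILE = 1.609344

reported_kmh = 150
reported_mph = reported_kmh / KM_PER_MILE   # ~93.2 mph
autopilot_cap_mph = 90                      # assumed Autosteer maximum set speed

print(f"{reported_kmh} km/h = {reported_mph:.1f} mph")
print("Above the claimed Autopilot cap:", reported_mph > autopilot_cap_mph)  # True
```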
 
maybe luck, maybe Autopilot. Or maybe this was just a prank.

If it was a prank, it was a reckless one.

AP is good in a lot of situations but it's not THAT good. It's not full self-driving. We've all been in situations where AP/NOA could not handle a situation and required a driver intervention. If one of those situations had happened during this "prank", it could have been a serious accident.
 
Lucky coincidence.

How can it be a lucky coincidence? It's not as if the car can only drive in a perfectly straight line and the road just happened to be straight. It is a system designed to keep the car in its lane, and it functioned successfully. I'm not saying it would do this enough times to be considered perfect, but I'm not going to call a system that did what it was intended to do a coincidence.

That's like saying a person driving down the road at 150 km/h and not getting into an accident is a coincidence. If a Tesla got into a situation it isn't even designed to handle and miraculously maintained control, keeping the car and the driver safe, then we could call that a coincidence.
 
...I'm not going to call a system that did what it was intended to do a coincidence...

It is a lucky coincidence that they did not encounter a situation that would cause AP to fail. They would not have been able to prevent an accident if AP had failed.
 
Your definition of a lucky coincidence is a lot different from mine. I've been in Teslas that have gone well over 200 km before we reached our exit without a single intervention, on the exact same road where this arrest took place in Alberta, very close to where I live. I watch videos of Tesla updates all the time to see how it is improving, where they almost never need to take over either, and some of the things that do cause problems, like the odd lights over highways, I've never seen in Canada.

Now I will say that they were lucky nothing happened to make Autopilot fail, just like I would say anyone driving at 150 km/h is lucky when nothing happens to make them lose control. But a coincidence? No, I don't think so; by that logic, every drive on Autopilot that doesn't end in a failure would be a coincidence. I also don't think speed would be a big factor in making it fail: with the distance at which the cameras can detect other cars, even if one were stopped on the highway, the car would still have enough time to stop.

Actually, if Autopilot did fail, then unless the driver's foot was resting on the accelerator, the car would still try to keep itself in the lane until it came to a complete stop. If the driver's foot was resting on the pedal, which I've already mentioned as a possibility in a previous post, then they would probably end up in a ditch and possibly die, but I wouldn't put that down as a fault of Autopilot.
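To put rough numbers on the "enough time to stop" point, here is a back-of-the-envelope sketch; the deceleration, reaction delay, and detection range below are illustrative assumptions, not figures from this thread or from Tesla:

```python
# Rough stopping-distance check at the speed reported in this incident.
# All three inputs are assumptions chosen for illustration only.
speed_kmh = 150
speed_ms = speed_kmh / 3.6            # ~41.7 m/s

decel_ms2 = 7.0                       # assumed hard-braking deceleration, dry pavement
reaction_time_s = 0.5                 # assumed system reaction delay
detection_range_m = 250               # assumed forward detection range

braking_distance = speed_ms ** 2 / (2 * decel_ms2)
total_distance = speed_ms * reaction_time_s + braking_distance

print(f"Stopping distance at {speed_kmh} km/h: ~{total_distance:.0f} m")          # ~145 m
print("Within the assumed detection range:", total_distance < detection_range_m)  # True
```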
 
...I've been in Teslas that have gone well over 200 km before we reached our exit without a single intervention...
I could drive drunk way more than 200km without getting in an accident. Should I?
 
...where they almost never need to take over...

An unlucky scenario would be a disabled, stationary vehicle on the freeway, with police manually giving hand signals with flares or flashlights from far away to direct drivers out of that lane, while Autopilot is in that same lane driving at 90 mph.

There have been three documented fatal Autopilot accidents in the US involving the "stationary obstacle" scenario: two in Florida, where the cars hit semi-trailer trucks, and one in Mountain View, CA, where the car hit the start of a concrete divider at a gore point.

The US government investigations into those fatal collisions each confirmed that Autopilot is not designed to brake for that kind of "stationary" obstacle.

Another unlucky scenario is fresh road debris, such as furniture that has fallen off a vehicle: the cars in front can manually maneuver around it, but when it's Autopilot's turn at 90 mph with no manual intervention, the result could be fatal.

Autopilot doesn't avoid a big unrepaired pothole either, and a sudden flat tire, or multiple flats, at 90 mph on Autopilot could be problematic: Autosteer would disengage due to the sharp steering torque from a tire blowout.
 
I could drive drunk way more than 200km without getting in an accident. Should I?
So you are saying that you can drive drunk better than Autopilot can? I highly doubt that, since drunk driving claims over 10,000 lives per year and it's not something to take lightly. Either you don't know what you are talking about, or you have done it and only assume you were driving perfectly fine with complete control. If you have driven drunk at all, you have no standing to argue about safety. So instead of making up scenarios that aren't real, maybe we could have a civilized conversation like adults.

Now, if you were to say that you could drive over 200 km/h without getting into an accident, that would be a more reasonable comparison, and no, you shouldn't do that. I also never said that the Tesla should be driving at 150 km/h on Autopilot, or that it should be on Autopilot at all without supervision. I was merely stating that calling it a lucky coincidence is wrong by the very definition of the word coincidence, given that the Tesla was doing exactly what it was programmed to do (not counting that the driver must have rigged something to avoid touching the steering wheel).

There isn't a question about whether what was happening was right or wrong; it was wrong. In the eyes of the public, the law, and even Tesla, that driver did not do what they were supposed to do. That doesn't mean the system didn't at least have control, and I think that instead of just assuming the car had, against all odds, driven properly, we could recognize how far these systems have come in keeping us safe. What if someone passed out at the wheel from a diabetic episode, or had a seizure, or anything else that suddenly left them unable to drive without time to pull over? Instead of talking about how dangerous the car was, we would be talking about how amazing the car was to basically keep that person alive.

Something that would have made it much better is if the car could have pulled over for emergency lights. Someone has already dug into the Tesla software and found 3D models of emergency vehicles for display on the Tesla screen, so I'm assuming we will have that feature soon.
 
Well King, I guess this case is closed

 
Absolutely: being super drunk (0.14% BAC) increases your odds of an accident by about 10x. Accidents happen on average roughly every 500k miles, so a super-drunk driver could go about 50k miles between accidents. I doubt you could use Autopilot that way for 50k miles without crashing.

https://www.washingtonpost.com/news...inks-make-your-odds-of-a-car-crash-skyrocket/
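For what it's worth, a back-of-the-envelope sketch of that comparison; the 10x multiplier and the 500k-mile baseline are the figures quoted in this post, not independently verified:

```python
# Rough sketch of the drunk-driving comparison above.
# Both inputs are the figures quoted in the post, taken as assumptions.
baseline_miles_per_accident = 500_000   # average miles between accidents (assumed)
drunk_risk_multiplier = 10              # risk increase at ~0.14% BAC (assumed)

drunk_miles_per_accident = baseline_miles_per_accident / drunk_risk_multiplier
print(f"Expected miles between accidents when super drunk: {drunk_miles_per_accident:,.0f}")
# -> 50,000
```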

I am sorry for arguing with you. Not because you have clearly conveyed your argument, but because you have made it clear that you are not worth speaking to.