
Tesla 3 crashes into overturned truck on highway

My hypothetical question is, let's say that in the future I'm in my own car, in the driver's seat, with Fully Autonomous Driving engaged on a freeway with a posted speed limit of 65 mph. If I roll the jog wheel up to 80 mph, and the Level 5 car dutifully complies and accelerates to 80 mph, would I be liable for a speeding ticket? It seems a bit unfair to put Tesla on the hook for my decision to have the car speed.


To make it interesting, let's go down the slippery slope.

Hypothetical scenario B. You are a passenger in a Tesla Robotaxi, in the back seat. To be a safer and more realistic driver, the Tesla Robotaxi decides to drive at 70 mph to keep up with the flow of traffic. A CHP officer decides to pull over the car. Clearly, as a passenger, you would not be liable for a ticket; I suppose the car itself would get a ticket, and Tesla would have to pay for it.

Hypothetical scenario C. You are a passenger in your own privately owned Tesla, in the back seat. The Tesla decides to drive at 70 mph to keep up with the flow of traffic. A CHP officer decides to pull over the car. Clearly, as a passenger, you would not be liable for a ticket; I suppose the car itself would get a ticket, and Tesla would have to pay for it, as the Tesla system was driving, even though you own the car.

Hypothetical scenario D. You are a passenger in your own privately owned Tesla, in the driver's seat. The Tesla decides to drive at 70 mph to keep up with the flow of traffic. A CHP officer decides to pull over the car. You are in the driver's seat, but you make the case to the officer that you are a passenger and should not be liable for a ticket. How the hell would that work? Hopefully you could play back the cabin camera (assuming you have a Model 3 or Model Y) to show you weren't driving.
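If the car kept a queryable log of control handoffs, "prove I wasn't driving" becomes a simple lookup rather than an argument at the roadside. Here's a minimal sketch of what such a drive-state log might look like; the event structure and names are my own invention, not anything Tesla actually records:

```python
from bisect import bisect_right
from dataclasses import dataclass

@dataclass
class ControlEvent:
    """One change in who (or what) is controlling the vehicle."""
    timestamp: float   # seconds since trip start
    controller: str    # "HUMAN" or "AUTONOMY"

class DriveStateLog:
    """Append-only record of control handoffs, queryable by time."""

    def __init__(self) -> None:
        self._events: list[ControlEvent] = []

    def record(self, timestamp: float, controller: str) -> None:
        self._events.append(ControlEvent(timestamp, controller))

    def controller_at(self, timestamp: float) -> str:
        """Who was driving at a given moment, e.g. when the radar gun fired?"""
        times = [e.timestamp for e in self._events]
        i = bisect_right(times, timestamp)
        if i == 0:
            raise ValueError("no control event recorded before that time")
        return self._events[i - 1].controller

# The officer's question, answered from the log:
log = DriveStateLog()
log.record(0.0, "HUMAN")         # owner pulls out of the driveway
log.record(45.0, "AUTONOMY")     # autonomous mode engaged on the freeway
print(log.controller_at(600.0))  # -> "AUTONOMY": the system was driving
```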


Note: Getting pulled over and ticketed for going 5 mph over the speed limit is unlikely, but does happen (happened to me). Maybe the cop doesn't like Teslas, or it's the end of the month and they haven't got as many tickets in as usual, or you got pulled over anyway for not having a front license plate.

When a tire blows out and you crash into something, is the tire manufacturer to blame? In a few specific cases, like Firestone and Ford, yes; in most, it's called an accident.

Similarly, a theoretical Level 5 system would have been attested to adhere to certain autonomous-driving regulations and be equipped with system redundancies and safeties to prevent malfunction, but if it fails, in most instances it would be deemed an accident.
 
True, but I wasn't talking about a robotaxi. I was talking about using my own car to self-drive me somewhere I want to go. If I'm instructing the car to break the law, wouldn't I be liable for, say, speeding? Though, point taken, it's still Level 4 / 5 if it doesn't require me to do anything.

It is not possible to instruct a L4 car to break the law. When in autonomous mode, a L4 car would be doing all the driving, with no input from the driver at all, and would presumably be programmed to always follow the law. You are just the passenger in a L4 car.

Now, if the L4 car were pulled over for speeding, the car would be responsible for the ticket if the autonomous mode was on at the time of the speeding. Of course, like I said above, a L4 car would be programmed to follow the law, so I think it is highly unlikely that a L4 car would ever get pulled over for speeding.

Level 4 still requires human controls (steering wheel, foot pedals...), so my guess is that if there's a human driver in a L4 car, the human is still responsible for using those controls to override L4 mistakes.

Incorrect on both counts.

Steering wheel and pedals are optional in a L4 car. Remember that L4 is completely autonomous (driverless) within its ODD which can include geofencing. So if the car is restricted to a geofenced area for example, there is no need for a driver at all, therefore no need for a steering wheel or pedals. An example of this might be an airport shuttle that is locked to a route between the parking lot and the airport. It would not need a steering wheel or pedals and would be considered L4.

The SAE does not define the levels based on having a steering wheel or pedals because they are not a defining characteristic. A L4 car could have a steering wheel and pedals if the maker wanted to give the owner the option of manual driving sometimes, or it could remove the steering wheel and pedals if it was meant to be strictly autonomous all the time, like my airport shuttle example. Likewise, even a L5 car could have a steering wheel and pedals if the maker wanted to give the owner the option of manual driving sometimes.

So we can say that if a car has no steering wheel/pedals, then it must be L4 or L5. But we cannot say that if a car is L4 or L5, then it will not have a steering wheel or pedals.

L4 means that the car is completely responsible for both the driving itself and the fall-back if something goes wrong within its ODD. So the human is never responsible for overriding L4 mistakes as long as the car is in autonomous mode. The human is only responsible when autonomous mode is turned off, since the car would not be driving autonomously at that point anymore.
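To make that responsibility split concrete, here's a rough paraphrase of the SAE J3016 levels as a data structure. The field names are mine, but the who-drives / who-is-the-fallback split follows the standard, and notice that steering wheels and pedals appear nowhere:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SaeLevel:
    level: int
    who_drives: str       # who performs the dynamic driving task (DDT)
    who_is_fallback: str  # who handles the fallback when something goes wrong
    odd_limited: bool     # True if restricted to an operational design domain

# Paraphrase of the SAE J3016 responsibility split. Nothing here
# mentions steering wheels or pedals -- they are not defining.
SAE_LEVELS = {
    2: SaeLevel(2, "human (system assists)", "human", True),
    3: SaeLevel(3, "system", "human (on request)", True),
    4: SaeLevel(4, "system", "system", True),
    5: SaeLevel(5, "system", "system", False),
}

def human_must_override(level: int) -> bool:
    """Is the human responsible for catching the system's mistakes?"""
    return SAE_LEVELS[level].who_is_fallback.startswith("human")

assert human_must_override(2) and human_must_override(3)
assert not human_must_override(4)  # in a L4 car, the car is its own fallback
```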
 
It is not possible to instruct a L4 car to break the law. When in autonomous mode, a L4 car would be doing all the driving, with no input from the driver at all, and would presumably be programmed to always follow the law. You are just the passenger in a L4 car.
Is that in the SAE specification, or a logical assumption? If it's a logical assumption, then an analogy would be no auto maker would manufacture a car capable of driving over the speed limit, much less having a top speed of 162 mph. Yet we know this isn't the case.

Most of what I've read states driver input is "not required" but doesn't say "not allowed." I'm specifically thinking of setting the speed with the jog wheel, like you do with TACC/AP, not the accelerator. If this is allowed under Level 4, then liability may be shared.

Now, if the L4 car were pulled over for speeding, the car would be responsible for the ticket if the autonomous mode was on at the time of the speeding. Of course, like I said above, a L4 car would be programmed to follow the law, so I think it is highly unlikely that a L4 car would ever get pulled over for speeding.

Sheesh, a cross country trip at 65-70 mph under FSD is going to be a pain. Will have to fall back on AutoPilot, which I suppose is fair that I'll have to pay attention.

Assuming the car never breaks the speed limit, it still could get pulled over by a cop making a mistake, say mixing up 2 cars, or the radar gun hit the wrong car. In California, we have the "basic speed law" which lets a cop decide what the speed limit is regardless of posted signs based on driving conditions (limit may be 0 mph if the officer deems it unsafe to drive at all). If s/he decides the speed limit is 55 mph on the highway, while the car was doing 65 mph, they could cite the car/driver for speeding.

Anyway, interesting discussion. I found a Wikipedia page about Self-Driving liability, but it seems to be accident centric.
 
Is that in the SAE specification, or a logical assumption? If it's a logical assumption, then an analogy would be no auto maker would manufacture a car capable of driving over the speed limit, much less having a top speed of 162 mph. Yet we know this isn't the case.

Most of what I've read states driver input is "not required" but doesn't say "not allowed." I'm specifically thinking of setting the speed with the jog wheel, like you do with TACC/AP, not the accelerator. If this is allowed under Level 4, then liability may be shared.

It is more of a logical assumption. Sure, driver input could still be allowed in a L4 car if the designer wanted to give the passenger some say in how the L4 drives. But L4 does not require driver input. And if your L4 was good, then I don't think you would really want driver input. The whole point is to have a L4 that works really well, where driver input is not needed. And if you achieve that, then you would want to trust the L4. So you would not want the passenger to mess around by telling the car to go faster than it wants to, for example.
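To illustrate that design choice, here's a toy version of how a L4 setpoint arbiter might treat the jog wheel. This is entirely hypothetical (the function and its cap are made up): the passenger can always ask for less speed, never for more than the system's own legal ceiling.

```python
def arbitrate_set_speed(requested_mph: float,
                        posted_limit_mph: float,
                        system_max_over_mph: float = 0.0) -> float:
    """Hypothetical L4 setpoint arbitration: the jog wheel becomes a
    request, not a command. The passenger may ask for less speed than
    the system would choose, but never more than its legal ceiling."""
    system_ceiling = posted_limit_mph + system_max_over_mph
    return min(requested_mph, system_ceiling)

# Passenger rolls the jog wheel up to 80 in a 65 zone:
print(arbitrate_set_speed(80, 65))  # -> 65: request capped, nobody speeds
print(arbitrate_set_speed(60, 65))  # -> 60: slowing down is always honored
```

Under a scheme like that, the liability question mostly evaporates: the car never executes an illegal request, so there is no shared blame to apportion.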

Sheesh, a cross country trip at 65-70 mph under FSD is going to be a pain. Will have to fall back on AutoPilot, which I suppose is fair that I'll have to pay attention.

Assuming the car never breaks the speed limit, it still could get pulled over by a cop making a mistake, say mixing up 2 cars, or the radar gun hit the wrong car. In California, we have the "basic speed law" which lets a cop decide what the speed limit is regardless of posted signs based on driving conditions (limit may be 0 mph if the officer deems it unsafe to drive at all). If s/he decides the speed limit is 55 mph on the highway, while the car was doing 65 mph, they could cite the car/driver for speeding.

Anyway, interesting discussion. I found a Wikipedia page about Self-Driving liability, but it seems to be accident centric.

Just a point of clarification, I did not mean to suggest that a L4 car would always stay below the absolute speed limit at all times, no matter what. The L4 car would need to stay with the flow of traffic which might be going a bit above the speed limit. There are times that the L4 car might go +5 or even +10 above the posted speed limit in certain cases. What I meant is that I don't think the L4 car would drive recklessly. So a L4 car is not going to go +10 or +15 or more above the speed limit in areas where it would be dangerous. Thus, I don't think a L4 car would ever get a speeding ticket on purpose.
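A toy policy like this captures what I mean (the function and the +10 cap are invented for illustration):

```python
def target_speed_mph(posted_limit: float,
                     traffic_flow: float,
                     max_over: float = 10.0) -> float:
    """Hypothetical L4 speed policy: keep up with surrounding traffic,
    but never exceed the posted limit by more than a hard cap, and never
    go above the limit when traffic itself is at or below it."""
    if traffic_flow <= posted_limit:
        return traffic_flow               # no reason to speed at all
    return min(traffic_flow, posted_limit + max_over)

print(target_speed_mph(65, 72))  # -> 72: rides with the flow of traffic
print(target_speed_mph(65, 85))  # -> 75: refuses to drive recklessly
```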
 
What do you do when you are not driving? Can you do that while being whisked away in FSD? Sleep, eat, entertain, work, chat, etc... Will you get there faster because FSD will drive you all night?

Good point. Bring on the Snake Charger!

That said, I'm pretty sure my wife would prefer to sleep in a hotel / motel room on a cross country trip, than in the car. Back issues.
 
I think that these cars will be rolling piles of filth. The liability risk is very high, and not just from accidents. People shoot up in their own cars; a self-driving car would be worse. You get in the car and get stuck by a used heroin needle. Not to mention protecting passengers from each other. Sticky seats, wet seats. People won't ride in an Uber right now, and they have drivers to keep their cars clean. Driverless cars have no one to make sure they are clean the majority of the time. You have no idea that the seat you are about to sit on was pissed on by a drunk 5 minutes before it picked you up.
 
I think that these cars will be rolling piles of filth. The liability risk is very high, and not just from accidents. People shoot up in their own cars; a self-driving car would be worse. You get in the car and get stuck by a used heroin needle. Not to mention protecting passengers from each other. Sticky seats, wet seats. People won't ride in an Uber right now, and they have drivers to keep their cars clean. Driverless cars have no one to make sure they are clean the majority of the time. You have no idea that the seat you are about to sit on was pissed on by a drunk 5 minutes before it picked you up.

Presumably, you could limit the passengers to 5 star ratings only.
 
...Incorrect on both counts. Steering wheel and pedals are optional in a L4 car.

Thanks for the clarification.

Now that you explained, I remember:

The Google Firefly was L4 with no human controls, and it worked all on its own to take a blind passenger along, as long as it stayed within its predefined parameters and geofencing (maximum speed of 25 mph, pre-defined route...).
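In code terms, an ODD like Firefly's is basically a hard gate the planner checks before every maneuver. A made-up sketch (the segment IDs are placeholders, and a real geofence would be a polygon test rather than a set lookup):

```python
from dataclasses import dataclass

@dataclass
class Odd:
    """Operational design domain for a Firefly-style L4 shuttle.
    The 25 mph cap mirrors the post; the geofence test is a stand-in."""
    max_speed_mph: float
    geofence: set[str]  # allowed road-segment IDs (illustrative only)

    def permits(self, speed_mph: float, segment_id: str) -> bool:
        return speed_mph <= self.max_speed_mph and segment_id in self.geofence

firefly_odd = Odd(max_speed_mph=25.0, geofence={"campus_loop_1", "campus_loop_2"})
assert firefly_odd.permits(20.0, "campus_loop_1")  # inside its domain
assert not firefly_odd.permits(20.0, "us_101")     # outside the geofence
```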

 
I think that these cars will be rolling piles of filth. The liability risk is very high, and not just from accidents. People shoot up in their own cars; a self-driving car would be worse. You get in the car and get stuck by a used heroin needle. Not to mention protecting passengers from each other. Sticky seats, wet seats. People won't ride in an Uber right now, and they have drivers to keep their cars clean. Driverless cars have no one to make sure they are clean the majority of the time. You have no idea that the seat you are about to sit on was pissed on by a drunk 5 minutes before it picked you up.
Very quickly there would be multiple cameras recording the inside of the cars. So the only problem you'd have is the drunk/high problem, where the person is part of a group, somebody else used the app to hail the cab, and they can't control themselves.

You use, you're banned. You smoke, you're banned. You litter, you're banned or pay an additional fee. You spill, you're banned or pay an additional fee.

It's not an anonymous service.
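Meaning every camera-flagged incident can be charged straight back to the account that hailed the ride. A toy version of that policy (the incident names and dollar amounts are invented):

```python
# Hypothetical penalty table matching the rules above.
PENALTIES = {
    "drug_use":  {"ban": True,  "fee": 0},
    "smoking":   {"ban": True,  "fee": 0},
    "littering": {"ban": False, "fee": 50},
    "spill":     {"ban": False, "fee": 50},
}

def apply_incident(account: dict, incident: str) -> dict:
    """Tie a camera-flagged incident back to the hailing account."""
    penalty = PENALTIES[incident]
    account["banned"] = account.get("banned", False) or penalty["ban"]
    account["fees_owed"] = account.get("fees_owed", 0) + penalty["fee"]
    return account

rider = {"id": "user_123"}
print(apply_incident(rider, "spill"))
# -> {'id': 'user_123', 'banned': False, 'fees_owed': 50}
```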
 
What amazes me about the video is not that the Tesla crashed into it, but that several other cars driven by humans also nearly pile into the back of the Tesla (which is half in a truck). It must have been weird lighting or something.

No, it's just that humans have an amazing ability to zone out of boring tasks where nothing remarkable usually happens for long periods. Then when something unexpectedly does, it takes a second to refocus and become aware of the need for emergency manoeuvres.
 