
Autonomous Car Progress

Actually, no, because I specifically said you let them borrow it legally.

Lending it to a drunk driver would be illegal, which is why I specifically said that.

And I called that out again, specifically, in the second post when JB seemed unclear on this, which was over an hour before your reply above.
I took "legally" to mean, in context, that the car was not, for example, stolen. The person in question legally borrowed the vehicle, even though the act of driving it may have been done in an illegal manner.

I can give a related example. If the owner knows the car drives like a drunk driver or has serious flaws and lets it operate anyway, they can be held criminally liable. That's probably the most relevant point to what you were responding to:
So if I own a Tesla robotaxi for personal use, I'm responsible if it does anything wrong.
The VW/Mobileye L4 example will be more instructive. The operator, owner, and vehicle manufacturer is VW, but the software and processors come from Mobileye.

There will probably be future combinations where three or more parties may be responsible.
 
I answered a question about autonomous cars because that's the comment you responded to, in a thread about autonomous car progress. I have no interest in a discussion about the ethics or laws surrounding conventional cars. Not in this thread.


The problem with your reply is that the same legal principle applies in both cases. If the owner has elected to let someone else drive, and has no basis for doubting that person is legally allowed to drive and competent to do so, then the driver, not the owner, is at fault in an accident.

You seem to realize that debunks your argument, but want to pretend it doesn't by insisting it's "different" when software is "the driver" instead of a human (without providing ANY legal reasoning why that would be true), and thus refuse to answer the very relevant conventional car question I asked.
 
You seem to realize that debunks your argument, but want to pretend it doesn't by insisting it's "different" when software is "the driver" instead of a human (without providing ANY legal reasoning why that would be true), and thus refuse to answer the very relevant conventional car question I asked.
I never had any idea that my thinking was so convoluted. Thanks for the clarification.
 
I never had any idea that my thinking was so convoluted. Thanks for the clarification.


[Image: Socratic-Method-Win-Argument meme]
 
Not sure why you are posting this here; did you go to the wrong thread?

This is the thread for FSD Beta 11:
11.3.x is mainline FSD
11.4.x is FSDb?
12.x is for Tesla insiders and Elon?
 
11.3.x is mainline FSD
11.4.x is FSDb?
12.x is for Tesla insiders and Elon?
As mentioned, it doesn't really matter whether you consider 11.3.x to be FSD Beta or not (which, BTW, it generally is considered to be); the point is that it's not on topic for this thread.


This thread isn't about FSD Beta or FSD mainline; it's about the progress of autonomous vehicles in general. There are plenty of other threads on FSD Beta or FSD where it is appropriate to post and where existing discussion is already taking place.
 
If someone (legally) borrows your car and runs someone over, are you, as the owner who chose to let another driver drive it, criminally responsible for the death, or is the actual driver responsible?
Again, criminal liability and civil liability are two completely different things. The former exists to dissuade reckless or immoral behavior, while the latter assigns responsibility for loss to make sure innocent victims are made whole. In your specific example, only the person driving the car can be held criminally responsible for any behavior resulting in the death because the driver is the only one that can have the requisite intent (assuming, of course, that the owner did not know what the driver was going to do and did not criminally or recklessly disregard the likely outcome).
 
This thread is for autonomous driving that is not Tesla.

The first post in this thread:

This thread is just a stub to continue discussion on autonomous car progress by other manufacturers and avoid the continual hijacking of other useful threads to discuss this subject.

To be fair, it's hard to keep a 500-page thread on topic. I was arguing that the "Elon FSD Beta Tweets" thread should be for discussing Elon's tweets about FSD Beta specifically, but that suggestion got shot down.
 
We’re all buds
Let's do a better job with the thread titles

“Autonomous Driving Progress - Non Tesla”
This thread was created 6 years ago, when there was a lot less discussion of autonomous vehicles and well before FSD Beta or even the FSD computer was a thing. As per the first post, this thread was created because people were hijacking other threads to discuss AVs. Deep discussion of FSD here is kind of the flip side of that.

To be clear, I'm not saying all discussion of FSD or FSD Beta is off topic for this thread. If it were a comparison with other manufacturers, or it related to autonomous vehicles in general (for example, something major like Tesla officially announcing a move to L3 or L4), that could certainly still make sense to post here.

But general impressions of specific FSD Beta releases have much better and more on-topic threads where they can be posted, and they don't have much to do with general AV progress.
 
Again, criminal liability and civil liability are two completely different things.

But neither would attach to the OWNER (rather than the driver) in the lawful lending of a vehicle in my original example.

And both would attach to the OWNER (in addition to the driver) in the unlawful/reckless lending of a vehicle in the latter example.

So there's no real point to your distinction in those examples.

Further-

In your specific example, only the person driving the car can be held criminally responsible for any behavior resulting in the death because the driver is the only one that can have the requisite intent

Again, your understanding of the law is not accurate here.

Criminal liability does not require intent for all potential criminal charges.

Involuntary manslaughter is an example of a criminal charge covering the killing of a human being without the intent to do so, either express or implied... (and many states have laws specifically covering our second case, where you kill someone while driving intoxicated even if you had no intent to harm anyone)
 

The announcement was on July 27, so it's "old news" by now. But it illustrates Cruise's more aggressive scaling plans. They plan to be in, I think, 7 cities and have thousands of robotaxis next year. The plan seems to be to launch in a bunch of cities quickly and then scale up over time. Each service area will start small. This approach will have the advantage of making it look like they are scaling fast, since they will be announcing a lot of new cities. But the challenge will be making each city a meaningful service area. I think both Waymo and Cruise are under quite a lot of pressure to generate revenue, so we will see both companies launch robotaxis in more cities more quickly. Waymo announced they will be launching a robotaxi service in Austin this fall.
 