Welcome to Tesla Motors Club

Musk tweets, traffic lights, stop signs and roundabouts are in testing

Elon Musk said it's just like an automatic elevator: if something goes wrong, the property owner's insurance should cover it. However, if it's Tesla's fault, Tesla would cover it.

It sounded like he was talking about the monetary issue, not about who goes to jail.

My guess is: if someone has to go to jail, it'll be the property owner first. The owner will then have to shift the blame to Tesla. But Tesla is a corporation, so it can't walk or be transported to jail. It's just like the NRA, which has never gone to jail no matter how bad things get out there in the U.S.
Bingo! "It's a corporation". There is the key. Private people will not own self-driving cars for that reason - liability. That is why Waymo, Ford, Uber, etc. do not plan to sell to private individuals for the most part. Just as private individuals don't own elevators open to public use, no private individual will want to own a self-driving car that drives on public roads.
 
Then we're talking Level 5, and I guess Waymo would be in the same boat. I would assume Tesla would be liable. But I think that has to be reviewed on a case by case basis.
Why only Level 5? Level 4 allows for no driver in geofenced areas. It also doesn't require immediate driver intervention, so if someone is killed because the car didn't give sufficient warning for the driver to wake up and take over, whose fault is it then?
 
Private people will not own self driving cars for that reason - liability.

A private FSD vehicle just needs to be insurable. The cost of insurance is simply calculated based on the expected payout. Insurance is a corporation taking on the risk.

The first FSD vehicles will be company owned mostly because of costly and unfriendly hardware.
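The "expected payout" framing above can be sketched as back-of-envelope arithmetic. A minimal Python sketch, where every number (claim frequency, average payout, annual mileage, and the insurer's load factor) is made up purely for illustration, not real actuarial data:

```python
# Back-of-envelope premium math for an insurer taking on FSD risk.
# All figures below are illustrative assumptions, not real data.

claims_per_mile = 1 / 500_000   # assumed claim frequency (one claim per 500k miles)
avg_payout = 40_000             # assumed average payout per claim, in dollars
miles_per_year = 12_000         # assumed annual mileage
load_factor = 1.5               # insurer's overhead and profit margin on expected loss

# Expected annual loss = frequency x severity x exposure
expected_loss = claims_per_mile * avg_payout * miles_per_year

# The premium the insurer would charge to take on that risk
annual_premium = expected_loss * load_factor

print(f"Expected annual loss: ${expected_loss:,.2f}")
print(f"Annual premium:       ${annual_premium:,.2f}")
```

Under these made-up numbers the expected loss works out to $960/year and the premium to $1,440/year; the point is only that "a corporation taking on the risk" reduces to pricing that expected payout.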
 
And yet, the guy's car drove 7 miles while the police were figuring out what was going on and then stopping it. Maybe the guy passed out with a hand on the wheel. GM Super Cruise has camera-monitored attention in addition to hands on the wheel. But I know, you'll say it didn't happen, fake news, witch hunt, etc, etc...

I'm not claiming witch hunt. I am Claiming People Exaggerate, even Cops.
But you are kinda Claiming ALL IS FULL TRUTH as reported.
So I'll quote you "Were You there" to see the Full 7 Miles???
My car will not run 7 miles without input. Not even close.
If you own all the Cars in your sig I'd think you'd know that as well... unless you don't have the EAP options, then maybe not.

All in all, if he was Drunk the EAP saved people's lives.
Because Drunk Drivers Pass Out.
In this case it kept him on the road... You can Thank Elon <- That's a Joke Relax...
 
A private FSD vehicle just needs to be insurable. Insurance is a corporation taking on the risk.
No insurance company will insure you against criminal charges such as vehicular manslaughter.
 
...no private individuals will want to own a self driving car which drives on public roads.

*Raises hand*

*Lowers hand*
OK, well I actually don't want one, not because of liability, but because I'd rather use an autonomous ride share service and save bags of money.
 
My car will not run 7 miles without input. Not even close.
You lack imagination. Your car will not run 7 miles without input? How about if you put a weight on the wheel, rest your hand on it, or have your knees offer enough resistance that the wheel thinks you're touching it when it makes adjustments? But if the fact that you can't think of how 7 miles could happen, plus the fact that you didn't personally witness it, means it's fake news and a CHP conspiracy, I guess we'll just have to agree to disagree.
 
Private people will not own self driving cars for that reason - liability. That is why Waymo, Ford, Uber, etc do not plan to sell to private individuals for the most part. Just like private individuals don't own elevators open to use by the public, no private individuals will want to own a self driving car which drives on public roads.

I don't understand this from an economic standpoint. Either the risk profile is good, and there is margin to absorb liability, or the risk profile is bad and the system is financially untenable. Whether a corporation absorbs that liability through direct ownership or through an insurance offering is just an implementation detail -- it will ultimately be decided by what people are willing to pay for.

The reasons that Waymo, Uber, et al. are not planning to sell direct-to-consumer are that (1) they don't have a plan to make cheap cars; and (2) they believe the variable costs of driving will be so low that owning a car does not make sense. I think they are on the wrong side of Jevons Paradox. Driving is going to get so cheap (both financially and from a use-of-time standpoint) that people will drive much more once SDCs are available. Imagine how easy and painless it will be to go on long trips when you can get in your car at bedtime and wake up 600 miles away at a national park, a ski area, or a city where your friends live. I think all these companies are too fixated on the step change in commuting (today's use case) and not enough on the coming step change of consumer behavior (tomorrow's use cases).
 
My son and I saw an Argo - Ford autonomous vehicle - on our block this morning. They were mapping the street for when Ford plans to release its fleet of autonomous vehicles in DC in 2020+. Like Waymo in Tempe, AZ (Google's autonomous cars), Ford plans to run a ride-sharing service within DC itself, only about 7 square miles.

Everyone else besides Tesla is looking at a very small geographical area when they talk about autonomous vehicles. There are no plans for Argo to run in the Maryland or Virginia suburbs.

We have seen Google cars in DC too, with all their lidar equipment on top, but Ford is the company that is working with the DC government.
 
Everyone else besides Tesla is looking at a very small geographical area when they talk about autonomous vehicles.

I am very happy Tesla doesn't go that route. I hate geofencing with a passion, and it's a bad hack compared to the real autonomous driving capability which is to handle all areas with normal map data.
 
I am very happy Tesla doesn't go that route. I hate geofencing with a passion, and it's a bad hack compared to the real autonomous driving capability which is to handle all areas with normal map data.

I wouldn't say geofencing is a hack, but rather a required element while we wait for infrastructure to improve, or for sensing technology to improve.

Basically we have two sides that have to come together for autonomous driving to work.

As an example, I was driving in Seattle the other night in the rain after dropping someone off at the train station. The road I was on had really badly painted lines. I wanted to take a left where I'd have to get in the lane that's shared with the trolley, but I couldn't tell where the lane or turn lane was (if there even was one). My Tesla couldn't see the lines either. So I said screw it, turned right instead, and went back around.

I'm sure if I was more familiar with the road that it wouldn't have been an issue. Generally speaking knowledge=enhanced maps.

What do humans do with incomplete information? We improvise, and we take chances. Or if we're driving a Tesla, we go around the block to avoid the entire thing, because one mishap = 6 months waiting for parts. Better to err on the side of caution.

There are lots of roads that require humans to deal with situations that are really entirely unacceptable.

Sometimes it's poor lane marking.
Sometimes it's broken infrastructure, like potholes or debris from broken lane boundaries.
Sometimes it's water pooled over the roadway, especially in areas with construction, since wrecking the drainage is the first thing construction companies do.

I deal with these things on a daily basis mostly because I know about them. As in I have the enhanced maps inside my own head.
 
I wouldn't say geofencing is a hack, but a required element to allow for infrastructure to improve. Or for sensing technology to improve.

So what they do before sensing technology improves is a shortcut / hack to make up for it.


I deal with these things on a daily basis mostly because I know about them. As in I have the enhanced maps inside my own head.

That's what AI will eventually do: act on similar situations using the cameras and ultrasonics. You have two eyes and act from previous experience; Tesla has cameras, and machine learning will act from previous training (including weird situations).

I do think the issue you are mentioning is far off. One idea is to remote-control the car if it gets stuck in a situation like that. It might not manage to handle a situation, but if it doesn't, it knows it doesn't, and could signal a help desk or something.
 
Your car will not run 7 miles without input? How about if you put a weight on it... But, if the fact that you can't think of how 7 miles could happen plus the fact that you didn't personally witness it, means it's fake news and CHP conspiracy, I guess we'll have to just agree to disagree.

Forehead on the wheel is the only way I could imagine it, but you can't imagine how a cop could possibly exaggerate... You AGAIN insinuate that the 7 miles is FACT, yet somebody else who is 100% right also DID NOT WITNESS IT.
 
I don't understand this from an economic standpoint. Either the risk profile is good, and there is margin to absorb liability, or the risk profile is bad and the system is financially untenable.
Think airplanes. They are extremely tightly regulated, with a strict maintenance schedule where the hours of use of almost every part on the airplane are tracked. A $4M airplane without its maintenance logs is worth about its scrap value, as it will not be allowed to fly in the US and no insurance company will insure it. Why don't many people own airplanes? Because they are expensive to maintain. Airlines have teams of mechanics and can keep all the maintenance logs, which allows them to fly and insurance companies to assume the risk. If the airline doesn't maintain the airplanes properly, insurance will not cover them.

Now think self-driving cars: if the car is going to be driving by itself without a driver onboard, it is logical that the manufacturer will assume the liability. However, in order to assume that liability, they will likely require proof that the vehicle was in proper working order - so aircraft-like maintenance logs, etc. If your car gets into an accident because you scraped a curb the day before and the car was out of alignment, they don't want to own that. Since it would be very hard to tell after the accident whether the car's alignment was off, they will instead simply require an alignment every so many miles, brakes replaced every so many miles, same for tires, lubricants, linkages, etc, etc. - just like an airplane. If you do scrape the curb, the car will have to be inspected by a Tesla-approved mechanic and signed off as fully operational within spec. If you fail to report that you scraped the curb, fines and penalties. If you get into an accident and you cannot produce maintenance logs signed by a certified mechanic, no insurance money for you.

You can buy an airplane today at the same price as some Teslas, yet your per-hour cost of using that airplane will be significantly higher than your Tesla, and I don't mean fuel. Self-driving cars will have a cost of usage much higher than an owner-driven car.
Will some people own them? Yes, they will, but the percentage of people owning their own self-driving car will be about the same as the percentage of people who fly owning their own planes.
 
Airplanes also have a factor of >100 when it comes to killing people and damaging property. :rolleyes:
A Tesla is also the safest car around, so it would take a lot to actually die in a crash; whereas in a plane, if you crash, you're dead, and 100+ others as well.
I was comparing small planes to an autonomous car, not large airliners (maybe when Tesla makes giant buses, then yes, we can compare those). A 2+ ton Tesla into a crowd would do more damage than a similar-sized but lighter 2- or 4-seater Cessna crashing. Also, a plane coming down in a crash would give people a bit more warning, as most don't expect a plane to fly near people on streets, whereas a Tesla on a street is expected nearby, right up until it gets onto the sidewalk. Lastly, with self-driving cars and nobody in the car, the car's safety doesn't help the people or cars it hits. Airplanes don't travel only feet away from other airplanes and pedestrians; they have miles of exclusion zones around them for most of their travel, something you cannot enforce for autonomous cars.

And yes, even a single seater airplane is subject to strict FAA maintenance regulation.
 
I was comparing small planes to an autonomous car, not the large airliners.

And yes, even a single seater airplane is subject to strict FAA maintenance regulation.

Right, so why are normal taxis, buses, and even cars without Autopilot not regulated the same way, if the danger level is the same as a small plane? It can't be because the plane happens to have an autopilot. Some small planes might not even have one. And a plane's autopilot just holds the course at normal cruising altitude, etc.