A hint into Model Y's future

There is no way that FSD will be approved soon, or even in the next three years. The computer may eventually be able to handle 99% of all driving decisions, but it is that last 1% where human logic and decision-making are needed, because not every situation on the road is the same.

You heard it here first. There will not be FSD before 2021. Maybe not even then.
 
Is it too much to ask for a normal electric crossover?
Make the interior nice. No funny doors or door handles. Two screens. And proper interior storage.
Like it or not, Tesla has proven time and again that they aren't interested in making "normal" cars. It's all about innovating the design, manufacture, and sales process. If you want a "normal electric crossover", there should be one offered by nearly every other major auto manufacturer within the next 4 years.
 
No, there doesn't need to be a new agency. We have too many agencies. We already have a Department of Transportation. They just need to step up and do their job.

Oh, I guess Europe will be happy that the US Department of Transportation will monitor AVs for them...

I didn't say we need a new one founded, but each country will need some agency doing that. Whether they create a new one or let an existing one handle it isn't important.

People get covered by insurance, not OEMs. EAP gets used by people, who are insured. My insurance company is not insuring Tesla's EAP at all; they insure me in case MY EAP malfunctions.

EAP is far from FSD, basically advanced cruise control.

Once you have FSD, the OEM will have to pay for damages. If a subway crashes because of some sensor fault and you break your leg, you will be compensated by the company running the subway and they will get money from the company that built the subway.
 
These sorts of issues are overstated. Unless Musk invents a time machine (announced Q4218?), the car doesn't know the dog's future actions. In all of these scenarios, what the car will do is panic-brake while staying on the roadway. That choice will be good enough to avoid excessive liability, which is what manufacturers want.
Absolutely NOT.

These issues are NOT overstated. They have been hugely understated, especially for those involved in the situations (humans and dogs included). Ask the court systems.

You can't wait to entertain these scenarios until the time comes. We should entertain these situations ASAP and have discussions.

The whole problem with FSD is that there have been so few discussions with those in power.
 
EAP is far from FSD, basically advanced cruise control.

Once you have FSD, the OEM will have to pay for damages. If a subway crashes because of some sensor fault and you break your leg, you will be compensated by the company running the subway and they will get money from the company that built the subway.
OK... let's look at what you said.

If a subway crashes because of some sensor fault, you will be compensated by the company running the subway. Whether the company running the subway gets money from the company that built the subway is immaterial and irrelevant to the person injured. That may or may not happen, and may take years and years.
It's the same thing that happens with EAP right now. EAP liability and FSD liability will be the same, IMHO.

If someone's FSD crashes into you because of some sensor fault, you will be compensated by the FSD owner (their insurance). Find me an insurance company that will cover FSD and then we can talk.
Tesla and the automotive community (including Google FSD and others) need to engage regulators in each country's respective transportation leadership teams NOW.
 
No, it really won't. Look, if you hit someone with someone else's car and neither you nor the owner has insurance on the car, who pays?

Of course, insurers sometimes offer multiple-driver policies, but it's still the driver's responsibility. All traffic violations also fall back on the driver, not the owner.

So if you use a Level 3 and up car, the manufacturer takes limited to full responsibility for the car. If an accident happens, the OEM caused that accident, not you. So even if you didn't buy car insurance, you would not have to pay a dime.

So you will never have to worry about insuring a Level 4-5 car that you can't drive yourself. Maybe for things like hail or flooding, but not for any damage done in operation.

That's also part of the definition of Level 3 and up: the car takes legal responsibility from the driver. Even if a car could drive itself basically anywhere but wouldn't take any legal responsibility for its actions, it would still be Level 2.

Or just think about your FSD car picking you up empty. If it had an accident on the way, would that be your fault?
 
Give me an insurance company or an OEM that actually states the OEM takes that responsibility!

That's a "possibility"... but it needs to be LAW somewhere. Where is that law?

You are making statements that aren't fact yet.

Let's state facts here, not presuppositions.
 
As an example of what a Full Self-Driving vehicle might have to deal with:

I do high-speed driving events on open highways in Nevada (the Silver State Challenge). We are told that there are sometimes wild animals, of significant size, crossing those highways.
We are told to maintain our speed and trajectory if we see something crossing. Taking avoidance maneuvers at speeds over 150 mph will most likely put us into the desert. We are instructed to maintain our course and allow the animal the chance to save its own life. Chances are that if we swerve, it may decide to change course as well.

Autopilots will have lots of algorithms designed to make them as safe as possible, but they will never be without issues.

Even current railroad trains run off their steel tracks or collide with animals or cars at crossings. Expecting AP vehicles to be perfect is unrealistic and unreasonable.

The first goal is to be better, or far better, than a human driver.
 
How many dogs or humans have leapt in front of the average driver?
I don’t consider myself an unusual driver and the answer for me is “tons!” Well, deer and pedestrians anyway. And geese, come to think of it. The wildlife varies with locale I’m sure, but pedestrians that wander out in traffic exist in most locales, though living near a college campus I’m sure I see more than many.
 
Fair enough, but the original context was about moral dilemmas associated with dogs or people jumping in front of the car and the AI having no choice but to decide between the dog/person and the tree. I'd be surprised if you or anyone else has deer or pedestrians putting you in such ethical pickles on a regular basis.
 
Even if FSD were to be permitted, I can't imagine that a vehicle would not need a steering device for some modes of use, simply maneuvering in one's driveway for instance. But there are alternatives to a steering wheel. If the system is electronic and stiffens up or varies the ratio with speed, then an aircraft-type cross between a handlebar and a wheel might work well.
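Just to make the "varies the ratio with speed" idea concrete, here is a toy Python sketch. Every breakpoint and number in it is a made-up assumption of mine for illustration, not anything from an actual steer-by-wire system:

# Illustrative sketch only: a steer-by-wire ratio that varies with vehicle speed.
# All breakpoints and ratios below are assumed values, not real calibration data.

def steering_ratio(speed_kph: float) -> float:
    """Return handwheel degrees per road-wheel degree at a given speed.

    Low speed  -> quick ratio (small number), so parking needs little hand motion.
    High speed -> slow ratio (large number), so the same input steers the wheels less.
    """
    low_speed, high_speed = 10.0, 120.0    # kph breakpoints (assumed)
    quick_ratio, slow_ratio = 8.0, 18.0    # ratios at those breakpoints (assumed)

    if speed_kph <= low_speed:
        return quick_ratio
    if speed_kph >= high_speed:
        return slow_ratio
    # Linear blend between the two breakpoints.
    t = (speed_kph - low_speed) / (high_speed - low_speed)
    return quick_ratio + t * (slow_ratio - quick_ratio)

def road_wheel_angle(handwheel_deg: float, speed_kph: float) -> float:
    """Map a yoke/handlebar input to a road-wheel angle at the current speed."""
    return handwheel_deg / steering_ratio(speed_kph)

# Example: the same 90-degree twist at parking speed vs. highway speed.
print(road_wheel_angle(90, 5))    # ~11.25 degrees of road-wheel angle
print(road_wheel_angle(90, 130))  # ~5.0 degrees

With a mapping like that, a small handlebar-style controller could still give full lock in a driveway without being twitchy at highway speed.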
 
Same here in Minnesota: we are taught never to change course to avoid a wild animal, since it increases risks to drivers and others on the road.
 
Weird, I don’t see the trolley problem context at all, even looking back over the thread. I wonder if it was someone I’ve blocked (there are a few though not many).

Anyway, sorry for the distraction.
 

For instance: a dog runs out in front of a car. The car has a choice between hitting the dog and hitting a tree. Who is going to program in that moral decision?
A human runs out into the street. The car has a choice of hitting the human or hitting a dog. Who programs that morality into the car?

No worries, my friend. I’ve quoted the context in case you’re interested.