
What SAE Level 3-5 system will be released to consumers in 2020?

Total voters: 36
Car companies have to pay out all the time for defects in their software/hardware! Self-driving hardware/software is no different.

I strongly disagree that it's 'no different'.

again, you interview your employees and you evaluate them. if you think they're a risk (at time of hire or over time) you don't keep them as employees.

the car companies do not have that kind of control. insurance companies do, though, but car companies, as sellers of PRODUCT, do not.

until they work this out, I don't see how a viable L4 car can be sold in the US, at least.
 
Tesla has to solve every single driving case perfectly for the ODD that they want to operate in before they can reach that step of deploying robotaxis or letting people sleep in their cars. Now, Tesla could do what Waymo is doing and deploy a limited number of Model 3s as robotaxis in a tightly geofenced area to greatly reduce the number of driving cases that the system has to handle, to make it easier.
Other companies don't limit their cars to certain areas because these areas are simple to handle (otherwise Cruise would certainly not have selected the city of San Francisco as their primary testbed). They do it to reduce the cost of data acquisition during the development phase. I don't think it would be a problem for companies like Google to do large-scale HD mapping of most of the country when they feel they are ready. They've done it before.
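To illustrate why the geofence itself is the easy part (validating the driving inside it is the hard part), here is a minimal sketch of the kind of gate a robotaxi could run before engaging autonomy. The polygon, the coordinates, and the extra conditions in may_engage are all invented for this example:

```python
# Illustrative only: the polygon below is made up, not any real service area.
SERVICE_AREA = [  # (lon, lat) vertices of a hypothetical approved rectangle
    (-111.94, 33.30), (-111.84, 33.30), (-111.84, 33.38), (-111.94, 33.38),
]

def point_in_polygon(x, y, poly):
    """Classic ray-casting test: True if point (x, y) lies inside poly."""
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        crosses = (yi > y) != (yj > y)  # edge straddles the horizontal ray
        if crosses and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def may_engage(lon, lat, weather_ok, hd_map_current):
    # The geofence is just one ODD gate among several.
    return point_in_polygon(lon, lat, SERVICE_AREA) and weather_ok and hd_map_current

print(may_engage(-111.90, 33.34, weather_ok=True, hd_map_current=True))  # True
print(may_engage(-111.70, 33.34, weather_ok=True, hd_map_current=True))  # False
```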
 
I strongly disagree that it's 'no different'.

again, you interview your employees and you evaluate them. if you think they're a risk (at time of hire or over time) you don't keep them as employees.

the car companies do not have that kind of control. insurance companies do, though, but car companies, as sellers of PRODUCT, do not.

until they work this out, I don't see how a viable L4 car can be sold in the US, at least.
An autonomous car company has 100% control over the driver of the car because they ARE the driver of the car. If they don't like the way the car is driving they can change the software. All companies have liability for the products they sell. There is only one logical way that it has been and will be "worked out". In California the manufacturer of the autonomous vehicle is liable while it is in autonomous mode.
§228.04. Financial Requirements for a Permit to Deploy Autonomous Vehicles on Public Roads.

(a) A manufacturer of autonomous vehicles, both those that require a driver inside the vehicle and those that do not require a driver inside the vehicle, may satisfy the requirements of Vehicle Code section 38750 (c)(3) by presenting evidence of one of the following:

(1) The manufacturer has in place and has provided the department with evidence of the manufacturer’s ability to respond to a judgment or judgments for damages for personal injury, death, or property damage arising from collisions or accidents caused by the autonomous vehicles produced by the manufacturer in the form of an instrument of insurance, a surety bond, or proof of self-insurance.

(2) A surety bond, that meets the requirements of Section 227.10 of Article 3.7, and is conditioned that the surety shall be liable if the manufacturer, as principal, fails to pay any final judgment for damages for personal injury, death or property damage arising from a collision involving an autonomous vehicle deployed by the manufacturer pursuant to Vehicle Code section 38750(c), and shall be submitted to the department with the Autonomous Vehicles Manufacturer Deployment Program Surety Bond, form OL 317A (New 6/2014), which is hereby incorporated by reference.

(3) An insurance that meets the requirements of Section 227.08 of Article 3.7.

(4) A proof of self-insurance shall meet the requirements of, and be governed by, Section 227.12 of Article 3.7 and shall be submitted to the department on an Autonomous Vehicle Manufacturer’s Deployment Program Application for Certificate of Self-Insurance, form OL 319A (New 2/2017), which is hereby incorporated by reference.
https://www.dmv.ca.gov/portal/wcm/c...essAV_Adopted_Regulatory_Text.pdf?MOD=AJPERES
 
Updates to California DMV’s Proposed Autonomous Vehicle Deployment Regulations May Empower Innovation by Limiting Manufacturer Liability

it's an older article, but it's not clear to me that manufacturers will take full responsibility. my gut feeling is that this will be very limited.

"The proposed California regulations would provide a wider shield for manufacturers than other states and may signal a trend toward greater protections for manufacturers in the autonomous vehicle space."

it would be nice to hear from legal experts on this matter.
 
Updates to California DMV’s Proposed Autonomous Vehicle Deployment Regulations May Empower Innovation by Limiting Manufacturer Liability

it's an older article, but it's not clear to me that manufacturers will take full responsibility. my gut feeling is that this will be very limited.

"The proposed California regulations would provide a wider shield for manufacturers than other states and may signal a trend toward greater protections for manufacturers in the autonomous vehicle space."

it would be nice to hear from legal experts on this matter.
The protections for manufacturers they're talking about apply in cases where the autonomous system is modified or not properly maintained. Obviously the owner of the vehicle could be responsible if their car is driving around on bald tires. Companies will always try to weasel out of responsibility; that's why there are regulations.
If the car is maintained in accordance with manufacturer specifications, then who else could be responsible other than the manufacturer?
 
Another issue to consider is who is at fault in an accident. Autonomous cars will get into accidents, especially if they have to share public roads with human drivers. But if the autonomous car is not deemed at fault in an accident, then I don't think the auto maker would be liable. For example, there is a big difference between an autonomous car hitting a stopped vehicle and an autonomous car that tried to take evasive action from a drunk driver but could not avoid the collision. In the first case, the auto maker would be liable since the accident was a direct result of a flaw in the autonomous technology. In the second case, the autonomous car saw the drunk driver and tried to take evasive action but the collision was unavoidable. The autonomous car cannot be held responsible in a scenario like that.

So I think as long as auto makers do their homework to make sure their systems are safe, then they can greatly reduce their liability. For example, if an auto maker can show that their autonomous car has redundant hardware, excellent perception that sees everything, and a driving policy that avoids pedestrians, cyclists, and other cars, obeys traffic laws, respects traffic lights, stop signs, etc., and if they do something like a million miles without a disengagement, then that will go a long way toward reducing liability. But on the other hand, if an auto maker deployed an autonomous car knowing that the car won't brake for stopped vehicles, for example, that would be a huge liability.
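As a statistical footnote on that "million miles" figure: zero observed disengagements only bounds the failure rate, it doesn't prove perfection. A quick rule-of-three sketch, with illustrative numbers:

```python
# Rule of three: zero events in n independent trials gives an approximate
# 95% upper confidence bound of 3/n on the true event rate.
miles_without_disengagement = 1_000_000  # illustrative, not anyone's real data

upper_bound_per_mile = 3 / miles_without_disengagement
print(f"~95% upper bound: {upper_bound_per_mile:.1e} disengagements per mile")
print(f"i.e., at worst roughly one per {1 / upper_bound_per_mile:,.0f} miles")
```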

Other companies don't limit their cars to certain areas because these areas are simple to handle (otherwise Cruise would certainly not have selected the city of San Francisco as their primary testbed). They do it to reduce the cost of data acquisition during the development phase. I don't think it would be a problem for companies like Google to do large-scale HD mapping of most of the country when they feel they are ready. They've done it before.

Reducing cost could certainly be a factor. But I still maintain that a smaller geofenced area where the driving conditions are more controlled makes developing safe autonomous driving a little easier. I'm thinking one reason why Waymo can remove safety drivers is that they've been able to test and optimize their autonomous driving for the area where they operate. If you suddenly put a million Waymo robotaxis on roads all across the US, they'd probably be less safe.
 
This is another reason why, if Tesla wanted to deploy autonomous driving to the fleet as soon as possible, they would be better off making AP an L3 or maybe L4 system restricted to very narrow conditions. The narrower the conditions that the car has to operate in, the fewer the driving cases that the autonomous driving system has to solve, and therefore the easier it is to get to a safe system. By aiming for L5 autonomy, Tesla has to solve every single driving case, which is a much, much bigger task. So, chances are that Tesla will require driver supervision for a very long time.

That approach makes sense, but I do wonder whether there is risk during the transition from autonomous driving to manual driving in an L3 setting?
 
That approach makes sense, but I do wonder whether there is risk during the transition from autonomous driving to manual driving in an L3 setting?

To be L3, the autonomous driving system is required to give the driver sufficient advance notice when they need to take over. A driver-facing camera that can monitor the driver can probably help to make sure the driver is alert and able to take over. But yes, there is probably some risk. Another reason why I think L4 autonomy is better than L3 in this regard. If you can cut out this risk by eliminating the need for the driver to take over in the first place, so much the better.
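For illustration, the notice-and-takeover logic could look something like the sketch below. The 10-second window and the minimal-risk fallback are assumptions for the example, not any shipping system's specification:

```python
TAKEOVER_NOTICE_S = 10.0  # assumed advance-notice window

class L3Handoff:
    def __init__(self):
        self.request_time = None  # set when a takeover is requested

    def request_takeover(self, now):
        """Called when the system predicts it will soon leave its ODD."""
        self.request_time = now

    def tick(self, now, driver_took_over):
        """Advance the state machine; returns the current driving mode."""
        if self.request_time is None:
            return "autonomous"
        if driver_took_over:
            self.request_time = None
            return "manual"
        if now - self.request_time > TAKEOVER_NOTICE_S:
            # Driver never responded within the window: fall back to a
            # minimal-risk maneuver (e.g., slow to a stop in lane).
            return "minimal_risk_stop"
        return "counting_down"

h = L3Handoff()
h.request_takeover(now=0.0)
print(h.tick(now=4.0, driver_took_over=False))   # counting_down
print(h.tick(now=12.0, driver_took_over=False))  # minimal_risk_stop
```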
 
The protections for manufacturers they're talking about apply in cases where the autonomous system is modified or not properly maintained. Obviously the owner of the vehicle could be responsible if their car is driving around on bald tires. Companies will always try to weasel out of responsibility; that's why there are regulations.
If the car is maintained in accordance with manufacturer specifications, then who else could be responsible other than the manufacturer?

Here is an interesting scenario to think about with regard to manufacturer liability. Suppose a bad bug gets into production cars and causes frequent crashes. If the fix isn't trivial and takes a few weeks to be ready, and in the meantime the cars are not deemed roadworthy (assuming L4 or L5 operation), would the manufacturer have to compensate all owners for losses associated with the cars being down for a few weeks? Does the manufacturer have the right to ground every car to prevent accidental usage?

If this happens to a car model with several hundred thousand cars on the road, then it could very well bankrupt the company.
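On the mechanics (not the legality) of grounding: a purely hypothetical sketch of how a manufacturer could disable autonomy fleet-wide via a server-side flag. The endpoint and field names are invented for illustration:

```python
import json
import urllib.request

FLEET_STATUS_URL = "https://example.com/fleet/status"  # hypothetical endpoint

def autonomy_cleared(build_id):
    """Return True only if this software build is still cleared for autonomy."""
    try:
        with urllib.request.urlopen(FLEET_STATUS_URL, timeout=5) as resp:
            status = json.load(resp)
    except (OSError, ValueError):
        # Fail safe: if the car can't confirm its build is cleared,
        # it allows manual driving only.
        return False
    return build_id not in status.get("grounded_builds", [])

# e.g.: if autonomy_cleared("2020.4.1") then engage, else require manual driving
```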
 
just knowing how all US laws are tailored by and for big business, there is nearly zero chance that big business (car makers) will take on liability for self-driving cars. my bet is that they will assign that risk to you as a condition of you owning the car.

there will be lots of words, like all contracts, and most of us won't really understand what is really going on. but you can bet that before corporate america officially supports L4 on normal streets, it won't be the car makers who take on responsibility. I just don't see that happening unless a major political change happens. and I mean real major (i.e., not going to happen).
 
Wouldn't that mean Tesla can never fulfill Elon's promise of robotaxis or sleeping while the car is driving?

Over 90% in this poll believe that will not happen in 2020, but EM was far more optimistic in his assertions.

The best OCR doesn't work that great today, some four decades after the technology became common. That is a fairly structured and constrained problem with 26 characters and a few fonts. If you want me to believe the most unstructured problem humans do regularly will be solved soon (i.e., driving a car), I want you to question why OCR still has plenty of typos. Imagine each typo as an error of some sort... most will be minor, but some will threaten human life.

We are decades away from the dream of full autonomy. I want to believe, but that is the cold hard reality.
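To put rough numbers on the typo analogy, here is a back-of-envelope sketch. Every figure is an assumption chosen to make the point, not a measurement:

```python
decisions_per_mile = 100         # assumed perception/planning "characters" per mile
per_decision_accuracy = 0.9999   # 99.99%, far better than OCR on messy input
fleet_miles_per_day = 1_000_000  # assumed fleet-wide exposure

typos_per_day = fleet_miles_per_day * decisions_per_mile * (1 - per_decision_accuracy)
print(f"Expected 'typos' per day: {typos_per_day:,.0f}")  # 10,000

# Even if 99.9% of those typos are harmless, the remainder is not:
print(f"Potentially serious errors per day: {typos_per_day * 0.001:,.0f}")  # 10
```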
 
just knowing how all US laws are tailored by and for big business, there is nearly zero chance that big business (car makers) will take on liability for self-driving cars. my bet is that they will assign that risk to you as a condition of you owning the car.

there will be lots of words, like all contracts, and most of us won't really understand what is really going on. but you can bet that before corporate america officially supports L4 on normal streets, it won't be the car makers who take on responsibility. I just don't see that happening unless a major political change happens. and I mean real major (i.e., not going to happen).
That's ridiculous. Car companies are held liable for defects in their products all the time. Just read the regulations; this isn't a hypothetical: the manufacturer of the self-driving vehicle is responsible.
Here is an interesting scenario to think about with regard to manufacturer liability. Suppose a bad bug gets into production cars and causes frequent crashes. If the fix isn't trivial and takes a few weeks to be ready, and in the meantime the cars are not deemed roadworthy (assuming L4 or L5 operation), would the manufacturer have to compensate all owners for losses associated with the cars being down for a few weeks? Does the manufacturer have the right to ground every car to prevent accidental usage?

If this happens to a car model with several hundred thousand cars on the road, then it could very well bankrupt the company.
Here's another hypothetical. Let's say a manufacturer sells a vehicle that they promise will be Level 5 at an indeterminate future date. Will that manufacturer have to compensate owners if they don't deliver in a reasonable time frame? :p I think it will be decided by a lawsuit. I assume that the manufacturer of the autonomous vehicle will have remote access to enforce when and where it can be used in autonomous mode. Tesla has pushed software updates without the owner's consent.

I think that L4 and L5 vehicles will never be sold to consumers and will only operate as robotaxis. I'm skeptical that L3 vehicles will ever exist. The ability to hand off to a driver in 10 seconds (or whatever ends up being reasonable) doesn't make the problem much easier.
 
I think that L4 and L5 vehicles will never be sold to consumers and will only operate as robotaxis. I'm skeptical that L3 vehicles will ever exist. The ability to hand off to a driver in 10 seconds (or whatever ends up being reasonable) doesn't make the problem much easier.

I think not selling the car itself and only offering the car on a service contract (ride sharing) is a good way to minimize liability for the manufacturer.
 
Is this North American market only? Or International?

My international vote is for "Audi Traffic Jam Assist" with the Audi A8 in Germany. Last I heard they were trying to get regulatory approval for it in Germany. But, I haven't seen any status updates on it. It doesn't get much reporting in the States.

My vote for the North American market is nothing. I don't think anyone has the guts to release an L3 in the States.

Instead, I think 2020 will be the year of L2+ systems, especially hands-free systems like Super Cruise. And NoA will see some significant improvements.
 
I think that L4 and L5 vehicles will never be sold to consumers and will only operate as robotaxis. I'm skeptical that L3 vehicles will ever exist. The ability to hand off to a driver in 10 seconds (or whatever ends up being reasonable) doesn't make the problem much easier.

I agree that L5 systems won't be sold, at least not within 20 years.

L4 systems will likely grow in ODD (Operational Design Domain) as more areas are validated. L4 systems are really the only thing worth talking about in terms of FSD.

As to being sold to consumers, I think that depends on your definition.

Sold as something to own will likely be rare, as the systems need to evolve.

Sold as a service with monthly lease payments I see as a strong possibility. You drive it manually until you reach the approved area, and then it takes over. You have to take control back before it exits the approved area, and if you don't, it parks itself.
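A minimal sketch of that lease-model behavior, with the states and transitions assumed for illustration rather than taken from any announced product:

```python
def next_mode(mode, in_area, near_boundary, driver_took_over):
    if mode == "manual":
        # Autonomy engages only once inside the approved area.
        return "autonomous" if in_area else "manual"
    if mode == "autonomous":
        if driver_took_over:
            return "manual"
        if near_boundary:
            return "exit_warning"  # prompt the driver to take back control
        return "autonomous"
    if mode == "exit_warning":
        if driver_took_over:
            return "manual"
        # Driver ignored the warning: park rather than drive outside the ODD.
        return "self_park" if not in_area else "exit_warning"
    return mode  # "self_park" is terminal until the driver restarts manually

mode = "manual"
for step in [dict(in_area=True,  near_boundary=False, driver_took_over=False),
             dict(in_area=True,  near_boundary=True,  driver_took_over=False),
             dict(in_area=False, near_boundary=True,  driver_took_over=False)]:
    mode = next_mode(mode, **step)
    print(mode)  # autonomous, exit_warning, self_park
```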
 
Just my personal opinion, but I believe several companies have the technical capability of doing L4 highway in 2020. For example, I think Waymo or Cruise could deploy L4 highway in 2020 if they really wanted to. After all, they already have really good autonomous driving now, far better than what Tesla has, with far better perception and planning, and it does quite well even in complex urban areas. So I am pretty sure they could handle limited-access divided-highway driving. So, if companies don't deploy L4 highway in 2020, I think it will be more because of business and legal issues. It costs money, time, and work to add the hardware to the cars, and the companies would be liable in case of any accident. Companies might not see it as worth it yet.
 

I do think FSD is a perfect illustration of this tweet by Elon. He could have taken a more cautious approach and, in 2016, just promised a better driver-assist Autopilot with the AP2 hardware. Or if he still wanted to go for self-driving but remain more realistic, he could have ignored city self-driving completely and just said the goal is L3 highway with no timeline. But instead he promised nothing short of L5 autonomy right out of the gate and before Tesla even had a single FSD feature ready! WOW! Yep, he definitely bit off more than he could chew!

I imagine the conversation with the Tesla engineering team goes something like this:

Team: Good news Elon! We've made improvements to the NN. Our camera vision can now see traffic lights, stop signs and road markings. It's not ready for the public yet but it is getting better.

Elon: That's great. Let me go tweet that in 6 months, Teslas will self-drive from their house to their workplace.

6 months later....

Elon: So? Is FSD ready?

Team: Uh, no. FSD is really hard. We still have a lot left to do.

Elon: So when? 3 months maybe... 6 months definitely?
 
Just my personal opinion, but I believe several companies have the technical capability of doing L4 highway in 2020. For example, I think Waymo or Cruise could deploy L4 highway in 2020 if they really wanted to.

I have some friends that went to both W and C. we all know W is doing well. my friend tells me he's incredibly impressed with what he saw when he first started at C. I trust his judgement even though I have yet to get a tour of their site and their tech.

So, if companies don't deploy L4 highway in 2020, I think it will be more because of business and legal issues. It costs money, time, and work to add the hardware to the cars, and the companies would be liable in case of any accident. Companies might not see it as worth it yet.

that's my take, also. the tech issues are hard, but the *business* risk of it all will be enough to turn off every single CEO type. none of them will want to risk it all on people's lives like this.
 
Examples of legal liability issues: A deer pops out of nowhere and collides; who pays? Insurance may sue Tesla. Will a dog owner whose dog gets hit and loses a leg sue Tesla? If a Tesla hits a pothole and there is tire damage, will Tesla get sued? If someone mounts an adversarial attack that causes damage, Tesla may get sued. If a Tesla hits one of the many endangered species (a snake?), will Tesla get sued? If a Tesla brakes hard and causes whiplash...
Anecdotal story: I saw the aftermath of what appeared to be a small rear-end collision. No heavy damage to the cars. The driver of the car that was rear-ended appeared dead. I estimate his age to be 85 or greater, could be 95. I suspect his neck snapped. Summary: there are fragile people out there, and it doesn't take much for some to die.
 