Welcome to Tesla Motors Club

Autonomous Car Progress

AP doesn't drive you anywhere. It is a driver assist feature. Level 2. Period.
I've said it before and will say it again. It is better to avoid the question of who drives, because here common language and legal language differ. The word "drive" is ambiguous. Let's say it like this, which is uncontroversial:

Autopilot steers, brakes, and accelerates, while the human driver confirms or denies certain actions. The human takes over when he deems it necessary. It is an automated driving system requiring continuous human supervision.
 
Here is a useful quote that explains the current existing regulatory framework in the US regarding AVs:

The Existing Regulatory Framework in the United States

The United States does not currently have national regulations that apply specifically to AVs. Instead, AVs are subject to the same Federal Motor Vehicle Safety Standards (FMVSS) that all other new automobiles in the United States must meet (with only very limited exceptions). The FMVSS do not regulate everything on a car, but only certain vehicle components. Most of the mechanical components of vehicles that can create a risk to safety (for example brakes and tires) are subject to the FMVSS, but with few exceptions, a vehicle's electronics are not covered by the FMVSS. The FMVSS were all written with the natural assumption that a human being would be the driver of the vehicle. That assumption, of course, is incorrect with regard to self-driving vehicles, and as a consequence, it is difficult or even impossible to determine whether an AV complies with many of the FMVSS.

As written today, the FMVSS do not regulate the control unit for the vehicle - the decision-maker that decides whether and when the vehicle accelerates and brakes, turns, honks its horn or takes any other action that affects safety. That is because until the advent of AVs, the decision maker who controls the vehicle has been a human driver. Federal law does not regulate human drivers. Instead that has been left to the states. The states regulate drivers through driving tests as a condition of drivers' licenses, and through the enactment and enforcement of state and local traffic laws that govern the decisions that drivers make as they operate their vehicles. Although the states are allowed and expected to regulate human drivers who control vehicles, federal law preempts the states from regulating the design or manufacturing of the vehicle itself if the state regulation would conflict with the FMVSS.

Source: https://2uj256fs8px404p3p2l7nvkd-wp...021/05/Kevin-Vincent-Regulatory-Framework.pdf
 
MB's L3 system isn't legal, at least not in the US. Has MB's L3 system been deployed anywhere?
Level 3 systems like Drive Pilot are legal in Germany now and throughout the EU soon. Mercedes says it will be available on the S Class in "2nd half of 2021" and the EQS in "first half of 2022". We'll see.

Mercedes targets US release in 2023, but the US is a patchwork of 50 different regulatory regimes. Maybe they'll release it in some states and not others, maybe they'll hold off. Nobody knows yet, not even Mercedes.

I'm glad you like Autopilot, but please do not read a book, text, watch a movie or answer emails while moving. It's not only illegal, it's unsafe for you and innocent people around you.
 
People in this country have been driving for well over a century and the federal and state regulatory bodies have been working on regulations nearly all that time. I get that things have been dynamic due to ADAS and laws and policies are probably still coming together. However, I imagine there are a lot more federal hoops to jump through for any L3/4/5 vehicle and doubt it's just "all L3/4/5 vehicles proceed to Go".

I'd say the more automated equates to more regulated and scrutinized, not less.

Just a few months ago NHTSA dinged vision-only Teslas for lacking automated safety features like FCW and AEB. NHTSA just kicked off its latest safety probe of Tesla's L2 system. Perhaps there weren't enough crashes in CA for the DMV to join in and launch their own.


Likewise, the NHTSA will probe L3/4/5 systems when they need to.

No, they do not.

None of those bodies has any requirement for pre-approval of L3 features in the US.

There's nothing, federally, preventing anyone from putting L3 vehicles on the road today.

Hell, there are L4 ones on the road. Today. In the US.




In opposite directions.

In AZ you don't really have to do anything. In CA it's heavily regulated.

That was the point. There's nothing stopping L3 from being deployed today in a number of US states, legally.

It would require changes to STATE (not federal) law in most other states.

NHTSA is national, as in federal, and covers things nationwide.

Regulations about the yoke have nothing to do with L3. The FMVSS regulates the physical components of a car like steering wheels, airbags or brakes, it does not regulate levels of autonomy. AFAIK, there are no regulations barring anyone from releasing L3 in the US.
 
Okay. MB's L3 is coming soon. I'm not particularly looking forward to reading a book, watching a movie, ...etc... while driving.

Here's what AP has been doing for me for the past year...

Drive to the beach and back. Speed set >37 mph, but moving slower due to traffic.

[Image: MY_Beach_AP.jpg]


Drive on freeways at speed (>37 mph). I have a reverse commute so I'm not stuck in congestion.

[Image: AP_at_77mph.jpg]



 
I'm not particularly looking forward to reading a book, watching a movie, ...etc... while driving.
You wouldn't be driving. I think L3 features would be more popular than many people around here think. Unfortunately I also think they may not happen any time soon.

It seems possible that over time the performance of L3 systems will improve and the maximum speed will increase. Right now they're just estimating safety but eventually they will have proof.
 
However, I imagine there are a lot more federal hoops to jump through for any L3/4/5 vehicle and doubt it's just "all L3/4/5 vehicles proceed to Go".

You can imagine whatever you wish of course- but there are no actual hoops at all federally.

That's why I asked you to cite whatever law you think would in any way restrict a car maker rolling out L3+ vehicles. There isn't one.

Waymo is driving customers around in L4 cars today and the federal government has done zero regulating of it.


Certainly they COULD pass regulations in the FUTURE on this stuff. In fact a ton of people keep hoping they will so that car makers don't have to deal with 50 different sets of rules as they do today.

But so far? Nothing.
 
You wouldn't be driving. I think L3 features would be more popular than many people around here think. Unfortunately I also think they may not happen any time soon.

It seems possible that over time the performance of L3 systems will improve and the maximum speed will increase. Right now they're just estimating safety but eventually they will have proof.
You wouldn't be driving until the system tells you you're about to crash. IOW, the main difference is how long before the driver must "take over" and how the vehicle reaches a "safe condition" (pulling over, coming to a dead stop, etc.) if the prompts are ignored. If FSD messes up, one must be ready to take over immediately.

In an L3, you could theoretically be texting, reading, etc. because it will give you a (IIRC) ten-second window to take over. However, like any automated driving system, it cannot save you from being t-boned by a red-light runner, which would require instantaneous evasive action.
 
You wouldn't be driving until the system tells you you're about to crash. IOW, the main difference is how long before the driver must "take over" and how the vehicle reaches a "safe condition" (pulling over, coming to a dead stop, etc.) if the prompts are ignored. If FSD messes up, one must be ready to take over immediately.

In an L3, you could theoretically be texting, reading, etc. because it will give you a (IIRC) ten-second window to take over. However, like any automated driving system, it cannot save you from being t-boned by a red-light runner, which would require instantaneous evasive action.
The main difference is the car will tell you when to take over and you are not responsible for any collisions that occur while it's operating. Not having to pay attention is a huge difference IMO.
There is nothing in the L3 definition that precludes the car taking evasive action. However, all the systems that are going to be released "any day now" only work on controlled-access highways at low speeds, so I'm guessing their main collision avoidance mechanism is just slamming on the brakes.
 
The main difference is the car will tell you when to take over and you are not responsible for any collisions that occur while it's operating. Not having to pay attention is a huge difference IMO.
There is nothing in the L3 definition that precludes the car taking evasive action. However, all the systems that are going to be released "any day now" only work on controlled-access highways at low speeds, so I'm guessing their main collision avoidance mechanism is just slamming on the brakes.
So when the user doesn't respond and the car is creamed from behind because it went into "safety mode" in the middle of the road, who is liable?
 
FWIW one big problem with L3 is the whole "the human has X amount of time to take over"

There's nothing in the SAE docs, or any law I'm aware of, that actually defines X specifically.

So you end up with one guy sure it's supposedly 10 seconds, another guy sure it's 30 seconds, a third person sure it's supposed to be 1 minute.... and all of them being right in that ANY of those would be L3 under the vague definition SAE gives you.
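For what it's worth, that ambiguity can be made concrete with a toy state machine. This is a hypothetical sketch in Python, not any manufacturer's logic: the class, the 10-second grace period, and the `inside_odd`/`driver_hands_on` signals are all invented for illustration of how "request to intervene, then fall back" could be structured.

```python
from enum import Enum, auto

class Mode(Enum):
    AUTOMATED = auto()         # system performs the entire driving task
    TAKEOVER_REQUEST = auto()  # system has asked the human to resume
    MANUAL = auto()            # human has taken over
    MINIMAL_RISK = auto()      # fallback maneuver (e.g. slow/stop in lane)

class L3Controller:
    # The 10-second grace period is an ASSUMPTION for illustration only;
    # as noted above, neither SAE J3016 nor current law pins the number down.
    TAKEOVER_GRACE_S = 10.0

    def __init__(self):
        self.mode = Mode.AUTOMATED
        self.request_age = 0.0

    def tick(self, dt, inside_odd, driver_hands_on):
        """Advance the controller by dt seconds of simulated time."""
        if self.mode == Mode.AUTOMATED and not inside_odd:
            # Leaving the ODD triggers a takeover request, not instant handback.
            self.mode = Mode.TAKEOVER_REQUEST
            self.request_age = 0.0
        elif self.mode == Mode.TAKEOVER_REQUEST:
            if driver_hands_on:
                self.mode = Mode.MANUAL
            else:
                self.request_age += dt
                if self.request_age >= self.TAKEOVER_GRACE_S:
                    # Driver never responded: perform the minimal risk maneuver.
                    self.mode = Mode.MINIMAL_RISK
        return self.mode
```

Whatever value gets plugged into `TAKEOVER_GRACE_S` (10 seconds, 30 seconds, a minute), the structure still fits the letter of the L3 definition, which is exactly the ambiguity being complained about.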
 
FWIW one big problem with L3 is the whole "the human has X amount of time to take over"

There's nothing in the SAE docs, or any law I'm aware of, that actually defines X specifically.

So you end up with one guy sure it's supposedly 10 seconds, another guy sure it's 30 seconds, a third person sure it's supposed to be 1 minute.... and all of them being right in that ANY of those would be L3 under the vague definition SAE gives you.
Maybe no one has imagined a realistic scenario yet where it would matter? Approximately zero dangerous situations last more than a few seconds on divided highways so the system will have to deal with them itself.
Obviously stopping in the middle of the highway waiting for the driver to take over isn't perfectly safe but how often will that happen?
It will be interesting to see if it is actually a problem. I'm skeptical.
 
Maybe no one has imagined a realistic scenario yet where it would matter? Approximately zero dangerous situations last more than a few seconds on divided highways so the system will have to deal with them itself.
Obviously stopping in the middle of the highway waiting for the driver to take over isn't perfectly safe but how often will that happen?
It will be interesting to see if it is actually a problem. I'm skeptical.


It's self-evidently a problem though.

The point of L3 is you do not have to actively pay attention

You only need to be available to take over in some "reasonable" amount of time when the car requests it.

That's why you can be reading a book, or watching a movie, or whatever- instead of watching the road and conditions.

It's enough of a problem there's a number of academic papers out there trying to figure out the "right" way to do this.

Because asking a person who has NO awareness of WTF is going on around them to INSTANTLY drive is... not a great plan.

That said- there's nothing in L3 definitions requiring it to be a "dangerous" situation. Just one outside the car's ODD.

If its ODD is highway driving, for example, it KNOWS when it's going to be exiting the highway thanks to your nav route- so it can warn you a full minute in advance with no problem.


That situation isn't really an issue. But there's plenty where it might be.

This traffic jam thing has an ODD of 37 mph, right... what happens when it's doing 35 and suddenly the road is totally clear ahead... does it keep puttering at 35 mph on the highway until you look up from your book?
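To make that question concrete: engagement boils down to an ODD predicate. Here's a hypothetical sketch; the 37 mph cap loosely mirrors the 60 km/h traffic-jam limit in UN Regulation No. 157 (ALKS), and the function and condition names are invented for illustration, not Mercedes' actual logic.

```python
def inside_traffic_jam_odd(speed_mph, lead_vehicle_present, on_divided_highway,
                           speed_cap_mph=37.0):
    """Illustrative ODD check for a traffic-jam pilot.

    All three conditions must hold for the system to engage or stay
    engaged; when any one fails, the car must hand control back to
    the driver rather than silently continue."""
    return (on_divided_highway
            and lead_vehicle_present
            and speed_mph <= speed_cap_mph)
```

Under a predicate like this, traffic clearing (the lead vehicle pulling away, ambient speeds rising past the cap) is an ODD exit like any other: the car issues a takeover request instead of puttering along at 35 mph indefinitely.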
 
I'm glad you like Autopilot, but please do not read a book, text, watch a movie or answer emails while moving. It's not only illegal, it's unsafe for you and innocent people around you.
not to mention, the wipers on Teslas can't deal with all the blood. it's not only impolite to kill, it gets the windshield all dirty and stuff.

won't someone think of the wipers??
 
It's self-evidently a problem though.

The point of L3 is you do not have to actively pay attention

You only need to be available to take over in some "reasonable" amount of time when the car requests it.

That's why you can be reading a book, or watching a movie, or whatever- instead of watching the road and conditions.

It's enough of a problem there's a number of academic papers out there trying to figure out the "right" way to do this.

Because asking a person who has NO awareness of WTF is going on around them to INSTANTLY drive is... not a great plan.

That said- there's nothing in L3 definitions requiring it to be a "dangerous" situation. Just one outside the car's ODD.

If its ODD is highway driving, for example, it KNOWS when it's going to be exiting the highway thanks to your nav route- so it can warn you a full minute in advance with no problem.
It's self-evident that it could be a problem. :p I'm still trying to figure out if it's actually a problem.
I agree that human driving ability may be diminished for a time after handoff. It just seems like it will be a very rare event for the system to handoff while moving into a challenging situation. Are the increased odds of a collision really significant?

I get the impression that some people think the system is going to hand off into some sort of Mad Max scenario or 3 seconds before you hit a brick wall. That's not going to happen.
 
My bias is that level 3 autonomy will not be realized until the 'automation' is not limited to the car itself. AI and visualization analytics need to be part of road and highway systems. This will take some time, since this is mostly public infrastructure and the cost/benefit ratio will not tip in favor of such improvements until public demand increases considerably, or obviously when the rewards outweigh the costs. Some measure of bootstrapping is necessary for autonomous vehicles to progress.

Peer-to-peer situational awareness, with some form of ad hoc self-organization and control, will go a long way toward increasing the benefit-to-cost ratio. However, much like smart highways, there are integration and interoperability issues that need to be resolved. The benefit is that these can be accomplished without large-scale investment in smart road and highway infrastructure.

Many of the problems in achieving autonomous driving occur when drivers break protocol and do not strictly follow the rules of the road, have poor situational awareness (e.g. are inattentive), perform poorly, and at times experience mechanical failure. Without augmentation, an autonomous vehicle will need to learn flawless defensive driving skills that include pedestrian behavior, operating under blind conditions (which are often temporary), rush hour and congestion driving behaviors, and many others. In a nutshell, an autonomous vehicle must learn how to drive under the worst possible conditions, with the worst of human behavior and decision-making, with the same number of unknowns, and then make the right choice. Otherwise it will just add to the problem.

We are a long way away; this will require a mountain of data, and we're not there yet. Is it any wonder Tesla doesn't just push FSD out to everyone and reap the benefits of all this data, much of which will at some point be necessary? First, there is the increased liability (why do you think it's a priced option), and second, although beneficial, at this point it's more data than Tesla needs to solve the problems they are currently trying to address.

Finally, until the problems associated with congestion and traffic flow are addressed, many of these problems will remain. This can take some practical form: centralized, decentralized, or a bit of both. But having to teach a vehicle to drive like the biggest idiots on the road, predicting the outcomes of many unknowns, and then 'training' it to act defensively and respond flawlessly will be difficult if not impossible any time soon.
 
I’m blown away. Listen to what the rider support person says about the # of Waymos in Phoenix. Maybe it’s wrong information, but wow wow wow. I doubt it’s wrong information based on how the rider support guy responded. He seemed to deflect the low # by saying they’re ”always expanding”:

 
This reminds me of the squat and stop scam a long time ago. I'm sure there are variations of the same running now.

The driver that rear ends a vehicle is almost always held liable.

Here's a situation related to the above:

A freeway has a max speed limit of 65 mph. An executive in his DrivePilot-equipped MB is commuting to work, but wants to take care of some emails and puts his car into L3 mode with a max speed of 37 mph. His car driving ~37 mph while others are going significantly faster (65+) will cause dangerous backups and jostling as drivers try to change lanes and get ahead.

Even pulling over on the side of the road and putting your hazards on (to take a phone call) can cause crashes from rubbernecking.

I have limited experience with Waymo Pacificas and I'm not looking forward to more L3/4/5 vehicles on the road.
