Driving is not just the act of steering, pressing the accelerator or pressing the brakes. Driving is the entire spectrum of watching the road for possible risks, planning actions and then applying steering/accelerator/braking appropriately.
Yeah, the car is doing that.

With L2, it can do some steering/accelerator/braking tasks, but it cannot do the entire spectrum of driving, from watching for all risks to planning all actions.
You won't find that in any spec. L2 can do all. The difference between L2 and L4 is the human is in charge. It is not based on capabilities. If you want to talk about what the spec says, then the difference is only one of design intent.
 
Yeah, the car is doing that.

No, it is not. Heck, we've seen AP accidents where AP hit a fire truck because it could not respond to that event.

You won't find that in any spec. L2 can do all. The difference between L2 and L4 is the human is in charge. It is not based on capabilities. If you want to talk about what the spec says, then the difference is only one of design intent.

No, L2 cannot do all. You are mistaken. SAE L2 specifically says that L2 does not do the entire OEDR, i.e. it cannot respond to all objects/events:

The sustained and ODD-specific execution by a driving automation system of both the lateral and longitudinal vehicle motion control subtasks of the DDT with the expectation that the driver completes the OEDR subtask and supervises the driving automation system.
 
Oh, by that logic, then humans definitely aren't driving since they have way more accidents.

You are being ridiculous. AP was not driving because it was not doing the entire driving task. It was not designed to avoid the fire truck. Humans are clearly driving because they are doing the entire driving task. The accident rate has nothing to do with who or what is driving.
 
You are being ridiculous. AP was not driving because it was not designed to avoid the fire truck. Humans are clearly driving because they are doing the entire driving task. The accident rate has nothing to do with who or what is driving.
Are you upset because you are wrong? And please don't change the subject. We are talking about current FSD, not ancient AP.

> The accident rate has nothing to do with who or what is driving.
You said FSD isn't driving because it has accidents. So if that is true then people aren't driving because they have accidents. You are the one that made the connection. I'm just pointing out a flaw in your logic.
 
Are you upset because you are wrong?

I am not wrong. I quoted the SAE L2 definition.

And please don't change the subject. We are talking about current FSD, not ancient AP.

So you are saying that FSD beta can do the entire OEDR?

> The accident rate has nothing to do with who or what is driving.
You said FSD isn't driving because it has accidents. So if that is true then people aren't driving because they have accidents. You are the one that made the connection. I'm just pointing out a flaw in your logic.

Stop gaslighting. I did not say that FSD isn't driving because it has accidents. I said it is not driving because it cannot do the entire OEDR.

In my example of hitting the fire truck, it hit the fire truck because it was not designed to respond to that object/event. It is L2 because it was not designed to respond to that situation. Hitting the fire truck was a by-product of not being able to respond to that situation. So it's not the accident that makes it L2. The accident is a by-product of it being L2.
 
So, are you all ready to drive blindfolded on the highway next year? In the city?

I'm not sure how to explain how hard it is to get from "it can drive" to "it's so safe we don't have to watch it". But I feel the FSD bulls completely ignore this fact.

The city and the highway environments are both very hard. City is hard because of the VRUs (vulnerable road users), the complexity and the number of actors. Highway is hard because of the speed.

Let's get real. The FSDb is at 10 miles per disengagement (not counting accel taps and speed adjustments), and 80-90 miles per critical disengagement. And Elon is hinting it's solved again. That guy is moving the goal posts or lying about FSD all the time... Getting to L3 requires at least 50k miles between failures. Explain to me how that happens. Today: 80, in 12 months 50k. Give it a shot.

Read my lips: It's not going to happen in 2024 either. Will it be a solid L2? Perhaps even hands off? Hopefully.
 
I think Elon is just capitalizing on the ambiguity; he doesn't believe robotaxis are happening this year. "Solved FSD" means FSD rolled out -- or specifically Autosteer on City Streets and I guess the single-stack Autopilot -- to everyone who purchases it, but it will still be a Level 2 ADAS requiring a driver ready to take over at any moment.

"FSD Beta" in a wider release was always going to be a Level 2 ADAS, this was leaked to us years ago.
 
Yeah, the car is doing that.


Not only is the car NOT doing that, Tesla explicitly makes clear the car is not doing that, both in their own description of FSD as sold and, more verbosely, in the CA DMV docs, where they call out FSDb as being incapable of doing that due to an incomplete OEDR system, and state that Tesla has no intention of changing that to a complete OEDR.

And a complete OEDR is a required element of any system that does the complete driving task.
 
I think Elon is just capitalizing on the ambiguity; he doesn't believe robotaxis are happening this year. "Solved FSD" means FSD rolled out -- or specifically Autosteer on City Streets and I guess the single-stack Autopilot -- to everyone who purchases it, but it will still be a Level 2 ADAS requiring a driver ready to take over at any moment.

"FSD Beta" in a wider release was always going to be a Level 2 ADAS, this was leaked to us years ago.
I personally don't care if it is fully FSD this year as long as they continuously make improvements and I get free updates.

Tesla is the only one of my three cars where I get free updates and improvements of any kind.

Someone will always be disappointed because everyone has different driving styles and it's impossible to please everyone.
 
So, are you all ready to drive blindfolded on the highway next year? In the city?

I'm not sure how to explain how hard it is to get from "it can drive" to "it's so safe we don't have to watch it". But I feel the FSD bulls completely ignore this fact.

The city and the highway environments are both very hard. City is hard because of the VRUs (vulnerable road users), the complexity and the number of actors. Highway is hard because of the speed.

Let's get real. The FSDb is at 10 miles per disengagement (not counting accel taps and speed adjustments), and 80-90 miles per critical disengagement. And Elon is hinting it's solved again. That guy is moving the goal posts or lying about FSD all the time... Getting to L3 requires at least 50k miles between failures. Explain to me how that happens. Today: 80, in 12 months 50k. Give it a shot.

Read my lips: It's not going to happen in 2024 either. Will it be a solid L2? Perhaps even hands off? Hopefully.

I definitely agree with you that there is a huge gap between "can drive" and "can remove supervision". And I agree that Tesla fans and Elon do seem to ignore how big the gap is. But I am not sure where you are getting the 50k miles per intervention. I think it might be worse than you think; it's way more than 50k miles per intervention. Tesla told the CA DMV that the intervention rate needs to be on the order of 1-2M miles per intervention before FSD would move up to L3 or higher. In the March 9, 2021 memo, Miguel Acosta reported that CJ Moore from Tesla said this about Tesla's intervention rate for reaching L3+:

"DMV asked CJ to address, from an engineering perspective, Elon’s messaging about L5 capability by the end of the year. Elon’s tweet does not match engineering reality per CJ. Tesla is at Level 2 currently. The ratio of driver interaction would need to be in the magnitude of 1 or 2 million miles per driver interaction to move into higher levels of automation."

Just to highlight how high a bar "eyes off" is: Mobileye says that the Mean Time Between Failures (MTBF) needs to be about 10M hours of driving for highway driving. That number comes from the fact that the human MTBF for fatalities on US highways is about 3.75M hours of driving per failure. A MTBF of 10M hours of driving would be 2.67x better than humans. So an "eyes off" system on the highway would need to go 10M hours of driving per fatal accident! For non-fatal failures, the human MTBF is about 25k hours of driving, so the bar is far lower; you could tolerate more non-fatal failures.

[Mobileye slide: MTBF targets for eyes-off highway driving]
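As a back-of-the-envelope check on those numbers, here is a minimal Python sketch (the 40 mph average highway speed is my assumption, not from the slide):

Code:
# Back-of-the-envelope check of the Mobileye highway MTBF numbers above.
human_fatal_mtbf_hours = 3.75e6    # human hours of highway driving per fatality (from the slide)
target_mtbf_hours = 10e6           # proposed MTBF for an eyes-off highway system

print(target_mtbf_hours / human_fatal_mtbf_hours)   # ~2.67x better than the human fatality rate

# At an assumed 40 mph highway average (my assumption, not Mobileye's),
# 10M hours works out to roughly 400M miles per fatal failure.
assumed_avg_speed_mph = 40
print(target_mtbf_hours * assumed_avg_speed_mph / 1e6)   # ~400 (million miles)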


It should be noted that accident rates and severity of accidents vary greatly from ODD to ODD. Also, it depends on what safety standard the AV manufacturer is aiming for. Are they aiming for 2x safer than humans, 10x safer than humans? So I think it is really hard to pinpoint just one MTBF or one intervention rate as a magical number for removing driver supervision. But FSD beta would need to improve its critical disengagement rate by maybe 10,000x or more to remove supervision. So FSD beta has a very long way to go before removing supervision.
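For a sense of scale on that "10,000x or more", here is a minimal sketch comparing the ~80-mile critical-disengagement estimate quoted earlier in the thread to Tesla's 1-2M mile figure from the DMV memo (both are estimates, not measurements):

Code:
# Rough improvement factor needed, using figures quoted in this thread.
current_miles_per_critical_de = 80          # community-tracker estimate quoted above
dmv_target_miles = (1_000_000, 2_000_000)   # Tesla's 1-2M miles-per-intervention figure to the CA DMV

for target in dmv_target_miles:
    print(f"{target:,} mile target -> {target / current_miles_per_critical_de:,.0f}x improvement needed")
# -> 12,500x and 25,000x, consistent with "maybe 10,000x or more"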
 
I certainly agree with some of what you say, but this statement is inaccurate. L3 still requires a driver, and the driver has to be ready to take over. The takeover, however, has to happen in a controlled manner and not instantaneously like now.
A failure in an eyes-off system on the highway may result in deaths. 50k miles per failure is likely not enough, but it's a high enough number for people to realize FSDb will not get there next year. A controlled handover on ODD exit is not a failure.

Like @diplomat33 writes above.
 
A failure in an eyes-off system on the highway may result in deaths. 50k miles per failure is likely not enough, but it's a high enough number for people to realize FSDb will not get there next year. A controlled handover on ODD exit is not a failure.

Like @diplomat33 writes above.
Your statement is still inaccurate; there is nothing in the L3 definition that states how many miles per failure is required. Frankly, 50k miles IMO is not nearly enough, but I've easily done 5k without a disengagement myself, so it's doable. Just set it to the minimal lane change setting. (Controlled-access highways only.)
 
Your statement is still inaccurate; there is nothing in the L3 definition that states how many miles per failure is required.
Conventional vehicle safety is ~1 fatality per 1M miles. A failure doesn't always equal a fatality, so if we assume one fatality per 20 failures, that's 50k miles per failure, so it's not a completely unreasonable KPI target for autonomous driving. I'd agree that's definitely on the low end.

Tesla is currently at around 80 miles per critical disengagement, which is 625x worse. For all disengagements FSDb is at 12 miles, or ~4,100x worse than 50,000 miles. So Tesla would need more than three orders of magnitude of improvement for it to be anywhere near Level 3 in the current ODD (unlimited). In a more limited ODD, like highway only, dry roads, daytime only, perhaps "only" 2-3 orders of magnitude.
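A minimal sketch of that arithmetic, using only the estimates given in this post (they are assumptions, not measurements):

Code:
# Sketch of the arithmetic in this post.
import math

fatalities_per_mile = 1 / 1_000_000        # "~1 fatality per 1M miles"
failures_per_fatality = 20                 # assumed 1 fatality per 20 failures
target_miles_per_failure = 1 / (fatalities_per_mile * failures_per_fatality)
print(target_miles_per_failure)            # 50,000 miles per failure

print(target_miles_per_failure / 80)       # ~625x short (critical disengagements)
print(target_miles_per_failure / 12)       # ~4,167x short (all disengagements)
print(math.log10(target_miles_per_failure / 12))   # ~3.6 orders of magnitude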

Again, the point of my post was to explain that anyone who believes that current cars will be L3 next year needs to read up, do the math and/or get their head checked. I think that we broadly agree on this point?

If we're focusing on fact-checking, you said L3 "still needs a driver", which is factually incorrect since L3 needs a person that is fallback-ready. In the J3016 spec it says "fallback-ready user". That person is not a driver until they agree to take over.
 
God damn it, this is like the third irony meter of mine you've broken this month!
I have not seen any true trolls or critics spend much time on the forums. It is clear very early they are not owners or true potential owners. The only trolls I have witnessed are the fanboys trying to help Elon pump his kickstarter stock. The Roadster was a kit car, the S was for rich tree huggers, and the 3 would never have made it without the kickstarter FSD hype that Tesla is an investment and more than a depreciating asset. The Cybertruck has a long way to go to prove the critics wrong. Most of the machine learning is built on common models. The electric motor has been around for a while, and the biggest change in EVs in the last 40 years has been Delta's efficient inverters that made AC EVs with rapid DC charging viable. Even the crash benefits of a Tesla are really inherent to all EVs, with the motor removed and the crumple zone extended. Elon would keep trying to hype the bot or the hyperloop if the market would believe him.

I can't wait to ride in one of your new Roadsters. I hear it will have cold fart thrusters, and will fly. It's gonna be so cool. Elon is sleeping on the factory floor until it is finished or he makes another baby.

Nothing has been solved in the battle of the 9s, Elon lost
 
Tesla is currently at around 80 miles per critical disengagement
I look forward to the day when it can safely make the UPL (unprotected left) out of my neighborhood. Or go straight through a local, newly reconfigured, clearly marked stoplight. It can do neither today without a major safety issue. So I get 2 critical disengagements daily within 3 miles of my house. Neither is overly complex.

Someday but not today.
 
Conventional vehicle safety is ~1 fatality per 1M miles. A failure doesn't always equal a fatality, so if we assume one fatality per 20 failures, that's 50k miles per failure, so it's not a completely unreasonable KPI target for autonomous driving. I'd agree that's definitely on the low end.

Thanks for explaining how you came up with 50k miles per safety-critical intervention. I would add that you would also want your AV to avoid non-fatal accidents as well. There can be non-fatal accidents that are still pretty serious (causing physical damage and injury). So we can't just look at fatal accidents. So yeah, 50k is definitely on the low end.

I like Mobileye's definition of a failure: a failure is a perception error that, if the driving policy relies on the perception output, will result in an accident. They also make the assumption that the perception error starts no more than 10 seconds before the accident.

[Mobileye slide: definition of a perception failure]


So they are defining failure such that 1 failure = 1 accident. That is much simpler. It excludes minor perception errors that don't cause accidents. You don't have to make any assumptions about how many failures per accident. And the failure rate becomes easier to calculate: the failure rate can simply equal the accident rate that you want to achieve.

FYI, Mobileye has an interesting paper on how you can estimate the accident rate directly from the perception MTBF. https://arxiv.org/pdf/2205.02621.pdf
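To illustrate how the 1-failure-=-1-accident simplification makes the conversion trivial, here is a minimal sketch (the 100k-hour perception MTBF below is a made-up number for illustration, not from the paper):

Code:
# Under the slide's simplification (one perception failure within ~10 s of an accident = one accident),
# a perception MTBF maps directly onto a predicted accident rate.
measured_perception_mtbf_hours = 100_000         # hypothetical measured value, for illustration only
predicted_accidents_per_hour = 1 / measured_perception_mtbf_hours
print(predicted_accidents_per_hour * 1e6)        # 10 accidents per million hours of driving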

Another way to look at this is to look at behavioral capabilities. Here is the NHTSA list of behavioral capabilities:

Parking (Note: ODD may include parking garages, surface lots, parallel parking)
•Navigate a parking lot, locate spaces, make appropriate forward and reverse parking maneuvers

Lane Maintenance & Car Following (Note: ODD may include high and low speed roads)
•Car following, including stop and go, lead vehicle changing lanes, and responding to emergency braking
• Speed maintenance, including detecting changes in speed limits and speed advisories
•Lane centering
•Detect and respond to encroaching vehicles
•Enhancing conspicuity (e.g., headlights)
•Detect and respond to vehicles turning at non-signalized junctions

Lane Change(Note: ODD may include high and low speed roads)
•Lane switching, including overtaking or to achieve a minimal risk condition
•Merge for high and low speed
•Detect and respond to encroaching vehicles
•Enhancing conspicuity (e.g., blinkers)
•Detect and respond to vehicles turning at non-signalized junctions
•Detect and respond to no passing zones

Navigate Intersection (Note: ODD may include signalized and non-signalized junctions)
•Navigate on/off ramps
•Navigate roundabouts
•Navigate signalized intersection
•Detect and respond to traffic control devices
•Navigate crosswalk
•U-Turn
•Car following through intersections, including stop and go, lead vehicle changing lanes, and responding to emergency braking
•Navigate rail crossings
•Detect and respond to vehicle running red light or stop sign
•Vehicles turning - same direction
•LTAP/OD at signalized junction and non-signalized junction
•Navigate right turn at signalized and non-signalized junctions

Navigate Temporary or Atypical Conditions
•Detect and respond to work zone or temporary traffic patterns, including construction workers directing traffic
•Detect and respond to relevant safety officials that are overriding traffic control devices
•Detect and respond to citizens directing traffic after an incident
•N-point turn

OEDR: Vehicles
•Detect and respond to encroaching, oncoming vehicles
•Vehicle following
•Detect and respond to relevant stopped vehicle, including in lane or on the side of the road
•Detect and respond to lane changes, including unexpected cut ins
•Detect and respond to cut-outs, including unexpected reveals
•Detect and respond to school buses
•Detect and respond to emergency vehicles, including at intersections
•Detect and respond to vehicle roadway entry
•Detect and respond to relevant adjacent vehicles
•Detect and respond to relevant vehicles when in forward and reverse

OEDR: Traffic Control Devices and Infrastructure
•Follow driving laws
•Detect and respond to speed limit changes or advisories
•Detect and respond to relevant access restrictions, including one-way streets, no-turn locations, bicycle lanes, transit lanes, and pedestrian ways (See MUTCD for more complete list)
•Detect and respond to relevant traffic control devices, including signalized intersections, stop signs, yield signs, crosswalks, and lane markings (potentially including faded markings) (See MUTCD for more complete list)
•Detect and respond to infrastructure elements, including curves, roadway edges, and guard rails (See AASHTO Green Book for more complete list)

OEDR: Vulnerable Road Users, Objects, Animals
•Detect and respond to relevant static obstacles in lane
•Detect and respond to pedestrians, pedal cyclists, animals in lane or on side of road

ODD Boundary
•Detect and respond to ODD boundary transition, including unanticipated weather or lighting conditions outside of vehicle's capability

Degraded Performance/Health Monitoring, Including Achieving Minimal Risk Condition
•Detect degraded performance and respond with appropriate fail-safe/fail-operational mechanisms, including detect and respond to conditions involving vehicle, system, or component-level failures or faults (e.g., power failure, sensing failure, sensing obstruction, computing failure, fault handling or response)
•Detect and respond to vehicle control loss (e.g., reduced road friction)
•Detect and respond to vehicle road departure
•Detect and respond to vehicle being involved in incident with another vehicle, pedestrian, or animal
•Non-collision safety situations, including vehicle doors ajar, fuel level, engine overheating

Failure Mitigation Strategy
•Detect and respond to catastrophic event, for example flooding or debilitating cyber attack

Source: https://www.nhtsa.gov/sites/nhtsa.g...82-automateddrivingsystems_092618_v1a_tag.pdf

I like this list because it gives a good idea of what an AV needs to be able to do. To be "eyes off", the AV needs to be ~99.9999% reliable in the capabilities listed that are relevant to its ODD. So for which capabilities on this list is FSD Beta ~99.9999% reliable? That gives us a sense of what capabilities FSD Beta still needs to work on in order to achieve "eyes off". I think it shows that FSD Beta is a long way from "eyes off".
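To make "~99.9999% reliable" concrete: that is roughly one missed response per million encounters with a given object/event, so how many hours of driving it buys depends on how often the event occurs. A minimal sketch with made-up encounter rates (assumptions for illustration only, not data):

Code:
# How "~99.9999% reliable" per encounter translates into hours between misses.
# The encounter rates are made-up illustrative values, not measurements.
reliability = 0.999999
miss_prob = 1 - reliability                      # one miss per ~1,000,000 encounters

illustrative_encounters_per_hour = {
    "pedestrian crossing in front": 20,
    "stopped vehicle in lane": 2,
    "emergency vehicle": 0.1,
}
for event, rate in illustrative_encounters_per_hour.items():
    hours_between_misses = 1 / (miss_prob * rate)
    print(f"{event}: ~{hours_between_misses:,.0f} hours of driving per missed response")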
 
I have not seen any true trolls or critics spend much time on the forums.

You must be new here :)



The Roadster was a kit car, the S was for rich tree huggers, and the 3 would never have made it without the kickstarter FSD hype that Tesla is an investment and more than a depreciating asset. The Cybertruck has a long way to go to prove the critics wrong. Most of the machine learning is built on common models. The electric motor has been around for a while, and the biggest change in EVs in the last 40 years has been Delta's efficient inverters that made AC EVs with rapid DC charging viable. Even the crash benefits of a Tesla are really inherent to all EVs, with the motor removed and the crumple zone extended. Elon would keep trying to hype the bot or the hyperloop if the market would believe him.

Ah, my bad, you just don't own a mirror!