EVNow
> Yes.

Hmm ... no. You didn't give a definition. Definition is something you can write down without examples - and shouldn't be made to fit theories you are proposing.
> The problem with this definition is someone could just leave out a single module and test their system as "L2" and then add the module in right before deploying.

But then, they wouldn't have tested as L4 and wouldn't get regulatory approval.
> Well, I could be wrong. The reason I said that is because I consider pulling over for emergency vehicles to be part of the OEDR. And based on the definition of L2, if the system is missing some OEDR, then it is L2. So it would seem to fall under L2.

Yes - it's very clear from the feature list I've posted earlier that Tesla "FSD" is missing a LOT of features needed for L4. It is L2 - and city NOA is no different from freeway NOA. Unlike Waymo/Cruise - whose design intent has been robotaxi - for all practical purposes Tesla's design intent is L2. They have diligently worked on each feature NHTSA prescribes. There is no attempt to even "follow all state laws".
"FSD" is just a marketing term - should not be confused with actual design intent or functional practicality. "robotaxi level reliability" being planned for next year would be news to KarpathyThe software is literally called "Full Self Driving." and robotaxi level reliability is planned for late next year.
> Hmm ... no. You didn't give a definition. Definition is something you can write down without examples - and shouldn't be made to fit theories you are proposing.

Because you are correct that there isn't a clear definition except design intent. Tesla's beta FSD is the same as the prototype AV software developed by dozens of other AV companies. If you listen to the users of beta FSD it's clear that they believe they are using prototype AV software that will ultimately be capable of autonomous operation.
> "FSD" is just a marketing term - should not be confused with actual design intent or functional practicality. "Robotaxi level reliability" being planned for next year would be news to Karpathy.

I've never heard Karpathy say that. It's crazy that people can watch Tesla presentations and interviews with the CEO and come away thinking that the design intent of beta FSD is not robotaxis, or that there's some super secret other build of the software that is the actual robotaxi software. Beta FSD is robotaxi software (that needs a lot of work!).
> Tesla's beta FSD is the same as the prototype AV software developed by dozens of other AV companies.

It's not. I just provided the reasoning in my above post.
> It's crazy that people can watch Tesla presentations and interviews with the CEO and come away thinking that the design intent of beta FSD is not robotaxis ...

It's crazy to think what Musk says is the design intent rather than a long-term strategic goal/objective.
> It's crazy to think what Musk says is the design intent rather than a long-term strategic goal/objective.

I guess we have very different definitions of design intent. Do you really think all the companies officially testing autonomous vehicles have emergency vehicle response in their initial test software?
To me "design" is a very specific term - which says exactly how a piece of sw/hw will be made to work to output some functionality. So, when even basic functionality like pulling over when you hear/see emergency vehicle is not on the feature list - design intent at this time is definitely not "FSD". The current design intent is City NOA - i.e. do the basic minimum needed to go from place A to place B - and let the driver handle all "edge cases" like emergency vehicles, different speed limits based on the time of the day etc etc etc that are listed in the feature list I've posted earlier.
Here is the key distinction: a test vehicle that cannot do the full OEDR is testing a L2 system. It was L2 to begin with. It is not a L4 system that became L2 for testing purposes.
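The distinction above can be sketched as a toy classifier. To be clear, SAE J3016 is prose, not code, and the task names below are illustrative, not from the spec; this is just a hypothetical encoding of the thread's reasoning: if the design intent omits any part of the OEDR, or relies on the driver as fallback, the system is L2 regardless of how capable it looks.

```python
from dataclasses import dataclass

# Illustrative stand-in for "the complete OEDR"; real OEDR is far larger.
FULL_OEDR = {
    "detect_pedestrians",
    "obey_traffic_controls",
    "respond_to_emergency_vehicles",  # the feature debated in this thread
}

@dataclass
class DrivingSystem:
    design_intent_oedr: set   # OEDR tasks the design intends to handle
    driver_is_fallback: bool  # does the design rely on a human fallback?

def assigned_level(system: DrivingSystem) -> str:
    # Levels are assigned from design intent, not measured capability:
    # an incomplete OEDR by design, or a human fallback by design, means L2.
    if system.design_intent_oedr < FULL_OEDR or system.driver_is_fallback:
        return "L2"
    return "L4"

# A system that leaves emergency-vehicle response to the driver is L2 by design:
city_noa = DrivingSystem(
    design_intent_oedr={"detect_pedestrians", "obey_traffic_controls"},
    driver_is_fallback=True,
)
print(assigned_level(city_noa))  # L2
```

Note the asymmetry this captures: nothing about the vehicle's observed performance enters the function, only what the design claims responsibility for.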
This is a slippery slope argument and I don't think there is a bright line. NoA is certainly pushing the limits of what I would consider driver assist.
So you're arguing that all autonomous vehicle testing regulations are completely unnecessary and can be easily circumvented? If you look at Uber's AV testing saga in CA, they made the exact same argument and were shut down by the DMV. Of course, after that Uber killed someone in Arizona while testing their AV.
To me the only thing that really matters is if it improves safety. NoA might improve safety (I'm not convinced but it is plausible) but I seriously doubt beta FSD improves safety.
All this is moot if testing AVs is no more dangerous than regular driving. Hopefully that is the case but I suspect that Tesla knows it is not and will only release a very scaled back version of the software (i.e. autosteer on city streets).
But it is worth noting that the SAE document clearly says that levels are assigned, not measured, based on the design intent of the manufacturer:

Page 30: "Levels are assigned, rather than measured, and reflect the design intent for the driving automation system feature as defined by its manufacturer."
"FSD" is just a marketing term - should not be confused with actual design intent or functional practicality. "robotaxi level reliability" being planned for next year would be news to Karpathy
If you listen to the users of beta FSD it's clear that they believe they are using prototype AV software that will ultimately be capable of autonomous operation.
It's crazy that people can watch Tesla presentations and interviews with the CEO and come away thinking that the design intent of beta FSD is not robotaxis or that's there's some super secret other build of the software that is the actual robotaxi software. Beta FSD is robotaxi software (that needs a lot of work!).
Its crazy to think what Musk says is the design intent vs long term strategic goal/objective.
I guess we have very different definitions of design intent. You really think all the companies officially testing autonomous vehicles have emergency vehicle response in their initial test software?
I'm curious what type of situation the writers of the SAE spec had in mind when they talk about misclassification of prototype L4-5 as L2?
To me, Tesla has had autosteer on city streets ever since AP1.
"Autosteer on city streets" is specifically listed as a FSD feature on the website. So it is more than just using the "autosteer" part of AP on city streets. It refers to a special set of FSD features. It refers to "autosteer" being able to do more than what it does now. FSD Beta basically shows us what "autosteer on city streets" is. It is the ability of "autosteer" to do things like make turns at intersections, maneuver around double parked cars, move over in an empty parking spot to let an incoming car pass by etc....
> Let's take a step back. If a company develops and releases a product that offers hands-free driving address to address, including intersections and parking and everything, but the product is very clearly designed as a L2 system that requires a driver at the wheel and is clearly advertised only as a driver assist system - would you say such a system is pushing the limits of what you would consider driver assist?

Yes, I would. Again, the only reason any of this matters is safety. If testing a L4 system is safe (relative to regular driving) then it's fine. I happen to believe that testing autonomous vehicles requires skilled test drivers whose performance is monitored to achieve that level of safety. I could be wrong. We'll see!
> I am starting to think that maybe you don't care whether something is driver assist or not... but simply care about testing automation in a highway environment vs an urban environment? Yeah? Or is this an incorrect assumption?

Yes. I think testing in an urban environment is potentially much more dangerous.
> So I think our opinions here are quite aligned... I will point out that companies testing robotaxi systems with safety drivers in city environments have far higher safety records than just manual driving in city environments. Tesla FSD beta being released to end users is a little different than trained safety drivers, I agree.

Yeah, because the safety drivers are monitored and will get fired if they screw up. I've seen a lot of beta FSD users letting the car do illegal maneuvers just to see what will happen. I'm pretty sure test drivers of other manufacturers don't do that.
"autosteer on city streets" is currently the only promised future feature of FSD. I meant "autosteer on city streets" that actually works well and is officially supported (technically the owner's manual tells you not to use the current autosteer on city streets).Can you clarify what you mean by this?? To me, Tesla has autosteer on city streets ever since Ap1.
> Where would you draw the line, and for what reason?

You're saying there is no hard line and I agree. To me it all comes down to whether extra vigilance is required to safely test the system, or the system is good enough that it causes the driver to lose vigilance (my bigger long term concern).
> Yes, thank you - this is an important point. Another reason why it makes no sense to regulate based on L2/L4... because that just lets companies decide to opt in or opt out of regulation.

Uber tried to claim their system was L2 in California. The DMV decided that it was subject to AV testing regulations and forced them to comply. So, no, companies can't opt out.
> I guess we have very different definitions of design intent. Do you really think all the companies officially testing autonomous vehicles have emergency vehicle response in their initial test software?

I don't care about SAE specs. It's all just theoretical.
> Anyone think that CA might lose big ...

At least we are winning big in number of COVID cases. /s
> By testing, I mean that Waymo has vehicles equipped with their FSD package, with a safety driver, that are driving around in autonomous mode on public roads, collecting data, refining the software etc... Those rides are not open to the public. They are strictly in-house testing. So we do not have videos of those drives.

So only locked into cities... tsk tsk. Too bad it can't cross over into another city via interstate.
Here is the map and info I am referring to. This is from the Waymo Safety Report released in Sept 2020:
Source: https://storage.googleapis.com/sdc-prod/v1/safety-report/2020-09-waymo-safety-report.pdf
PS: If you have not done so already, I would recommend reading the Safety Report. It gives great detail on all the testing Waymo does and how their FSD works.
Stop with the "Waymo is geofenced to a tiny desert sandbox" strawman. You know full well that Waymo is testing FSD in several States across the US, almost as many as Tesla, if you look at the map Waymo put out.
And yes, there is a massive chasm between Tesla and Waymo. Waymo has true driverless, Tesla does not. As Krafcik put it, the Waymo Driver is a complete replacement for a licensed human driver. Tesla's FSD is not.
> Geofencing is required for level 4.

Not necessarily. A system would be Level 4 if it operates everywhere but only under certain weather conditions, for example.
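The point that an L4 operational design domain (ODD) need not be geographic can be sketched like this. The field names are hypothetical, chosen for illustration; J3016 defines the ODD in prose, and real ODDs involve many more dimensions.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: an ODD can be bounded by weather, time of day, etc.
# A geofence is one possible boundary, not a requirement for L4.

@dataclass
class ODD:
    regions: Optional[set] = None        # None = no geographic limit at all
    max_wind_kph: Optional[float] = None
    daylight_only: bool = False

@dataclass
class Conditions:
    region: str
    wind_kph: float
    is_daylight: bool

def within_odd(odd: ODD, now: Conditions) -> bool:
    # Each boundary is checked only if the design actually imposes it.
    if odd.regions is not None and now.region not in odd.regions:
        return False
    if odd.max_wind_kph is not None and now.wind_kph > odd.max_wind_kph:
        return False
    if odd.daylight_only and not now.is_daylight:
        return False
    return True

# An L4 system limited only by weather, with no geofence at all:
weather_limited = ODD(regions=None, max_wind_kph=40.0)
print(within_odd(weather_limited, Conditions("anywhere", 25.0, True)))   # True
print(within_odd(weather_limited, Conditions("anywhere", 60.0, False)))  # False
```

The design choice here is that every ODD dimension is optional and independent, which is exactly why "L4 implies geofenced" does not follow from the level definition.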
The problem is people don't bother to read actual data.
Geofencing is required for Level 4, per both the SAE standard and CA/AZ rules.
That was the main reason the CA DMV went after Elon and Tesla last October: Tesla does not have a permit to test self-driving functionality.

Tesla legal formally confirmed it's an L2 system. That is also what led the NTSB to warn Tesla on Friday.