Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

FSD on city streets coming later this year with reliability far in excess of human drivers!

Really? :)

  • Yes: 37 votes (18.7%)
  • No: 161 votes (81.3%)

  Total voters: 198
You do not need the experience before the features are activated, only before you turn off the nags and no longer require an engaged driver.
They have clarified that now, but initially they said exactly what I wrote in the posting you quoted (see screenshot in the first posting in the thread). But anyway, there's no way they will be able to collect "billions of miles of experience" with these unreleased features by the end of the year, and there is no clear metric to determine when they will achieve "reliability far in excess of human drivers". It's just another way of saying "we have no clue if and when that happens".
 
That's not a registration form as you are claiming, it's an application for a permit. No guarantee the application will eventually result in a permit. Why are you here (besides to spread FUD)? To disrupt the forum from productive discussion about things that matter?
I fail to see the distinction. If I register my vehicle and it doesn't meet the registration requirements there's no guarantee that it will be approved either...
 
More FUD.

Pointing out your claim was factually untrue is FUD?

I do not think that word means what you think it means.

Not even applicable to what Tesla is doing which is developing full self-driving cars, for sale to the public, that anyone can buy and operate without a driver in place. You have drifted the subject far from your original FUD to distract from the fact that you are spreading FUD. Fear, uncertainty and doubt.


...I'm not sure what meds you're taking, but your dosage is wrong.... this is just incoherent rambling....


Tesla could sell, today, full self-driving cars to the public that anyone can buy and operate without a driver in place.

Right now.

With no "regulator approval" in a number of US states.

Today.


Your claiming otherwise was what I was correcting you on.
 
A lot of the automation nay-sayers are going to be shocked at how capable these systems become and how quickly they become more competent than your average driver because they don't understand how neural nets work and learn.

I'm fairly aware of how these machine learning systems work, and while they're very neat at doing a lot of things, they often behave most unpredictably as they approach the limits of their capabilities. It's those areas that need to be explored the most, and why I believe eliminating things like running red lights and stop signs will be fast, easy wins, but full autonomous driving won't be.

Level 5 autonomy is already legal in California. Here's the form to register your Level 5 vehicle: https://www.dmv.ca.gov/portal/wcm/c...89c5-b2bc7de3fd2c/ol321.pdf?MOD=AJPERES&CVID=

There are no Level 5 vehicles. So it's neat that California has some standards put together, but it's not like there are L5 vehicles to take the most advantage of this. Waymo (when it was Google) was probably the most advanced, but only on the campus where they had extremely controlled environments and high resolution map data. In the real world, they have operators in all their vehicles.

Waymo has a permit to test autonomous vehicles without a human test driver (a remote operator can take over when the system finds a fault) but has not yet begun to do so, as far as I have heard.

Only when it was Google.

Audi is deploying a Level 3 system in Europe. Tesla should really be working on a similar system to Audi's traffic jam assist as it would actually be useful to many drivers here.

Ah, Audi Jam Assist. I do love hearing stories from VW Auto Group. They disabled this, by the way. So this is still just Audi planning to deploy Level 3. It's already a year late, and they disabled it so now we're looking at 2 years late. But their marketing materials made it look like it was totally done and ready to go. Nope.

They pulled Uber's permit to test in Arizona and California after a single fatal accident. I think a similar accident caused by an inattentive driver using "Automatically driving on city streets" mode would get a similar response.

Uber's permit was pulled because they disabled all safety features and acted extremely irresponsibly. They deserved to have their permits pulled, and they deserve a lot worse than that IMO. They killed a person in a completely avoidable situation. To the point I think you're making, if a human driver did that they'd be in jail.
 
Ah, Audi Jam Assist. I do love hearing stories from VW Auto Group. They disabled this, by the way. So this is still just Audi planning to deploy Level 3. It's already a year late, and they disabled it so now we're looking at 2 years late. But their marketing materials made it look like it was totally done and ready to go. Nope.

What do you mean by we?

Audi isn't planning on releasing the L3 feature in the North American market. They were going to, but then said "nah, the regulatory situation is a mess".

As I understand it this feature was supposed to be released on the Audi A8 in Germany. But, I haven't seen any mention of it actually being released in Germany. I also haven't seen any information as to whether it was delayed or not.
 
Only when it was Google.
Huh? Permit Holders (Driverless Testing)
Ah, Audi Jam Assist. I do love hearing stories from VW Auto Group. They disabled this, by the way. So this is still just Audi planning to deploy Level 3. It's already a year late, and they disabled it so now we're looking at 2 years late. But their marketing materials made it look like it was totally done and ready to go. Nope.
Sounds like they're learning from Tesla! Didn't realize that. I still think that a system like that is what Tesla should be working on.
Uber's permit was pulled because they disabled all safety features and acted extremely irresponsibly. They deserved to have their permits pulled, and they deserve a lot worse than that IMO. They killed a person in a completely avoidable situation. To the point I think you're making, if a human driver did that they'd be in jail.
My point is that I think a similar accident is extremely likely in "Automatic driving on city streets" mode with a bunch of untrained test drivers.
There are no Level 5 vehicles. So it's neat that California has some standards put together, but it's not like there are L5 vehicles to take the most advantage of this. Waymo (when it was Google) was probably the most advanced, but only on the campus where they had extremely controlled environments and high resolution map data. In the real world, they have operators in all their vehicles.
Yep, because self-driving is an extraordinarily difficult problem. More than $80 billion has been invested in self-driving research and development. I was just complaining that Elon Musk seems to be trying to give people the impression that autonomous vehicles would be released in short order if not for those pesky regulators.
 
What do you mean by we?

Audi isn't planning on releasing the L3 feature in the North American market. They were going to, but then said "nah, the regulatory situation is a mess".

As I understand it this feature was supposed to be released on the Audi A8 in Germany. But, I haven't seen any mention of it actually being released in Germany. I also haven't seen any information as to whether it was delayed or not.
It does make sense to test it in one market first before releasing it everywhere, and different states have different rules, though I'm sure they'll converge at some point. California would be the perfect market to release a Level 3 system.
 
As I understand it this feature was supposed to be released on the Audi A8 in Germany. But, I haven't seen any mention of it actually being released in Germany. I also haven't seen any information as to whether it was delayed or not.

The feature was supposed to be rolled out in several markets, US and EU basically. They've disabled it globally before they sold a single car. It was also supposed to be on multiple models when first announced, and then became an A8 only feature. Now it's not a feature at all.

Sounds like they're learning from Tesla! Didn't realize that. I still think that a system like that is what Tesla should be working on.

That's what Tesla is doing. The end goal can be Level 5, that's all fine and dandy. But right now their goal is to implement a human augmenting system, then a system that gets augmented by humans, and finally full autonomy some day in the future.

My point is that I think a similar accident is extremely likely in "Automatic driving on city streets" mode with a bunch of untrained test drivers.

Maybe. But as of now, Uber's is the only autonomous system that has hit a pedestrian. Tesla's system already does automatic emergency braking (AEB) for pedestrians, and we've already seen sample footage from years ago where Autopilot stopped out of caution when two pedestrians were walking on a sidewalk near a road. Obviously that was too cautious, but being overly cautious does prevent running people over.
 
That's what Tesla is doing. The end goal can be Level 5, that's all fine and dandy. But right now their goal is to implement a human augmenting system, then a system that gets augmented by humans, and finally full autonomy some day in the future.
Well, I think a Level 2 system for city driving is worse than useless. They should work on a Level 3 system for highway driving at lower speeds, because that is a much easier problem to solve than a system that works everywhere. I think that until they can do Level 3, they should have FSD run in the background and take over to prevent accidents. That would actually make the roads safer.
Maybe. But as of now, Uber's is the only autonomous system that has hit a pedestrian. Tesla's system already does automatic emergency braking (AEB) for pedestrians, and we've already seen sample footage from years ago where Autopilot stopped out of caution when two pedestrians were walking on a sidewalk near a road. Obviously that was too cautious, but being overly cautious does prevent running people over.
Autopilot is so erratic it probably thought the people were a car in the lane. :rolleyes: No one else is using untrained customers to test their self-driving systems. I'm sure the Uber test driver had seen the car stop for pedestrians, that's why she thought it would be fine to watch TV on her phone. The fact that the Tesla system will stop for pedestrians most of the time makes the system less safe because people will become complacent.
The feature was supposed to be rolled out in several markets, US and EU basically. They've disabled it globally before they sold a single car. It was also supposed to be on multiple models when first announced, and then became an A8 only feature. Now it's not a feature at all.
I bet they've found some interesting corner cases :eek:
Please, no! I just want to live.
After professional testing shows it has reliability far in excess of human drivers!
 
  • Like
Reactions: AlanSubie4Life
I can't imagine how any AP feature improvement would be far better than my driving experience so far. Let's see... it's going to be better than my 51 years of driving 700,000 miles with no accidents and no tickets! How much better can I expect? Now that I am old, maybe I need it, but I think I might be dead before it becomes a reality :rolleyes::D
 
I can't imagine how any AP feature improvement would be far better than my driving experience so far. Let's see... it's going to be better than my 51 years of driving 700,000 miles with no accidents and no tickets! How much better can I expect? Now that I am old, maybe I need it, but I think I might be dead before it becomes a reality :rolleyes::D


What I can see is that people will learn how to drive using FSD. Their cars will teach them how to drive. I can almost guarantee that people don't follow all of the rules of the road when they drive, and I can almost guarantee you that not all people even know the rules of the road.

For instance, this new feature that's coming to FSD soon will probably ALWAYS stop at the appropriate spot on the street when it's first to a stop light. EVERY DAY I see cars stopped in the crosswalk, blocking pedestrians.

I can see the bug reports flying in now with this new feature: "My car stops too far back when it comes to a red light." I surely hope that I don't get a job in Tesla's customer service, because I would do them a great disservice by responding, "That's how you are supposed to drive. That's what you should have been doing for the past 30 years - idiot."

https://electrek.co/2019/03/05/tesla-autopilot-detects-stop-lines-intersections/
 
We'll see if Tesla writes the software for FSD on HW 2.x, which Musk says is possible. If so, there are about 250,000 cars with it. At 1,000 miles a month, that's 3 billion miles a year; 70% of that is 2.1 billion. If it takes HW 3, they will have to retrofit the hardware before they start the verification. In the meantime there are few restrictions on the use of any features so long as there is an engaged driver, as is required now for EAP.

I've been using EAP around a 16-45k population town for 6 months. It operates very well on poorly marked roads, short steep bridges, fairly tight curves, and heavy traffic. It stops for pedestrians. It slows and may stop for cyclists on the outer edge of their bike lanes; if they're in the center of their lane it passes normally. It stays engaged on unmarked side streets if it can engage originally with a center line. It recognized, slowed, and stopped for pedestrians walking on the edge of the road, and continued when they went to the sidewalk. Yesterday it did an aggressive, successful zipper merge on the onramp of an interstate in NoA.
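For what it's worth, the fleet-mileage arithmetic above works out as claimed. A quick sketch, using only the figures assumed in the post (250,000 cars, roughly 1,000 miles per car per month):

```python
# Back-of-envelope fleet-mileage estimate from the post's assumed figures.
cars = 250_000
miles_per_car_per_month = 1_000

annual_fleet_miles = cars * miles_per_car_per_month * 12   # total miles/year
usable_share = 0.7                                         # the post's 70% figure
annual_usable_miles = annual_fleet_miles * usable_share

print(f"{annual_fleet_miles:,} fleet miles/year, "
      f"{annual_usable_miles:,.0f} at 70% usage")
```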

It does not yet recognize stop signs or lights but that is currently being verified on the road by Tesla as well as turning at intersections. It also sees road stop lines. Deciding when or if to stop on an amber light is the most difficult according to Musk. That's understandable given the disparity of timing, some of which are designed to produce tickets.
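To see why the amber-light case is the hard one, here is a toy stop-or-go check. The kinematics (comfortable-braking distance vs. distance coverable before red) are standard, but the deceleration value and the decision policy are invented for illustration; this is not how Tesla's system works:

```python
def amber_decision(speed_mps, dist_to_line_m, yellow_s, decel_mps2=3.0):
    """Toy stop-or-go decision at an amber light (illustrative numbers only)."""
    stopping_dist = speed_mps ** 2 / (2 * decel_mps2)  # d = v^2 / (2a)
    clearable_dist = speed_mps * yellow_s              # distance coverable before red
    can_stop = dist_to_line_m >= stopping_dist
    can_clear = dist_to_line_m <= clearable_dist
    if can_stop and can_clear:
        return "stop"      # prefer the conservative choice
    if can_stop:
        return "stop"
    if can_clear:
        return "go"
    return "dilemma"       # can neither stop comfortably nor clear in time
```

The "dilemma" branch is exactly the disparity problem: with a short yellow, there are speed/distance combinations where neither stopping nor clearing is safely possible, which is why signal timing matters so much.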

There seems to be confusion between FSD feature-complete and FSD statutory approval. We will be able to use FSD with all its features when it updates to the car, the same way we use EAP. When sufficient verification is achieved, the nags will be turned off and the system will be Level 3, and proceed to 4 and 5.
 
I love my D90S and the EAP is a huge benefit for me personally. But I have a hard time seeing a future in which you would not need a driver watching the road ahead.

But I wonder if IoT technology could enable "almost-self-driving": a future where we constantly watch the road but can keep our hands off the steering wheel except when something unexpected happens (a person or deer darts across the road, a sudden accident right in front of you, etc.). As I envision it, towns, counties, states, etc. could be certified as "almost-self-driving" ready. There would be IoT-connected devices at every crosswalk, intersection, point where the speed limit changes, and so forth. These devices would communicate to vehicles in their vicinity things like: you are X feet from a crosswalk, there is someone in the crosswalk, you are approaching a stop sign, the speed limit changed to X mph, the stop light will turn red in X seconds, the stop light just turned green but traffic has not cleared, etc. If the geographic area is not "certified" as being autopilot capable, a nag will be active and you will need to manage driving as we do now. Combined with weather alerts that could disable self-driving in icy, foggy, or snowy conditions, and with all the capabilities Tesla, Mobileye, GM, etc. are working on involving cameras, sonar, lidar, and radar, such "almost-self-driving" could be possible in our lifetimes.
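The vehicle-to-infrastructure messages described above could be sketched as a simple data structure. Every field name and value here is invented for illustration; real V2X systems use standardized message sets, not this:

```python
# Hypothetical sketch of the roadside-to-vehicle messages described above.
from dataclasses import dataclass

@dataclass
class V2IMessage:
    sender: str        # e.g. a crosswalk beacon or traffic-signal controller ID
    kind: str          # "crosswalk", "stop_sign", "speed_limit", "signal"
    distance_m: float  # distance from the vehicle to the feature
    payload: dict      # kind-specific detail

msgs = [
    V2IMessage("crosswalk-17", "crosswalk", 40.0, {"occupied": True}),
    V2IMessage("signal-main-3rd", "signal", 120.0, {"turns_red_in_s": 4}),
]

# A car outside a "certified" area would ignore these and keep the nag on.
for m in msgs:
    print(f"{m.kind} in {m.distance_m:.0f} m: {m.payload}")
```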
 
That's not how it works. The training dataset needs to be labeled. Besides, NoA is an active system, meaning that actions it takes (such as changing lanes or taking an exit) change the input. You will never be able to test these functions without actually executing them (either physically or in a simulator).
We are far away from systems being able to make this kind of inference autonomously.

Because I have some experience with neural nets.
Raw data is useless without labeling.
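The labeling point can be made concrete with a toy example: in supervised learning, the loss and its gradient are defined against human-provided labels, so with raw inputs alone there is no error signal to train on. A minimal 1-D logistic regression (toy data, nothing to do with any real Tesla pipeline):

```python
# Toy illustration: the gradient update uses (prediction - label),
# so without labels ys there is nothing to descend on.
import math

def train(xs, ys, steps=1000, lr=0.1):
    w, b = 0.0, 0.0
    for _ in range(steps):
        for x, y in zip(xs, ys):               # y is the human-provided label
            p = 1 / (1 + math.exp(-(w * x + b)))
            w -= lr * (p - y) * x              # gradient depends on the label
            b -= lr * (p - y)
    return w, b

xs = [-2.0, -1.0, 1.0, 2.0]
ys = [0, 0, 1, 1]   # delete these and the training loop has nothing to optimize
w, b = train(xs, ys)
```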

Besides, this whole discussion is pointless. As the updated fine print shows, it is not about enabling the features but about allowing their unsupervised use.

You can literally feed the video data into a newly written program, with new "labels". The program can react to live video translated into data, or to shadow data.
The only thing you could not do is change the shadow data or how it is processed, as they do not collect full video.
 
You can literally feed the video data into a newly written program, with new "labels". The program can react to live video translated into data, or to shadow data.
The only thing you could not do is change the shadow data or how it is processed, as they do not collect full video.
It doesn't work that way. I suggest you read up on the basics of machine learning to understand the purpose of data labeling. Also, there is no such thing as "shadow data", and the so-called "shadow mode" is in reality a far more mundane data collection process than people have been led to believe (see e.g. this analysis).
 
  • Like
Reactions: zmarty