Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Where do you think Tesla's FSD will be 12 months from now (April 2020)?


  • Other. Please explain. (Votes: 0, 0.0%)
  • Total voters: 110
  • Poll closed.
You seem to be saying that Tesla can't do it right and therefore Teslas will inevitably run through red lights and hit stuff, so they should not even try.

I think it makes sense for Tesla to not try very hard and be super conservative with rollout. They’re going to make it very clear things are level 2 for a LONG time. And keep the nags to guarantee it is level 2.

There is no great benefit to being first in this game. It’s going to be a quickly adopted technology, so the first-mover fallacy applies. In this case, you can let everyone else fall over themselves and have the fatal accidents whilst Tesla develops their reputation as a competent, conservative player with safety as the number one priority. They can learn from others’ mistakes and get a lot of data from the fleet to study and understand an extremely difficult problem, while continually improving fleet safety. One of these days I hope Tesla will publish safety statistics; that could be super interesting and helpful. ;)

There is no rush. Build clean performance cars that people want.

In the end safety is all that matters. FSD is not required to significantly improve on that metric.
 
I'm not saying that Tesla can't add stop sign and stop light recognition, I'm just saying that it would be a very bad idea. The consequences of running a stop sign are way worse for Tesla's image than someone plowing into the side of a semi truck. It's not just bad press either; I think it would result in regulators in many states deciding that they are testing self driving in violation of the current regulations.
That is true even now.

NOA can run into other cars, into barriers, causing major injuries/fatalities. Speed kills.

That hasn't stopped Tesla (and even other OEMs) from releasing NOA / something similar. Besides, Musk is all about taking chances. If he were afraid of failure he would never have started an EV company (nobody has succeeded in starting an automobile company in the US in some 100 years) or started SpaceX.

After they release city NOA, individual drivers are still responsible for killing someone in the intersection by running a red light (more likely a fender bender). So, people intervene - just like we all do now when running on AP.

In fact, stopping at a stop sign or red light is no different from stopping behind another vehicle. I always monitor carefully to make sure the car is decelerating as I approach a stopped / stopping vehicle. If the car isn't slowing down, I intervene (hasn't happened lately, though early on it took me time to figure out when to expect the car to start slowing down). I plan to do exactly the same thing with city NOA.
 
I'm not saying that Tesla can't add stop sign and stop light recognition, I'm just saying that it would be a very bad idea. The consequences of running a stop sign are way worse for Tesla's image than someone plowing into the side of a semi truck. It's not just bad press either; I think it would result in regulators in many states deciding that they are testing self driving in violation of the current regulations.
In other words, Tesla should stop FSD development right now. I can't say I agree with that thinking.
 
You're missing the point. The motor is more than powerful enough to turn the front wheels even in the worst-case condition; I doubt you were turning the wheel against anything close to that. Driving without power steering is easy when you're moving. I contend that applying over a hundred pounds of force to the steering wheel is not possible for most people in a normal driving position.
It was an S, as I said. Yes, it's certainly easier to steer when rolling, but there is still the rack and pinion, which gives mechanical advantage, plus the diameter of the steering wheel, which adds to it. I did not have to move from the normal driving position.
 
Besides, Musk is all about taking chances. If he was afraid of failure he would have never started an EV company

I suspect that Musk DOES care about fatalities though. Making EVs (arguably) is saving lives, so it is worth the risk. It’s entirely possible that FSD will result in greater loss of life if rolled out incorrectly.

So I am fairly sure Tesla will be super-conservative, and if that means not rolling out features that are not fully vetted and safer than human drivers, they won’t do it.

At the moment, even Waymo and others don’t have the safety right. So I don’t see Tesla getting to that point soon. They may well implement some level 2 features (possibly even background features) which improve safety, but I don’t see any strong reason to push straight to actual FSD.

Most people who purchased FSD are presumably not expecting any actual FSD features any time soon (years), and for the minority of people who get upset, Tesla can always refund them!

Safety is top priority. Setting back FSD by years with a poorly planned fatal rollout will also cost thousands of lives with a delayed long term rollout due to regulatory issues. So that’s bad at two levels.
 
... So I am fairly sure Tesla will be super-conservative, and if that means not rolling out features that are not fully vetted and safer than human drivers, they won’t do it.
Doubt it. They will repeat the past and roll out what some will consider very dangerous stuff and put the onus on the driver as usual.

At the moment, even Waymo and others don’t have the safety right. So I don’t see Tesla getting to that point soon.
I think Waymo is further along than people think, but they are willing to take zero risk of blood on their hands. So they will never be successful, unless there is new ownership. They would be instantly successful if Elon were the owner.
They may well implement some level 2 features (possibly even background features) which improve safety, but I don’t see any strong reason to push straight to actual FSD.
Strong reason is that Elon said they would.

Most people who purchased FSD are presumably not expecting any actual FSD features any time soon (years), and for the minority of people who get upset, Tesla can always refund them!
Too many gullible people, Tesla believers, and lawyers eager to sue.

Safety is top priority. Setting back FSD by years with a poorly planned fatal rollout will also cost thousands of lives with a delayed long term rollout due to regulatory issues. So that’s bad at two levels.
History will repeat itself. People will die, but it will be their fault because they were not attentive. Imagine when traffic light detection is 99.9% accurate. That means on average each person will crash once every 3 years. Who is going to pay attention when it is that accurate or even worse, 99.99% accurate?
 
You seem to be saying that Tesla can't do it right and therefore Teslas will inevitably run through red lights and hit stuff, so they should not even try. But what if Tesla does it right and Teslas don't run red lights?

I am confident that Tesla FSD will be able to stop at red lights with 99.9% reliability. I am also confident that 100% reliability is impossible, so @Daniel in SD's concern about the consequences when, one day, one FSD car runs a red light somewhere on earth is valid.

Tesla will need 99.999999% reliability, or at least 10x better than humans, whatever that is. This will be very hard and take a very long time to prove.

I do agree with you, however: Tesla will do it, with some corrections for “Elon Time.”

GSP

PS. 99.9% means running one red light per thousand. I would expect the average driver stops at several thousand red lights per year, so on average more than one bad event per year.
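This back-of-envelope math is easy to sanity-check in code (the stops-per-year figure below is an illustrative assumption, not a measured statistic):

```python
# Expected number of run red lights per driver per year at a given
# per-stop reliability. stops_per_year = 3000 is an assumed figure,
# roughly matching "several thousand red lights per year".
def expected_failures(reliability, stops_per_year=3000):
    # Each stop fails independently with probability (1 - reliability).
    return (1.0 - reliability) * stops_per_year

for r in (0.999, 0.9999, 0.999999):
    print(f"{r:.4%} reliable -> ~{expected_failures(r):.3f} failures/year")
```

At 99.9% that works out to roughly three run lights per driver per year, consistent with the "more than one bad event per year" estimate above; six 9s drops it to a few per thousand driver-years.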
 
I suspect that Musk DOES care about fatalities though. Making EVs (arguably) is saving lives, so it is worth the risk. It’s entirely possible that FSD will result in greater loss of life if rolled out incorrectly.

So I am fairly sure Tesla will be super-conservative, and if that means not rolling out features that are not fully vetted and safer than human drivers, they won’t do it.

Tesla will need 99.999999% reliability, or at least 10x better than humans, whatever that is. This will be very hard and take a very long time to prove.
Yes - Musk won't let Tesla roll out features he is not happy with, even if they miss deadlines - like we see with Enhanced Summon.

As Karpathy said in that talk about multi-task NNs, Tesla is targeting six 9s for all features. That means until they get to the six-9s level for recognizing traffic lights and stop signs, they won't release the feature. Six 9s is 99.9999% correct. I don't have the stat for running red lights specifically, but humans are on average 99.9996% correct (i.e. 4 severe crashes per 1 million miles). So six 9s is 4x better than humans.

Once they get to six 9s, they will release the feature because it is better than humans and will likely save more lives than it might cost.

Now, what does six 9s mean for Tesla? It just means that in their testing, using the training/test data they have, the NN should be very accurate. If they are good at something like 99.99% (?) per frame at recognizing red lights / traffic signs, they will be at six 9s for recognizing traffic lights given a few seconds / several hundred frames.

It doesn't actually mean they won't have any issues in the wild. That is why they will monitor the feature closely once released and any problems will lead to retraining to make it better. That is how they will get to six 9s in the real world.
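The frame-compounding argument can be sketched numerically. A caveat: this treats each frame's detection as an independent trial, which is optimistic since consecutive frames are highly correlated; the per-frame accuracy, frame rate, and human-baseline numbers are assumed/quoted figures, not Tesla's actual data:

```python
def miss_probability(p_frame, n_frames):
    # Probability that every one of n_frames independent looks misses
    # the light: (1 - p_frame) ** n_frames.
    return (1.0 - p_frame) ** n_frames

# ~2 seconds of video at 30 fps, 99.99% per-frame accuracy (assumed).
p_miss = miss_probability(0.9999, 60)
print(f"P(miss over 60 frames) = {p_miss:.1e}")  # vanishingly small under independence

# The human-baseline comparison from the post:
human_failure_rate = 4e-6   # ~4 severe crashes per million miles
six_nines_failure = 1e-6
print(f"six 9s vs. human baseline: {human_failure_rate / six_nines_failure:.0f}x better")
```

In reality, correlated frames mean the real-world gain is far smaller than this independence model suggests, which is exactly why test-set six 9s "doesn't actually mean they won't have any issues in the wild."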
 
What do you mean by this? That it has to be safe? Waymo has permission to operate at L4 (without a safety driver) in both Arizona and California. They're not doing it, because their system isn't safe enough.

Lots of states have legislation allowing for the testing of autonomous cars, but I don't see any clear pathway for a company like Tesla to get regulatory approval for either L3 or L4 driving in the US.

What I see is a patchwork of legislation, and no clear pathway for Tesla to follow to get approval for L3 or L4 driving within the US.

It was such a mess that Audi cited it as one of the reasons they weren't planning on releasing the Audi A8 with limited L3 capabilities within the US.
 
Lots of states have legislation allowing for the testing of autonomous cars, but I don't see any clear pathway for a company like Tesla to get regulatory approval for either L3 or L4 driving in the US.

What I see is a patchwork of legislation, and no clear pathway for Tesla to follow to get approval for L3 or L4 driving within the US.

It was such a mess that Audi cited it as one of the reasons they weren't planning on releasing the Audi A8 with limited L3 capabilities within the US.

Can you elaborate? Is your thought that since different states have different regulations, it will be hard for Tesla to meet all the different regulations needed to get approval for testing in all 50 states, since Tesla wants FSD for the entire US?

My thought is that the Tesla fleet should be an advantage for Tesla. Tesla can release FSD as "L2" to the entire US fleet and quickly get the billions of miles of validation needed to show regulators in all 50 states that the system is safe at L3+.
 
Lots of states have legislation allowing for the testing of autonomous cars, but I don't see any clear pathway for a company like Tesla to get regulatory approval for either L3 or L4 driving in the US.

What I see is a patchwork of legislation, and no clear pathway for Tesla to follow to get approval for L3 or L4 driving within the US.

It was such a mess that Audi cited it as one of the reasons they weren't planning on releasing the Audi A8 with limited L3 capabilities within the US.
Currently, technology is the inhibitor, not regulation. Florida has legalized robotaxis. Why are we not seeing any robotaxis there? After all, Florida is a big state with a lot of business potential. Uber lobbied the legislature to pass the legislation.

Currently, citing regulations is just an excuse. No sane company will release anything like L3+ because the technology is simply not there, given the liability and risk. Nothing to do with regulations. Waymo has tried robotaxis without safety drivers. But they put them back. Why? Not regulation.

It's like citing regulation as the reason why we haven't seen cold fusion ;)
 
Currently, citing regulations is just an excuse. No sane company will release anything like L3+ because the technology is simply not there, given the liability and risk. Nothing to do with regulations. Waymo has tried robotaxis without safety drivers. But they put them back. Why? Not regulation.

It's probably why Cruise admitted that they are not ready to deploy robotaxis in SF this year and they need more time to do further testing and validation. In other words, they are admitting that their tech is not quite there yet.

In terms of Waymo, I read that the reason Waymo put the safety drivers back in was purely PR, because some customers were nervous about getting into a car with no driver. Although who knows if that is the real reason?
 
Doubt it. They will repeat the past and roll out what some will consider very dangerous stuff and put the onus on the driver as usual.

I understand what you’re saying, but I think this paints things in too negative a light.
It’s possible that Teslas have saved lives overall (nothing really to do with AP). No one really knows, and unfortunately Tesla hasn’t provided statistics. But in total, taking into account all factors, it is possible. Background active safety measures are effective.

People will die, but it will be their fault because they were not attentive. Imagine when traffic light detection is 99.9% accurate. That means on average each person will crash once every 3 years. Who is going to pay attention when it is that accurate or even worse, 99.99% accurate?

That’s nowhere near good enough, obviously. Great for a background system but not good for anything else.

Once they get to six 9s, they will release the feature because it is better than humans and will likely save more lives than it might cost.

Gotta be a little careful about this. It’s all very well to get to 6 9’s overall (and that’s going to be really difficult!), but there may be specific situations where failure rate might be as high as 10% even though overall you have 6 9’s. It’s possible. It’s also totally unacceptable, even if you can validly argue that you are saving lives. I know that’s the argument. But it’s these corner cases where the system doesn’t work well at all that are tricky. Monitoring isn’t really good enough. Remember, the damage is done at that point.

Tesla will ensure they are a responsible member of the community and everything they do improves safety without violating community trust. Trust and safety are paramount.

Again, there is NO NECESSITY to roll out FSD. Massive improvements in safety are possible with more capable hardware and a data-driven, continuously improving system. They can stuff all this active safety stuff in the background and keep drivers responsible for driving for as long as necessary. Owners with FSD will still be benefiting for the years it takes to get to actual FSD (if it is even possible!), and there is no loss of trust with the wider community.

Is it as sexy as true FSD? Maybe not. But it’s continuous safety improvement and continues to achieve Tesla’s goal. Make the cars really good. Don’t pursue perfection - just make cars safer.

There’s no rush - if someone actually “solves” the FSD problem it’ll be reasonably straightforward to duplicate it. That’s the advantage of being second. And in the meantime you can be making awesome cars that get safer and cleaner every day.
 
I think it is safe to assume that Tesla will try to be as safe as possible with the rollout of FSD features. They are not going to willfully roll out a feature like traffic light detection if they know it will cause accidents because it is not yet reliable. Heck, traffic light detection has been in development for months now. The reason Tesla has not already released it is precisely because they want to make sure it is safe.
 
Gotta be a little careful about this. It’s all very well to get to 6 9’s overall, but there may be specific situations where failure rate might be as high as 10% even though overall you have 6 9’s. It’s possible. It’s also totally unacceptable, even if you can validly argue that you are saving lives. I know that’s the argument. But it’s these corner cases where the system doesn’t work well at all that are tricky. Monitoring isn’t really good enough. Remember, the damage is done at that point.
You are getting confused. You don't stop deployment because there may be unknown cases where there is a 10% chance of failure. Heck, there will be cases where the chance of failure is 100%.

There will be hundreds of cases where City NOA will simply not work, just as with freeway NOA. Currently, Tesla cars do not move over and stop when an emergency vehicle has its siren on behind them. Tesla wants you to handle those edge cases.

Same with City NOA. Those edge cases where there is a 10% chance of a crash are where Tesla wants you to take over.

Remember we are talking about Level 2 - not level 3/4/5.
 
Remember we are talking about Level 2 - not level 3/4/5.

Sure, I agree for level 2. You don’t even necessarily have to have 6 9’s reliable sign detection and stop light detection for that. Just need stuff running in the background for failsafe. And no actual active stopping for stop lights and signs in most cases. (Emergency failsafe operation only would potentially be fine, depending on implementation.)

It’s all about how to maintain driver vigilance at that point, keeping the driver in charge of driving at all times, and doing all the driving, stopping, etc., with City NOA.

I think we basically agree on this. I was just responding in the context of the thread title. Level 2 doesn’t really have much to do with FSD, which is fine. There’s definitely tons of safety improvement possible within the limitations of level 2. And it may well require the hardware people purchased with FSD.
 
The motor connected to the steering rack is so strong that most people would have a lot of trouble overpowering it. Think about it: it's powerful enough to turn the wheels while the car is sitting still. Try turning the wheel of a 4,000 lb car without the engine on. Add to that a steering ratio that is quite a bit higher on a Model 3 than the average car, and a steering motor that is stronger than the bare minimum needed.
I suspect that the brakes can be disabled by the ABS solenoids.
My prediction is that Tesla will be L2 on the highway and city NoA will never be released. My hope is that they're able to at least develop some sort of limited L3 for interstates.

It's not a matter of whether the motor is strong enough (it clearly is), but whether the existing HW allows it to be used in that manner, where driver input can go from being amplified (during L1 driving) to overruled (during L4 driving).

Right now the takeover force is very small. So ideally it would simply be increased (in SW) to the point where a takeover event had to be a deliberate act rather than accidental.

I was less concerned about the steering than I was about the brakes. I don't know if the same kind of implementation can be done, where brake action is simply ignored or the required force is increased.

SAE Level 4 is clear in that the steering and pedals can be removed, but I don't see that happening with the Model 3, as that's really only for driverless taxis approved for small areas. Not a vehicle that can go from L1 driving somewhere offload to L4 driving on a freeway.
 
Currently, technology is the inhibitor, not regulation. Florida has legalized robotaxis. Why are we not seeing any robotaxis there? After all, Florida is a big state with a lot of business potential. Uber lobbied the legislature to pass the legislation.

Currently, citing regulations is just an excuse. No sane company will release anything like L3+ because the technology is simply not there, given the liability and risk. Nothing to do with regulations. Waymo has tried robotaxis without safety drivers. But they put them back. Why? Not regulation.

It's like citing regulation as the reason why we haven't seen cold fusion ;)

Why are you bringing up brownies when I was talking about ice cream? Sure you can say you need to have brownies to put the ice cream on, but sometimes with some people it's easier to talk about the ice cream.

In this case the regulatory aspects (not just the US, but the European market as well) are a very real challenge for L3/L4 vehicles. Hence the Audi A8 example I used, where the technology is ready for a very limited L3 system. Or at least was claimed to be by Audi.

The regulatory aspects are the ice cream.

Just because I talk about regulatory challenges doesn't mean I dismiss the technical challenges. Just like if I talk about the technical challenges I don't dismiss the regulatory ones. In a lot of ways they have to happen in lock step with each other. The technology needs to have goal posts of what people expect. Tesla is just making up numbers for the goal post, and that's not going to work.
 
Why are you bringing up brownies when I was talking about ice cream? Sure you can say you need to have brownies to put the ice cream on, but sometimes with some people it's easier to talk about the ice cream.

In this case the regulatory aspects (not just the US, but the European market as well) are a very real challenge for L3/L4 vehicles. Hence the Audi A8 example I used, where the technology is ready for a very limited L3 system. Or at least was claimed to be by Audi.

The regulatory aspects are the ice cream.

Just because I talk about regulatory challenges doesn't mean I dismiss the technical challenges. Just like if I talk about the technical challenges I don't dismiss the regulatory ones. In a lot of ways they have to happen in lock step with each other. The technology needs to have goal posts of what people expect. Tesla is just making up numbers for the goal post, and that's not going to work.
Because I think talking about regulatory hurdles right now is putting the cart before the horse. First, get FSD ready. A lot of industry players, irritatingly, hide behind regulations to obfuscate the state of the technology.