
What are the chances Tesla cars will be self-driving in 3 years? Why do you think that way?

Total voters: 215
Looks like Tesla is doing their FSD demo for the April 22 event.

 
Federal regulation and the insurance industry will have to be reshaped. Tesla can start by designating proven roads first.
I think regulation will be left up to the states for a while. What’s wrong with the current state regulations?
Right now California requires $5 million in liability insurance for self-driving cars, but that seems fine since they'll be far more reliable than a human driver. I already have $1 million in coverage, so I expect my insurance rates to go down.
 
Please quote the exact language from any of this list that is actually responsive to my question asking for research behind your statement.

Happy to be of assistance! We'll see how things go... the main thing is to approach this in a manner that minimizes risk while still allowing adequate progress. Given that there are quite a few unknowns, that is the prudent thing to do.
 
The roads in my neighborhood do not have lines. At what point will a Tesla be able to navigate one of those? A human can see it is a two-way road and that they need to stay to the right and out of the middle, but when will the car be able to make that same set of decisions?
 
It's the last 5-10% of situations that are going to be ridiculously hard to code for. Those odd intersections where you can kinda see the traffic light of another street and think it's yours and almost run through an intersection. Those times where construction really screws with the lanes. When a street just isn't well-marked. When the power is out. When a police officer is directing traffic even though the traffic lights above him are working.... There are *so* many fringe cases that come up quite often.
 
It's the last 5-10% of situations that are going to be ridiculously hard to code for. ...

Oh, I see you've been to my fair city....
 
TBH I think it would be amazing if in the next 3 years we achieve Level 3 autonomy as defined by the SAE:


  • Level 3 ("eyes off"): The driver can safely turn their attention away from the driving tasks, e.g. the driver can text or watch a movie. The vehicle will handle situations that call for an immediate response, like emergency braking. The driver must still be prepared to intervene within some limited time, specified by the manufacturer, when called upon by the vehicle to do so.

If Level 4 or Level 5 takes 5-10 years, that's okay with me. I think a dependable, wide-reaching Level 3 is what most people currently want.
 
I think Tesla will get there, but not in 3 years. As many others have said, the next 3 years will bring a major leap forward, but punching an address into your car and having it just go there no matter the weather conditions... not likely in 3 years. I think we will be well into Level 3 by the end of 3 years, and maybe, in limited locations and conditions, even some Level 4.

Limited Self-Driving Automation (Level 3): Drivers can elect, but don't have to, to turn over control of all safety-critical functions under certain traffic or environmental conditions. However, drivers can't completely check out; they're expected to be available for occasional control.

Full Self-Driving Automation (Level 4): The vehicle only needs navigational input to know where to drive; after telling it where to go, drivers aren't expected to step in and take the reins. These vehicles may be operated unoccupied and are able to perform all safety-critical driving functions while continuously monitoring roadway conditions.
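
Since the level numbers get thrown around loosely in these threads, here is the same taxonomy as a small lookup table. This is my own rough summary of SAE J3016, not an official artifact:

```python
from enum import IntEnum

# My own rough summary of the SAE J3016 levels, not an official artifact.
class SAELevel(IntEnum):
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1   # steering OR speed assist (e.g. basic cruise control)
    PARTIAL = 2             # steering AND speed, human must watch (today's Autopilot)
    CONDITIONAL = 3         # "eyes off": car drives, human takes over when asked
    HIGH = 4                # no human needed within a defined domain (area, weather)
    FULL = 5                # no human needed anywhere a person could drive

def driver_must_supervise(level: SAELevel) -> bool:
    """Level 2 and below keep the human as the constant fallback;
    from Level 3 up, the car requests help with advance warning."""
    return level <= SAELevel.PARTIAL

assert driver_must_supervise(SAELevel.PARTIAL)
assert not driver_must_supervise(SAELevel.CONDITIONAL)
```

The big jump in the debate above is exactly the Level 2 to Level 3 boundary: who is the fallback at any given moment.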
 
It's the last 5-10% of situations that are going to be ridiculously hard to code for. ...

Those edge cases you mention are all covered by rules.
For example, police or construction workers directing traffic don't just make up signals on the spot.

I disagree with the idea that those edge cases are "ridiculously hard". Everything on the road either has rules, or you fall back to the basic rule of the road: move carefully, don't hit things. A minimal sketch of that hierarchy is below.
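
Here's a toy sketch of that fallback hierarchy, assuming hypothetical rule functions and toy scene dictionaries (none of this is Tesla's actual stack):

```python
from typing import Callable, Optional

# Toy sketch of "most specific rule wins, else move carefully and don't
# hit things". Scene dicts and rule names are hypothetical.
Rule = Callable[[dict], Optional[str]]  # scene -> action, or None if not applicable

def officer_directing(scene: dict) -> Optional[str]:
    # A human directing traffic overrides the signal, per the rules of the road.
    if scene.get("officer_signal"):
        return f"obey officer: {scene['officer_signal']}"
    return None

def traffic_light(scene: dict) -> Optional[str]:
    if scene.get("light") in ("red", "yellow", "green"):
        return f"obey light: {scene['light']}"
    return None

RULES: list[Rule] = [officer_directing, traffic_light]  # most specific first

def decide(scene: dict) -> str:
    for rule in RULES:
        action = rule(scene)
        if action:
            return action
    return "creep at low speed, yield to everything"  # the universal fallback

# The officer outranks the working light, exactly the edge case above:
print(decide({"light": "green", "officer_signal": "stop"}))  # obey officer: stop
print(decide({}))                                            # the fallback
```

The rules themselves are simple; the genuinely hard part is perception, i.e. reliably recognizing that an officer is directing traffic in the first place.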
 
One big issue is that whenever it rains, some of the traffic lights go out, while others just flash. It's the ones that are totally out that are the issue, because human drivers don't see them (or pretend they don't). There need to be some "free day at the zoo" algorithms.
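
For what it's worth, the rulebook side of this is well defined; here's a toy lookup (state names hypothetical, my own sketch) of what US rules of the road generally prescribe at a malfunctioning signal:

```python
# Toy lookup (state names hypothetical) of what US rules of the road
# generally prescribe when a signal malfunctions.
def signal_behavior(state: str) -> str:
    behaviors = {
        "green": "proceed",
        "yellow": "prepare to stop",
        "red": "stop",
        "flashing_yellow": "slow down and proceed with caution",
        "flashing_red": "treat it as a stop sign",
        "dark": "treat the intersection as an all-way stop",
    }
    # An unrecognized or unreadable signal gets the most conservative treatment.
    return behaviors.get(state, "treat the intersection as an all-way stop")

print(signal_behavior("dark"))          # treat the intersection as an all-way stop
print(signal_behavior("flashing_red"))  # treat it as a stop sign
```

The lookup is trivial; the hard parts are noticing a dark signal head at all and predicting the humans who treat it as a free-for-all.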
 
The roads in my neighborhood do not have lines. At what point will a Tesla be able to navigate one of those? ...

FWIW, the Model 3 knows to move over when using Enhanced Summon in an unlined parking lot when another car is coming.
 
Thanks for the well thought-out response. Yet another layer of complexity (or maybe just noise) is public perception. As long as every fatal auto-drive accident makes the front page, L4 and L5 will keep getting pushed back. We need to get past using perfection as the bar, and start using "better than humans".

Part of that problem is that most autodrive crashes are the kinds that humans could trivially avoid, whereas human-caused crashes (mostly due to inattention, impairment, sleepiness, boredom, slow reaction times, etc.) are the kinds that NEVER occur with AI. So in the public eye an AI crash seems stupid and makes the front page, whereas the avoided accidents don't even get acknowledged.


0% if you're referring to complete autonomy in all situations.

I work in the broader AI field, admittedly in the NLP/NLU/NLC space rather than image processing, but autonomy (which is what we're really talking about) is an extraordinarily difficult problem to solve, where the inputs are not finite. There are literally an infinite number of "ifs" with a finite number of "thens" to consider, and so this is both a computational problem as well as a learning problem and a recall (storage/latency) problem.

I certainly think that something close to self-driving will be available for "sunny day" scenarios, like driving around relatively calm and quiet grid-based cities with good weather and predictable traffic. But negotiating poorly signed road-works on smashed up highways around New York or Detroit, or dealing with heavy rain, snow, fog, erratic drivers, cyclists, traffic cops telling drivers to do the opposite of what the signs say, dealing with emergency vehicles, navigating parking lots, doing all of that in the dark, doing any of that when there's no wireless signal and not enough information in the onboard database, etc, all mean that "full" autonomy is probably 3 or 4 generations away (10 - 15 years IMO).

In technology, people talk about the 80/20 rule, where 20% of the effort gets you 80% of the result, and the remaining 80% of the effort gets you the final 20%. In the development of autonomous driving, those "edge cases", which might be only 1-2% of driving situations, will consume more than 95% of the effort.
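
Putting rough numbers on that (the 2% / 95% figures are the estimates above, not measured data), the implied cost per edge case is staggering:

```python
# Back-of-envelope math on the claim above. The 2% / 95% figures are the
# post's own rough estimates, not measured data.
edge_share_of_situations = 0.02   # edge cases: ~2% of driving situations
edge_share_of_effort = 0.95       # ...but ~95% of the engineering effort

effort_per_common_case = (1 - edge_share_of_effort) / (1 - edge_share_of_situations)
effort_per_edge_case = edge_share_of_effort / edge_share_of_situations

print(f"{effort_per_edge_case / effort_per_common_case:.0f}x")  # ~931x
```

Under those assumptions, each unit of edge-case driving costs roughly 900x the engineering effort of a unit of common-case driving.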

I would so love to be wrong BTW.
 
What are the chances Tesla cars will be self-driving in 3 years? Why do you think that way?

Your question is not specific enough. Tesla cars are already self-driving. They are not Fully Self-Driving, or Level 5, but they do drive themselves under an ever-expanding set of conditions.

In 3 years:
Level 5... Probably not, but I wouldn't bet against it.
Level 4... Probably, but I wouldn't put money on it.
Level 3... Are we there yet?
 
Federal regulation and the insurance industry will have to be reshaped. Tesla can start by designating proven roads first.
That's the real reason we don't already have self-driving cars, imo. The technology itself is important, but the ramifications are so broad that they can't just start selling a 100% self-driving car overnight; it would be massively disruptive to numerous industries, and on top of that the laws and regulations need major changes/thought.

Regardless of whether it's 80% or 100% FSD in the next few years, they won't release it until they are sure there is a significant safety advantage to using it, proven beyond a doubt. Assuming the safety advantage, does Tesla have a responsibility to give it away for free, like Volvo did with seatbelts? Should it be mandated worldwide for new cars and retrofitted on old ones? To put it in perspective, more people die each year in car accidents (over a million worldwide) than from malaria or the flu. Not the most apples-to-apples comparison, but in some ways it's like having a vaccine.
 