
What are the chances Tesla cars will be self-driving in 3 years? Why do you think that way?

The way my earlier company used to do this was to have a smaller team of employees test every build and a larger group test slightly more stable builds. Then they would open it up to outside devs, and then to anyone who wanted the beta version, before release candidates.
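In pseudo-config terms it looked something like this (a minimal sketch in Python; the ring names and audiences are invented for illustration, not our actual setup):

```python
# Hypothetical sketch of the ring-based rollout described above.
# Ring names and audiences are made up for illustration.

ROLLOUT_RINGS = [
    {"name": "internal-dev",  "audience": "small dev/QA team",     "gets_every_build": True},
    {"name": "internal-wide", "audience": "larger employee group", "gets_every_build": False},
    {"name": "external-dev",  "audience": "outside devs",          "gets_every_build": False},
    {"name": "public-beta",   "audience": "anyone who opted in",   "gets_every_build": False},
]

def rings_for_build(is_stable: bool) -> list:
    """Every build goes to the first ring; only stable builds go wider."""
    if not is_stable:
        return [r for r in ROLLOUT_RINGS if r["gets_every_build"]]
    return ROLLOUT_RINGS

for ring in rings_for_build(is_stable=True):
    print(f"ship to {ring['name']} ({ring['audience']})")
```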
Ah ha... so that answers a question I had from last night. Were you in development or QA?
 
That would leave support. I went from development to management (mainframe systems programming) to support and back to development. Since 2000 I've been almost exclusively doing Java and Android dev. I just couldn't see Objective-C. Not opposed to Swift, though; I just haven't done it yet, as my devices are Android.
Oh, trust me - there are a lot of other roles :)

The main roles you have are dev, operations (sometimes combined as DevOps), and PM. Very few companies have specialized testers now. Then there are architects, analysts, data scientists, all kinds of managers, etc.
 

I may need a lawyer? :D (Usually the second letter's an 'A'.)

Not that someone won't try to sue them (people do all the time), but Tesla's chance of winning is higher because of the warnings.

It starts to become harder and harder to argue that as the system gets closer and closer to full autonomy, though. At some point it starts to fail the reasonable person test (i.e. "Would a reasonable person believe that he/she is actually driving the car?"), regardless of any warnings and disclaimers to the contrary. I'm not sure where that point is, and we're clearly nowhere near it yet, but that point presumably does exist. :)
 
The break-up with Mobileye happened only because Mobileye found out Tesla was developing its own vision stack and neural net, with a plan to eventually dump it. As early as 2015 Tesla was already hiring Jim Keller and Pete Bannon and trying to recruit George Hotz for its own autonomy project. Mobileye just did the "you didn't fire me, I quit" thing to save its stock price. Who in their right mind would want to leave their most important customer unless there was no choice? The result was that Tesla had to push AP2 before it was ready, but that would have come anyway.

There is significant reason to believe it was MobilEye who dumped Tesla after the Brown crash, though. We even know a MobilEye EyeQ chip was supposed to be on the original AP2 board (there is even an empty space for it on the board).

I'm not saying many things could not have contributed to it (life is never black and white), but I do think Tesla wanted to keep using MobilEye longer, and that's why they were caught with their pants down with "EAP" in 2016-2017...
 
My humble opinion as an IT professional.

Full self driving that most of us think of, as in doing work or reading while the car drives, is a long way off. Traffic, like weather, is a chaotic system, and the basic tenets of chaos are unpredictability and inconsistency. Right now you can't code or build an algorithm for that kind of chaos. Even AI and machine learning are really just buzzwords, to be honest; they have existed for decades, nothing new.

I'm not of the opinion that a true FSD ecosystem can coexist safely and efficiently with human-driven cars, bikes, motorcycles, and pedestrians. For true FSD you would need a network where all cars knew about each other and nothing else could impinge on the FSD roadway.

Again - just my opinion. Flame suit on!
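To make the chaos point concrete, here's a minimal sketch (the logistic map, a textbook chaotic system; nothing to do with Tesla's actual code): two starting conditions that differ by one part in a billion end up nowhere near each other within a few dozen steps.

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n), fully chaotic at r = 4.0.
# A 1e-9 difference in the starting state blows up to order-1 divergence
# in roughly 30 iterations -- the sensitivity the post is talking about.

def logistic(x0: float, r: float = 4.0, steps: int = 50) -> list:
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic(0.200000000)
b = logistic(0.200000001)  # perturbed by one part in a billion

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f} (diff {abs(a[n] - b[n]):.1e})")
```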
 
I saw a video on YouTube by Amnon Shashua, where he claims they dumped Tesla over safety concerns and over Tesla slandering their small company. The video was from about a year after the incident.

Yes, he was trying to cover his ass and save the stock price, even at the expense of his former customer. You don't need anything other than the published 2015 George Hotz incident to prove he's lying. I also don't remember Elon slandering his company, or for that matter any company. Wasn't he the one who wouldn't miss any opportunity to slander Tesla?
 
Elon did say the breakup was due to MobilEye's "high engineering drag coefficient" or some such; those are fighting words between engineers. :)
 
It starts to become harder and harder to argue that as the system gets closer and closer to full autonomy, though. At some point it starts to fail the reasonable person test (i.e. "Would a reasonable person believe that he/she is actually driving the car?"), regardless of any warnings and disclaimers to the contrary. I'm not sure where that point is, and we're clearly nowhere near it yet, but that point presumably does exist. :)
I thought you made a great point, right up until your last sentence, because I believe we are AT that point now, and this entire thread is proof of that. The people who believe we are there take the "what difference does the number make" position, and the people who think we are years or decades away cite numbers as their proof. I'll try to use your words to reiterate my point. If a driver drives their Tesla for 3 hours but actually only controls the car for 5-10 minutes of that (roughly 3-6% of the time), what (or who) is driving the car should be patently obvious, irrespective of numbers and conditions set by a society (of automotive engineers). Would it be nicer if the driver could take a nap, watch a movie, have sex in the back seat? Sure, but between then and now they are well advised to pay attention to what the car and surrounding traffic are doing.
 
My humble opinion as an IT professional.

Full self driving that most of us think of, as in doing work or reading while the car drives, is a long way off. Traffic, like weather, is a chaotic system, and the basic tenets of chaos are unpredictability and inconsistency. Right now you can't code or build an algorithm for that kind of chaos. Even AI and machine learning are really just buzzwords, to be honest; they have existed for decades, nothing new.

I'm not of the opinion that a true FSD ecosystem can coexist safely and efficiently with human-driven cars, bikes, motorcycles, and pedestrians. For true FSD you would need a network where all cars knew about each other and nothing else could impinge on the FSD roadway.

Again - just my opinion. Flame suit on!
You are right about "that most of us think of". I think most people have irrational expectations of FSD, so the issue is what the rational expectation should be. For me, it is a system that delivers equal or better miles per fatal accident and miles per non-fatal accident than human drivers. You are further correct about the unpredictability of chaos. Somewhere on this planet is the best human driver; put him on the same section of road as several poor drivers and bad stuff will ensue. In life there are no absolute guarantees, and if people are looking to FSD to provide one, they will consistently be disappointed. They drive the President of the United States around in what is, effectively, a tank; if someone were sufficiently motivated, even that would not protect him. As for sleeping/driving/fornicating while being driven, pay your dime and take your chance. So it really all comes down to probability. Can an automated system beat the national average? Absolutely. Do those national averages drop if you filter for only blinding snow? Absolutely.
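To put rough numbers on "beat the national average", here's a back-of-the-envelope sketch. The human baseline (on the order of one fatality per 100 million vehicle miles in the US) is an approximate figure, and the automated-system number is a made-up assumption purely for illustration:

```python
# Rough comparison of miles per fatal accident. The human baseline is
# approximate (US average is on the order of 1 fatality per 100M miles);
# the automated-system figure is purely hypothetical.

HUMAN_MILES_PER_FATALITY = 100e6        # ~US average, order of magnitude
ASSUMED_FSD_MILES_PER_FATALITY = 200e6  # hypothetical assumption

ratio = ASSUMED_FSD_MILES_PER_FATALITY / HUMAN_MILES_PER_FATALITY
print(f"Under these assumptions, the automated system would go {ratio:.1f}x "
      f"as far per fatal accident as the average human driver.")
```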
 
... I also don't remember Elon slandering his company or for that matter any company. ...
Elon blamed the Mobileye system for the failure that caused Joshua Brown to die.
Mobileye spills the beans: Tesla was dropped because of safety concerns
Mobileye said that is false and that Tesla was using the product in an unintended way. Making false accusations is slander.

Is Tesla telling us the truth over autopilot spat?
Quote: At first, Tesla pinned the blame for the crash on the car's camera not being able to tell the difference between a white trailer and a bright sky – in effect shifting the blame from its own systems to those provided by a third party.
 
Yep. The MobilEye EyeQ3 used in AP1 does not support cross-traffic detection, so a crossing vehicle is up to the OEM (Tesla in this case) and their redundancy (or geofencing).
 
It starts to become harder and harder to argue that as the system gets closer and closer to full autonomy, though. At some point it starts to fail the reasonable person test (i.e. "Would a reasonable person believe that he/she is actually driving the car?"), regardless of any warnings and disclaimers to the contrary. I'm not sure where that point is, and we're clearly nowhere near it yet, but that point presumably does exist. :)
As it gets closer to full autonomy, chances of crashes get lower too ;)