Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Full Autonomy All But Confirmed for 3, "We're doing the obvious thing"

What do you think? Will Model 3 have full autonomy?

  • Most Definitely, It's Obvious!

    Votes: 56 24.7%
  • Possibly... Still Not Sure

    Votes: 76 33.5%
  • No Way, You'll have to drive that sucker yourself!

    Votes: 95 41.9%

  • Total voters
    227
NHTSA is a government agency; they would need to first define a Level 5 driverless car and set regulations surrounding such a thing before something like that would be allowed to cross state lines, etc. At least we'd hope, haha
There was an interview with the NHTSA director (IIRC) some time back on NPR. He basically said the department is waiting for someone in the industry to bring a car to them - they'd look to approve cars on a case-by-case basis in the beginning. IOW, no detailed regulations/guidelines in the beginning.
 
  • Informative
Reactions: SW2Fiddler
But Tesla's (and others') car tech advancements follow the more practical approach of MobilEye, which is currently the industry-defining leader. I personally would not trust a "summon" delivery on anything less than Level 5. It's also irresponsible to claim this would work reliably in 2 years.

Which is my point exactly. So the answer to the question of whether the Model 3 will have full autonomy IMO is definitely no.

It's confusing to have two different ratings.
 
To clarify, in my opinion it will be hardware capable at launch... but not necessarily ready to activate the feature...

But that means nothing. Autonomy is all about software. Software could easily take another 5 years or more. Witness "in the next few months" becoming over a year to get out the current autopilot software, which is about 1/1000th as difficult as full autonomy.
 
Fully autonomous isn't a big need on my part nor is city driving. Just get better and better at highway driving and I'll be happy. I don't need it for short drives or construction zones, but would find it to be the most helpful if it could continue to improve highway travel and decrease driver involvement. That said, if it is fully autonomous, that would certainly be a huge convenience.
 
But that means nothing. Autonomy is all about software. Software could easily take another 5 years or more.
As a software engineer, I have to disagree. While full autonomy might not fit your timeframe, having the hardware be capable of it means everything.

For one, all the algorithms can be tested (silently) across hundreds of thousands of vehicles. For two, incremental improvements can be constantly delivered. And lastly, when full autonomy software is ready (and approved by regulators) you won't need to buy another car to get it. A software upgrade is called a patch. A hardware upgrade is called a recall. You want the patch. :)
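The "silent testing" idea here is often called shadow mode: the autonomy stack computes its own decision alongside the human driver, and only the disagreements get logged for engineers to review. A toy sketch of that loop (all names and numbers are hypothetical illustration, not Tesla's actual code):

```python
# Shadow-mode sketch: the model "drives" silently next to the human;
# frames where the two disagree beyond a tolerance are flagged for review.
from dataclasses import dataclass


@dataclass
class Frame:
    human_steering: float   # what the driver actually did (degrees)
    model_steering: float   # what the autonomy stack would have done


def shadow_log(frames, tolerance=2.0):
    """Return frames where model and human disagree beyond tolerance."""
    return [f for f in frames
            if abs(f.model_steering - f.human_steering) > tolerance]


frames = [
    Frame(human_steering=0.5, model_steering=0.7),    # agreement
    Frame(human_steering=-1.0, model_steering=8.0),   # disagreement: review
]
print(len(shadow_log(frames)))  # 1
```

The point is that a fleet generates validation data this way without the software ever touching the controls.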
 
From Amnon Shashua's (MobilEye CTO) CES 2016 presentation on Jan. 7, 2016 (embedded at bottom of post), I think it's entirely possible that EyeQ4 will make it into Model 3 from the beginning. It looks to me like Elon's end of year announcement will be about AP 2.0 with new hardware and software. I'm guessing it will be the "Full Vision" hardware suite with multiple cameras, out on Model S/X first with network of three EyeQ3 chips, then later next year with EyeQ4. With any luck, Model 3 will get the same sensor suite and EyeQ4 from the launch. It's also possible Model 3 will launch with the triple EyeQ3 setup, if EyeQ4 isn't quite ready in enough volume. Time will tell.

Discussed at about 21:20. Note the four launches for Trifocal EyeQ4 in 2017 and 2018 and Full Vision EyeQ4 in 2017 (probably late 2017, see below) with partial functionality. Could include revamped Model S/X as well as Model 3 afterwards. The comment about partial functionality lines up with Tesla's way of rolling out the hardware sensors first and then doing OTA software updates with new features that use that hardware.

[Attachment: upload_2016-6-6_16-2-54.png]



Discussed at about 26:30. Note mention of implementation on network of three EyeQ3 chips late this year (matching Elon's end of year announcement). Then EyeQ4 is slated for late 2017/early 2018, nicely matching with Model 3.

[Attachment: upload_2016-6-6_16-9-9.png]


Full presentation:
 
From Amnon Shashua's (MobilEye CTO) CES 2016 presentation on Jan. 7, 2016 (embedded at bottom of post), I think it's entirely possible that EyeQ4 will make it into Model 3 from the beginning. It looks to me like Elon's end of year announcement will be about AP 2.0 with new hardware and software. I'm guessing it will be the "Full Vision" hardware suite with multiple cameras, out on Model S/X first with network of three EyeQ3 chips, then later next year with EyeQ4. With any luck, Model 3 will get the same sensor suite and EyeQ4 from the launch. It's also possible Model 3 will launch with the triple EyeQ3 setup, if EyeQ4 isn't quite ready in enough volume. Time will tell.
Really nice post favo. On the bright side, it seems fairly clear that the Model 3 will at least ship with the upgraded sensor suite that's due to be announced end-of-year, which means (worst case) something close to full highway autonomy.

I wonder how much they'll push for the EyeQ4 for the Model 3, however, keeping in mind the incredible time-sensitivity of the massive launch ramp. If the EyeQ4 is planned for, but isn't ready or isn't shipping in the necessary quantities, then Tesla will have a production disaster on their hands.

For that reason, I suspect Tesla will be conservative with the Model 3, and choose the EyeQ3 platform. [I'd love to be wrong]

However, if they're feeling generous, they could still install the cameras and sensors necessary for Full Vision, only use the front cameras with EyeQ3, and then offer a "module upgrade" at a later date, swapping out the EyeQ3 hardware for EyeQ4 (which would make use of the additional cameras). They'd have to be feeling pretty generous to do the extra engineering work and eat the up-front sensor costs, but that's the sort of extra effort that causes your customers to become brand evangelists who make their own car commercials in their spare time.
 
  • Like
Reactions: Bimbels
From Amnon Shashua's (MobilEye CTO) CES 2016 presentation on Jan. 7, 2016 (embedded at bottom of post), I think it's entirely possible that EyeQ4 will make it into Model 3 from the beginning. It looks to me like Elon's end of year announcement will be about AP 2.0 with new hardware and software. I'm guessing it will be the "Full Vision" hardware suite with multiple cameras, out on Model S/X first with network of three EyeQ3 chips, then later next year with EyeQ4. With any luck, Model 3 will get the same sensor suite and EyeQ4 from the launch. It's also possible Model 3 will launch with the triple EyeQ3 setup, if EyeQ4 isn't quite ready in enough volume. Time will tell.

Discussed at about 21:20. Note the four launches for Trifocal EyeQ4 in 2017 and 2018 and Full Vision EyeQ4 in 2017 (probably late 2017, see below) with partial functionality. Could include revamped Model S/X as well as Model 3 afterwards. The comment about partial functionality lines up with Tesla's way of rolling out the hardware sensors first and then doing OTA software updates with new features that use that hardware.

View attachment 179742


Discussed at about 26:30. Note mention of implementation on network of three EyeQ3 chips late this year (matching Elon's end of year announcement). Then EyeQ4 is slated for late 2017/early 2018, nicely matching with Model 3.

View attachment 179743

Full presentation:
I think that meant 3 platforms using EyeQ3 chips, not 3 EyeQ3 chips. We know as of late 2015 there were "two programs" by one or more OEMs using 5 connected EyeQ3 chips.
 
As a software engineer, I have to disagree. While full autonomy might not fit your timeframe, having the hardware be capable of it means everything.

For one, all the algorithms can be tested (silently) across hundreds of thousands of vehicles. For two, incremental improvements can be constantly delivered. And lastly, when full autonomy software is ready (and approved by regulators) you won't need to buy another car to get it. A software upgrade is called a patch. A hardware upgrade is called a recall. You want the patch. :)

Until you've written the software, you can't be sure the hardware is indeed sufficient. But my main point is that if the software doesn't exist, you cannot claim the Model 3 has full autonomy. It's like saying you have a robot that can play chess because it can move the pieces; now you just need to write the part where it knows how to play chess (but it's just software).
 
  • Love
Reactions: deonb
Until you've written the software, you can't be sure the hardware is indeed sufficient. But my main point is that if the software doesn't exist, you cannot claim the Model 3 has full autonomy. It's like saying you have a robot that can play chess because it can move the pieces; now you just need to write the part where it knows how to play chess (but it's just software).
Tesla's not claiming the Model 3 has full autonomy. Nor would they make that claim for the S/X. "Hardware" means it can be upgraded later, to the extent the hardware supports. We'll likely find out the extent of the hardware capability later this year. But hardware support most certainly does not mean "nothing".

[Image: u7AgVzl.png]
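The "ship the hardware now, enable the software later" model this post describes can be sketched as features gated on both the installed sensors and a software flag that an OTA update flips. All names here are hypothetical, purely to illustrate the idea:

```python
# Feature gating sketch: the car ships with sensors installed but
# some features dark; an OTA update later flips the software flag.

INSTALLED_HARDWARE = {"front_camera", "side_cameras", "radar"}

# Flags as shipped; an over-the-air update would set "full_vision" True.
SOFTWARE_FLAGS = {"lane_keep": True, "full_vision": False}


def feature_enabled(name, required_hw):
    """A feature works only if its flag is on AND its sensors exist."""
    return SOFTWARE_FLAGS.get(name, False) and required_hw <= INSTALLED_HARDWARE


print(feature_enabled("lane_keep", {"front_camera"}))                    # True
print(feature_enabled("full_vision", {"front_camera", "side_cameras"}))  # False
```

The hardware check is the part no OTA update can fix, which is why shipping the full sensor suite up front matters.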
 
It's not clear to me why anyone would expect an autopilot to be making moral decisions. If there is some situation which can be determined with sufficient clarity that a human would make a certain moral decision, the autopilot can be programmed to do the same thing. The hard decisions are the ones for which there are a lot of unknowns, and I wouldn't classify those as moral decisions, for precisely that reason.

Thank you kindly.
 
From Amnon Shashua's (MobilEye CTO) CES 2016 presentation on Jan. 7, 2016 (embedded at bottom of post), I think it's entirely possible that EyeQ4 will make it into Model 3 from the beginning. It looks to me like Elon's end of year announcement will be about AP 2.0 with new hardware and software. I'm guessing it will be the "Full Vision" hardware suite with multiple cameras, out on Model S/X first with network of three EyeQ3 chips, then later next year with EyeQ4. With any luck, Model 3 will get the same sensor suite and EyeQ4 from the launch. It's also possible Model 3 will launch with the triple EyeQ3 setup, if EyeQ4 isn't quite ready in enough volume. Time will tell.

Discussed at about 21:20. Note the four launches for Trifocal EyeQ4 in 2017 and 2018 and Full Vision EyeQ4 in 2017 (probably late 2017, see below) with partial functionality. Could include revamped Model S/X as well as Model 3 afterwards. The comment about partial functionality lines up with Tesla's way of rolling out the hardware sensors first and then doing OTA software updates with new features that use that hardware.

View attachment 179742


Discussed at about 26:30. Note mention of implementation on network of three EyeQ3 chips late this year (matching Elon's end of year announcement). Then EyeQ4 is slated for late 2017/early 2018, nicely matching with Model 3.

View attachment 179743

Full presentation:
VERY informative. I think where you marked with the second red arrow is the biggest clue yet... "program launches from 2017 with partial functionality"... hmmm, what manufacturer has done this in the past other than Tesla? Seriously, is anyone else doing OTA updates that significantly upgrade hardware functionality?
 
Really nice post favo. On the bright side, it seems fairly clear that the Model 3 will at least ship with the upgraded sensor suite that's due to be announced end-of-year, which means (worst case) something close to full highway autonomy.

I wonder how much they'll push for the EyeQ4 for the Model 3, however, keeping in mind the incredible time-sensitivity of the massive launch ramp. If the EyeQ4 is planned for, but isn't ready or isn't shipping in the necessary quantities, then Tesla will have a production disaster on their hands.

For that reason, I suspect Tesla will be conservative with the Model 3, and choose the EyeQ3 platform. [I'd love to be wrong]

However, if they're feeling generous, they could still install the cameras and sensors necessary for Full Vision, only use the front cameras with EyeQ3, and then offer a "module upgrade" at a later date, swapping out the EyeQ3 hardware for EyeQ4 (which would make use of the additional cameras). They'd have to be feeling pretty generous to do the extra engineering work and eat the up-front sensor costs, but that's the sort of extra effort that causes your customers to become brand evangelists who make their own car commercials in their spare time.
Maybe Elon would give reservation holders the choice on whether or not they want to delay delivery another 6 mos and have full autonomous capability...
 
Maybe Elon would give reservation holders the choice on whether or not they want to delay delivery another 6 mos and have full autonomous capability...
Make no mistake: the hardware needed is going to be ready in plenty of time. It's the DNN models that might need to be updated, and that's an OTA update. It's already well known exactly what hardware (sensor-wise) is necessary.
 
It's not clear to me why anyone would expect an autopilot to be making moral decisions. If there is some situation which can be determined with sufficient clarity that a human would make a certain moral decision, the autopilot can be programmed to do the same thing. The hard decisions are the ones for which there are a lot of unknowns, and I wouldn't classify those as moral decisions, for precisely that reason.
I'd like to respond in case this post is in response to the philosophical point I raised earlier. The AI philosophers like to discuss what's going to be happening years ahead of where we are now, so the idea is that the autonomous software knows and sees a lot more than we could ever see. It can see and evaluate the number of passengers in each car, the velocity of those vehicles, the exact trajectory, the probability of them steering in certain paths based on movements that the autonomous software makes itself, etc. In a heavily simplified version, you can think of the way you can teach a computer to play chess. It can evaluate every outcome and use probabilistic modeling to determine the right thing to do, and it does it nearly instantaneously. People used to say "once a computer can play chess, we've reached full AI."

If we teach a model to do this same type of thing, it's not going to be by "programming" it in the traditional way. It'll be a model that learns from humans and improves based upon its own expanded visibility. At that point, we will be expecting it to make moral decisions well beyond what our very limited human minds can comprehend. And that's why it's in the field of philosophy at this point.
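The chess comparison in this post is essentially minimax search: enumerate the possible outcomes, assume the other side picks the worst one for you, and choose the branch with the best guaranteed value. A minimal sketch over a hand-built game tree (purely illustrative, not how any real driving stack works):

```python
# Minimax over a tiny hand-built tree: leaves are position values,
# inner nodes are lists of child positions. The maximizer picks the
# branch whose worst-case (minimizer's) reply is best.

def minimax(node, maximizing):
    if isinstance(node, (int, float)):      # leaf: a position's value
        return node
    values = [minimax(child, not maximizing) for child in node]
    return max(values) if maximizing else min(values)


# Depth-2 tree: we choose a branch, then the opponent replies.
# Branch A guarantees min(3, 5) = 3; branch B guarantees min(2, 9) = 2.
tree = [[3, 5], [2, 9]]
print(minimax(tree, maximizing=True))  # 3
```

Real chess engines prune most of this tree, and learned driving policies don't enumerate outcomes explicitly at all, which is part of why the moral-reasoning question ends up in philosophy rather than in code.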
 
I think that meant 3 platforms using EyeQ3 chips, not 3 EyeQ3 chips. We know as of late 2015 there were "two programs" by one or more OEMs using 5 connected EyeQ3 chips.
If you go to 28:00 in the presentation, he clearly says "there is one production which is coming sooner [than late 2017] which is running on three EyeQ3's..."

And he goes on to say that everything he's talking about is backed by actual production agreements with customers, not based on stuff in the lab or that is purely theoretical.