EVNow
Well-Known Member
I guess the best parts of Calcutta would be like New Delhi.

> I don't know about Calcutta, but if it's anything like Delhi, don't drive there.
There was an interview with the NHTSA director (IIRC) some time back on NPR. He basically said the department is waiting for someone in the industry to bring a car to them; they'd look at approving the cars on a case-by-case basis in the beginning. IOW, no detailed regulations/guidelines at first.

> NHTSA is a government agency; they would need to first define a Level 5 driverless car and set regulations surrounding such a thing before something like that would be allowed to cross state lines, etc. At least we'd hope, haha.
But Tesla's (and others') car-tech advancements follow the more practical approach of MobilEye, which is currently the industry-defining leader. I personally would not trust a "summon" delivery on anything less than Level 5. It's also irresponsible to claim this would work reliably in 2 years.
To clarify: in my opinion it will be hardware-capable at launch... but not necessarily ready to activate the feature...
As a software engineer, I have to disagree. While full autonomy might not fit your timeframe, having the hardware be capable of it means everything.

> But that means nothing. Autonomy is all about software. Software could easily take another 5 years or more.
Really nice post favo. On the bright side, it seems fairly clear that the Model 3 will at least ship with the upgraded sensor suite that's due to be announced end of year, which means (worst case) something close to full highway autonomy.

> From Amnon Shashua's (MobilEye CTO) CES 2016 presentation on Jan. 7, 2016 (embedded at bottom of post), I think it's entirely possible that EyeQ4 will make it into the Model 3 from the beginning. [...]
I think that meant 3 platforms using EyeQ3 chips, not 3 EyeQ3 chips. We know that as of late 2015 there were "two programs" by one or more OEMs using 5 connected EyeQ3 chips.

> From Amnon Shashua's (MobilEye CTO) CES 2016 presentation on Jan. 7, 2016 (embedded at bottom of post), I think it's entirely possible that EyeQ4 will make it into the Model 3 from the beginning. It looks to me like Elon's end-of-year announcement will be about AP 2.0 with new hardware and software. I'm guessing it will be the "Full Vision" hardware suite with multiple cameras, out on Model S/X first with a network of three EyeQ3 chips, then later next year with EyeQ4. With any luck, the Model 3 will get the same sensor suite and EyeQ4 from launch. It's also possible the Model 3 will launch with the triple-EyeQ3 setup, if EyeQ4 isn't quite ready in enough volume. Time will tell.
>
> Discussed at about 21:20. Note the four launches for Trifocal EyeQ4 in 2017 and 2018, and Full Vision EyeQ4 in 2017 (probably late 2017, see below) with partial functionality. This could include the revamped Model S/X as well as the Model 3 afterwards. The comment about partial functionality lines up with Tesla's way of rolling out the hardware sensors first and then doing OTA software updates with new features that use that hardware.
>
> View attachment 179742
>
> Discussed at about 26:30. Note the mention of an implementation on a network of three EyeQ3 chips late this year (matching Elon's end-of-year announcement). Then EyeQ4 is slated for late 2017/early 2018, nicely matching the Model 3.
>
> View attachment 179743
>
> Full presentation: [embedded video]
For one, all the algorithms can be tested (silently) across hundreds of thousands of vehicles. For two, incremental improvements can be constantly delivered. And lastly, when full autonomy software is ready (and approved by regulators) you won't need to buy another car to get it. A software upgrade is called a patch. A hardware upgrade is called a recall. You want the patch.
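The silent fleet-wide testing described above is often called "shadow mode": the candidate software computes what it *would* do, and only disagreements with the human driver get reported back for analysis. A minimal sketch of the idea in Python follows; the function names, data fields, and threshold are invented for illustration and are not Tesla's actual implementation:

```python
# Sketch of "shadow mode" evaluation: run the candidate policy on
# logged frames, compare its proposed steering to what the human
# actually did, and keep only the disagreements. All names and the
# 2-degree threshold are illustrative assumptions.

def shadow_evaluate(frames, candidate_policy, disagreement_threshold=2.0):
    """Compare candidate steering decisions to the human's, silently."""
    disagreements = []
    for frame in frames:
        proposed = candidate_policy(frame["sensors"])
        actual = frame["human_steering_deg"]
        if abs(proposed - actual) > disagreement_threshold:
            disagreements.append({
                "time": frame["time"],
                "proposed_deg": proposed,
                "actual_deg": actual,
            })
    # Only these interesting cases would be uploaded for engineers to review.
    return disagreements

# Toy usage: a trivial "policy" that always steers straight.
log = [
    {"time": 0.0, "sensors": None, "human_steering_deg": 0.5},
    {"time": 0.1, "sensors": None, "human_steering_deg": 15.0},  # human turned
]
flagged = shadow_evaluate(log, lambda sensors: 0.0)
# flagged contains only the 0.1 s frame, where the policy disagreed.
```

The point of the pattern is exactly the one made above: the software is exercised against real driving across the whole fleet without ever controlling the car.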
Tesla's not claiming the Model 3 has full autonomy. Nor would they make that claim for the S/X. "Hardware" means it can be upgraded later, to the extent the hardware supports. We'll likely find out the extent of the hardware capability later this year. But hardware support most certainly does not mean "nothing."

> Until you've written the software, you can't be sure that the hardware is indeed sufficient. However, my main point is that if the software doesn't exist, you cannot claim that the Model 3 has full autonomy. It's like saying you have a robot that can play chess because it can move the pieces; now you just need to write the part where it knows how to play chess (but "it's just software").
VERY informative. I think the spot you marked with the second red arrow is the biggest clue yet... "program launches from 2017 with partial functionality"... hmm, what manufacturer has done this in the past other than Tesla? Seriously, is anyone else doing OTA updates that significantly upgrade hardware functionality?

> From Amnon Shashua's (MobilEye CTO) CES 2016 presentation on Jan. 7, 2016 (embedded at bottom of post), I think it's entirely possible that EyeQ4 will make it into the Model 3 from the beginning. [...]
Maybe Elon would give reservation holders the choice of whether or not they want to delay delivery another 6 months and have full autonomous capability...

> Really nice post favo. On the bright side, it seems fairly clear that the Model 3 will at least ship with the upgraded sensor suite that's due to be announced end of year, which means (worst case) something close to full highway autonomy.
>
> I wonder how much they'll push for the EyeQ4 for the Model 3, however, keeping in mind the incredible time-sensitivity of the massive launch ramp. If the EyeQ4 is planned for but isn't ready, or isn't shipping in the necessary quantities, then Tesla will have a production disaster on their hands.
>
> For that reason, I suspect Tesla will be conservative with the Model 3 and choose the EyeQ3 platform. [I'd love to be wrong.]
>
> However, if they're feeling generous, they could still install the cameras and sensors necessary for Full Vision, only use the front cameras with EyeQ3, and then offer a "module upgrade" at a later date, swapping out the EyeQ3 hardware for EyeQ4 (which would make use of the additional cameras). They'd have to be feeling pretty generous to do the extra engineering work and eat the up-front sensor costs, but that's the sort of extra effort that turns customers into brand evangelists who make their own car commercials in their spare time.
Make no mistake, the hardware needed is going to be ready in plenty of time; it's the DNN models which might need to be updated, and that's an OTA update. It's already well known exactly what hardware (sensor-wise) is necessary.

> Maybe Elon would give reservation holders the choice of whether or not they want to delay delivery another 6 months and have full autonomous capability...
I'd like to respond in case this post is in response to the philosophical point I raised earlier. AI philosophers like to discuss what's going to happen years ahead of where we are now, so the idea is that the autonomous software knows and sees a lot more than we ever could. It can see and evaluate the number of passengers in each car, the velocity of those vehicles, their exact trajectories, the probability of them steering along certain paths based on movements the autonomous software makes itself, etc. In a heavily simplified version, you can think of the way you can teach a computer to play chess: it can evaluate every outcome and use probabilistic modeling to determine the right thing to do, and it does so nearly instantaneously. People used to say, "once a computer can play chess, we've reached full AI."

> It's not clear to me why anyone would expect an auto-pilot to be making moral decisions. If there is some situation which can be determined with sufficient clarity that a human would make a certain moral decision, the auto-pilot can be programmed to do the same thing. The hard decisions are the ones for which there are a lot of unknowns, and I wouldn't classify those as moral decisions, for precisely that reason.
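The "evaluate every outcome" idea behind classic chess engines is, at its core, just minimax search. Here's a heavily simplified sketch on a hand-built toy game tree (not a real chess board); the tree values are made up for illustration:

```python
# Minimal minimax: recursively score a game tree where we pick the
# best outcome on our turns and the opponent picks the worst for us
# on theirs. Leaves hold terminal evaluations from the maximizing
# player's point of view.

def minimax(node, maximizing):
    """Return the best achievable score from this node."""
    if isinstance(node, (int, float)):  # leaf: a terminal evaluation
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Toy tree: each of our two options leads to two opponent replies,
# and the opponent always chooses the outcome worst for us.
tree = [
    [3, 12],  # option A: opponent picks min(3, 12) = 3
    [2, 8],   # option B: opponent picks min(2, 8) = 2
]
best = minimax(tree, maximizing=True)  # option A is safer, score 3
```

Real engines add pruning and learned evaluation functions, but the exhaustive-lookahead principle the post describes is this one.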
If you go to 28:00 in the presentation, he clearly says "there is one production which is coming sooner [than late 2017] which is running on three EyeQ3's..."

> I think that meant 3 platforms using EyeQ3 chips, not 3 EyeQ3 chips. We know that as of late 2015 there were "two programs" by one or more OEMs using 5 connected EyeQ3 chips.