Autopilot 2.0 hardware rumored this year on EyeQ3, article link enclosed - 8 cameras!

Such a hardware suite would add $10k to the price of the car (a little more than half of that is cost), and it would hamper the performance and range of the car by 10%-20%. Not to mention that when launched it will be far from perfect and will require as much driver attention on Autopilot 2.0 as Autopilot 1.0 requires. It launches as a tool, not a take-over. Still awesome, though. I doubt the timeframe because Musk is always bad with this type of thing, but it is technically possible. The problem, though, is that we have yet to see Tesla research mules in the city taking intersection after intersection, so they still either have to build the 5-EyeQ3 prototype or, if it is built, collect the data required before moving on to the next step. While they have plenty of data from behind and in front, Tesla has little data from side angles.
Those seem like some mighty fine made up numbers! According to most research, the average unit cost of an EyeQ3 is $44. To get to 5 you need 4 more, so that's ~$175. It's going to be hard to get to $10k from there with 6 additional cameras, two of which are likely integrated into the front unit and share an assembly.

The performance and efficiency reduction seems even more unlikely. How do you get to a 20% performance and range reduction from a couple cameras and circuit boards?
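For what it's worth, here's that back-of-envelope math as a quick Python sketch. The $44 EyeQ3 figure is the one cited above; the camera-module and wiring numbers are my own rough assumptions, but even with generous padding the total comes nowhere near $10k:

```python
# Back-of-envelope estimate of the added hardware cost for a 5x EyeQ3,
# 8-camera suite. The $44 EyeQ3 unit cost is the figure cited above;
# the per-camera and wiring/assembly numbers are rough assumptions.

EYEQ3_UNIT_COST = 44       # USD, average unit cost cited above
EXTRA_EYEQ3 = 4            # the car already has one; the suite needs five
EXTRA_CAMERAS = 6          # two of the added front cameras likely share
                           # the existing windshield assembly
CAMERA_MODULE_COST = 25    # assumed automotive camera module cost (USD)
WIRING_AND_ASSEMBLY = 150  # assumed harness, brackets, and labor (USD)

added_cost = (EXTRA_EYEQ3 * EYEQ3_UNIT_COST
              + EXTRA_CAMERAS * CAMERA_MODULE_COST
              + WIRING_AND_ASSEMBLY)

print(f"Estimated added BOM cost: ${added_cost}")  # ~$476 with these inputs
```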
 
Those seem like some mighty fine made up numbers! . . .The performance and efficiency reduction seems even more unlikely. How do you get to a 20% performance and range reduction from a couple cameras and circuit boards?

He's wrong. Mobileye's executives claimed in this morning's Q3 earnings conference call that fleet operators have seen a 7-12% INCREASE in fuel efficiency because operators drive more smoothly and carefully with less acceleration and braking (which, they noted, results in a 100% ROI of the cost of the Mobileye hardware in less than a year of use). They said they do not have data yet from people like Tesla to report if Autopilot is also resulting in more efficient driving, but that they strongly suspect it does.
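As a sanity check on the ROI claim, here's a rough payback calculation. Every input below is an assumption of mine for illustration (typical long-haul fleet numbers), not a figure from the call:

```python
# Rough payback-period check on the fleet claim above (7-12% fuel savings,
# ROI within a year). All inputs are illustrative assumptions, not figures
# from the earnings call.

ANNUAL_MILES = 100_000   # assumed long-haul truck mileage per year
MPG = 6.5                # assumed heavy-truck fuel economy
FUEL_PRICE = 2.50        # assumed USD per gallon of diesel
SAVINGS_RATE = 0.07      # low end of the 7-12% range from the call
HARDWARE_COST = 1_000    # assumed installed cost of an aftermarket unit

annual_fuel_spend = ANNUAL_MILES / MPG * FUEL_PRICE
annual_savings = annual_fuel_spend * SAVINGS_RATE
payback_months = HARDWARE_COST / annual_savings * 12

print(f"Annual fuel spend: ${annual_fuel_spend:,.0f}")
print(f"Savings at 7%:     ${annual_savings:,.0f}/yr")
print(f"Payback period:    {payback_months:.1f} months")
```

Even at the low end of the savings range, the hardware pays for itself in a few months, which is consistent with the "less than a year" claim.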
 
The CEO described the “more sophisticated system” saying that one OEM is already implementing it in a vehicle:
“Today we are already preparing with one of the OEM, a first vehicle based on 8 cameras, one radar and ultrasonic around the vehicle. So this is much wider implementation of the first introduction of semi-autonomous driving and the trifocal is going to be here as we planned, but additional 4 cameras around the vehicle and one camera looking back. The system will run on 5 EyeQ3 chips and all of them will be connected.”
Aviram didn’t disclose which automaker is testing the system, but he hinted it could find its way into a commercial product as soon as this year and we know that Tesla has been testing a similar hardware suite.
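For those keeping track, here's how I read the suite Aviram describes, written out as a simple Python structure. The counts come straight from the quote (trifocal front, 4 surround, 1 rear, 1 radar, ultrasonics, 5 networked EyeQ3s); the exact placements are my own interpretation:

```python
# The 8-camera / 5-EyeQ3 suite as described in the quote above.
# Camera counts are from the quote; the placement labels are my own reading.

sensor_suite = {
    "cameras": {
        "front_trifocal": 3,   # the trifocal unit "as we planned"
        "surround": 4,         # "additional 4 cameras around the vehicle"
        "rear": 1,             # "one camera looking back"
    },
    "radar": 1,
    "ultrasonic": "around the vehicle",  # count not given in the quote
    "processors": {"EyeQ3": 5},          # "all of them will be connected"
}

total_cameras = sum(sensor_suite["cameras"].values())
print(f"{total_cameras} cameras on {sensor_suite['processors']['EyeQ3']} EyeQ3 chips")
```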


Oh boy... I have 72 hours to decide if I want to push the production date of my 70D out until March/April before I am locked into December. What would you do?

I've heard rumors of a major refresh in the spring.
 
Transcript of the call is here: http://seekingalpha.com/article/3639186-mobileyes-mbly-ceo-ziv-aviram-on-q3-2015-results-earnings-call-transcript


Snips that I found useful/interesting
Recently, we launched our first deep learning functions on Tesla auto pilot feature. These capabilities include semantic free-space which uses every pixel in the scene to help us understand where are the curves, barriers, [indiscernible] drills, moving objects and anything that is not part of the driving path.

Once we know the free-space, the big challenge is where to locate the vehicle in this free-space. We solve this with the holistic path prediction, which uses the context of the road to determine exactly where the car should go at all times.

Both of these new capabilities push the envelope of scene interpretation by leaps and bounds, and they are fundamental elements for semi-autonomous driving. As we said, these capabilities are already implemented and will be implemented in the future semi-autonomous launches, including in 2016 by two of our OEM customers.

The Tesla Autopilot feature is currently using a mono camera sensor for performing the most important understanding of the scene, the visual interpretation. Our multiple camera sensor configuration launches are planned to begin as early as next year.

We are on track with four launches of the front-sensing trifocal camera configuration to support highly autonomous driving. And we are on track with two launches of an eight camera 360 degree awareness system designed to support fully autonomous driving. And all of the above are planned for the 2016 to 2019 timeframe and will occur in parallel rather than one following the other.
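To make the "semantic free-space" idea above a bit more concrete, here's a toy sketch. In the real system a deep network produces the per-pixel drivable/not-drivable labels; the snippet just shows what you can do once you have such a mask (measure how far the drivable area extends in each image column and pick a rough target). It's purely illustrative, not Mobileye's algorithm:

```python
# Toy illustration of "semantic free-space": given a per-pixel mask of
# drivable area (which a deep network would produce in the real system),
# measure how far up from the bottom of the frame the drivable area
# extends in each image column.

import numpy as np

H, W = 6, 8
free = np.zeros((H, W), dtype=bool)
free[3:, 2:6] = True  # pretend the network marked a drivable corridor here

def free_space_depth(mask: np.ndarray) -> np.ndarray:
    """Number of consecutive free rows from the bottom of each column."""
    flipped = mask[::-1]                          # bottom row first
    depth = np.argmin(flipped, axis=0)            # index of first non-free row
    depth[flipped.all(axis=0)] = mask.shape[0]    # column is entirely free
    return depth

depth = free_space_depth(free)
print("free rows per column:", depth)             # [0 0 3 3 3 3 0 0]

# Crude stand-in for "where should the car go": aim at the column with the
# deepest free space (real holistic path prediction uses far more context).
print("target column:", int(np.argmax(depth)))    # 2
```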



 
Transcript of the call is here: http://seekingalpha.com/article/3639186-mobileyes-mbly-ceo-ziv-aviram-on-q3-2015-results-earnings-call-transcript


Tip for anyone trying to read the transcript - Seeking Alpha wants you to register to get past page 2. Just go to your URL bar and change the number of the page at the end of the link from "2" to "3" and then "4" etc. and the following pages will load just fine.
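If you want to generate all the page links at once instead of editing the URL bar each time, the same trick can be scripted by bumping the trailing page number. The example URL below is only a stand-in; paste whatever page URL is in your address bar (as the tip says, it ends in the page number):

```python
# Helper for the tip above: take a transcript page URL that ends in its
# page number and generate the URLs for the following pages by bumping
# that trailing number. The example URL is a stand-in, not the real one.

import re

def next_pages(current_url: str, last_page: int):
    """Yield copies of current_url with its trailing page number replaced."""
    match = re.search(r"(\d+)$", current_url)
    if not match:
        raise ValueError("URL does not end in a page number")
    prefix = current_url[: match.start()]
    for n in range(int(match.group(1)) + 1, last_page + 1):
        yield f"{prefix}{n}"

for url in next_pages("https://example.com/some-transcript/2", 4):
    print(url)  # .../some-transcript/3, .../some-transcript/4
```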
 
I guarantee that this rumor is true because I just ordered my car. My guess is that it will start being packed into new builds the day after my car is produced, which will be sometime in early December. You're welcome:) Seriously though, even if this was confirmed before my order finalizes, I wouldn't cancel. Autopilot 1.0 is more than adequate for me. Every day that goes by that I either drive my diesel tank or ride the commuter bus, I lose a little more of my soul. Can't wait to get my car!!!
You can delay your production! Talk to your DS.
 
Exactly. I'm a buy-expensive-car-and-drive-til-wheels-fall-off person - this rate of progress is difficult to stomach! My steed for the last 10 years has been a 2004 E55 AMG Benz super sedan - which is no longer very "super" compared with a Tesla, lol. But it at least was uber powerful and uber luxurious - and the technology was basically stable.

But at the rate Elon is innovating the 2024 Model S will have vertical takeoff and landing features and warp drive! But seriously I would not be surprised if the 2024 Model S can be had with no steering wheel and reclining rear lounge seats which do let you legally and safely go to sleep and wake up at your destination. Even two years ago I would have scoffed at the notion but now I'm not so sure.

I don't exactly buy ultra-luxurious cars, but I do drive them forever. My current car is a 1992 Buick I bought new.
 
The Tesla Autopilot feature is currently using a mono camera sensor for performing the most important understanding of the scene, the visual interpretation. Our multiple camera sensor configuration launches are planned to begin as early as next year.

We are on track with four launches of the front-sensing trifocal camera configuration to support highly autonomous driving. And we are on track with two launches of an eight camera 360 degree awareness system designed to support fully autonomous driving. And all of the above are planned for the 2016 to 2019 timeframe and will occur in parallel rather than one following the other.


The article from Electrek noted the CEO of Mobileye said:

"Today we are already preparing with one of the OEM, a first vehicle based on 8 cameras, one radar and ultrasonic around the vehicle. So this is much wider implementation of the first introduction of semi-autonomous driving and the trifocal is going to be here as we planned, but additional 4 cameras around the vehicle and one camera looking back. The system will run on 5 EyeQ3 chips and all of them will be connected."

And today, on the earnings call, he revealed (via Seeking Alpha transcript):

"We confirm that our technology is supporting to camera processing showcase by Nissan’s Intelligent Driving Prototype Car at the recent Tokyo Motor Show. It is managed by an eight camera system as shown in pictures released by the media and the processing is powered by multiple EyeQ3 chips."

So, I think that "this year" was the Nissan, but Tesla could have an upgrade in the next year:

"And we are on track with two launches of an eight camera 360-degree awareness system design to support fully autonomous driving. And all the above, our plan for the 2016 to 2019 timeframe as will occur in parallel rather than one following the other."

Note there is some flexibility on that timeline (2016-2019). But later, he continues about Tesla and "Autopilot":

"What we presented is Lane Keeping Assist system rather than auto pilot system. Autopilot system is going to be presented next year, where it's going to be 360 degrees coverage around the vehicle and is going to be multiple cameras with additional sensors. What we have today is just a mono camera looking forward. So, it’s a very limited input that we have on the road."

So, it does look like a new sensor suite could be coming "next year." He didn't specifically say Tesla, but it was in reference to Tesla's Autopilot. His use of "presented" also confuses things a bit. ("Presented" meaning delivered to consumers or presented to the OEM?) I don't think that means in the next couple months or "as soon as this year" for Tesla. I wouldn't hold off buying a car now, but I would be prepared for the usual incremental improvements. Of course, like Autopilot 1.0, we might not even see these functions activated until 2017.
 
And today, on the earnings call, he revealed (via Seeking Alpha transcript):

"We confirm that our technology is supporting to camera processing showcase by Nissan’s Intelligent Driving Prototype Car at the recent Tokyo Motor Show. It is managed by an eight camera system as shown in pictures released by the media and the processing is powered by multiple EyeQ3 chips."

So, I think that "this year" was the Nissan, but Tesla could have an upgrade in the next year:

Based on this I don't think it's Nissan.

http://europe.autonews.com/article/20151027/BLOG15/151029866/nissan-makes-autonomous-driving-look-easy


"The prototypes have been on the road for about three months and have already been driven accident-free for about 965km (600 miles). Iijima said Nissan will expand its fleet of autonomous test vehicles to test more cars in Japan and will begin tests Europe and the U.S. within two years."
 
Based on this I don't think it's Nissan.

http://europe.autonews.com/article/20151027/BLOG15/151029866/nissan-makes-autonomous-driving-look-easy


"The prototypes have been on the road for about three months and have already been driven accident-free for about 965km (600 miles). Iijima said Nissan will expand its fleet of autonomous test vehicles to test more cars in Japan and will begin tests Europe and the U.S. within two years."

Well, I was quoting the CEO in an earnings call. The article you linked to even says Mobileye was involved:

"Nissan did much of the work on the prototypes in-house with Nvidia and Advanced Scientific Concepts, both based in California, and Israel's Mobileye, as key contributors."

Are you suggesting the CEO meant another car would have eight cameras on multiple EyeQ3s this year?
 
Are you suggesting the CEO meant another car would have eight cameras on multiple EyeQ3s this year?

MarkS22 - personally I can only speculate. But in addition to the earnings call with Mobileye today, there was also a presentation in September at the Citi 2015 Global Technology Conference, reported on yesterday by electrek.co here: Supplier hints at next generation Autopilot hardware for Tesla as soon as this year | Electrek

But Aviram said that some customers were pushing the company for more applications:
“The appetite of the OEMs we work with is growing and the first application is going to be much wider than what we planned. We see acceleration of the development and needs of our customers to present much more sophisticated systems.”
The CEO described the “more sophisticated system” saying that one OEM is already implementing it in a vehicle:
“Today we are already preparing with one of the OEM, a first vehicle based on 8 cameras, one radar and ultrasonic around the vehicle. So this is much wider implementation of the first introduction of semi-autonomous driving and the trifocal is going to be here as we planned, but additional 4 cameras around the vehicle and one camera looking back. The system will run on 5 EyeQ3 chips and all of them will be connected.”
Aviram didn’t disclose which automaker is testing the system, but he said during a recent conference that Tesla is willing to push the envelope “faster and more aggressively than any other OEM”. He also hinted that the new system could find its way into a commercial product as soon as this year and we know that Tesla has been testing a similar hardware suite.
 
MarkS22 - personally I can only speculate. But in addition to the earnings call with Mobileye today, there was also a presentation in September at the Citi 2015 Global Technology Conference, reported on yesterday by electrek.co here: Supplier hints at next generation Autopilot hardware for Tesla as soon as this year | Electrek

Yeah, my point was the CEO's comment in September stated: "Today we are already preparing with one of the OEM, a first vehicle based on 8 cameras, one radar and ultrasonic around the vehicle ... The system will run on 5 EyeQ3 chips and all of them will be connected."

And then today, the CEO says: "We confirm that our technology is supporting the camera processing showcased by Nissan’s Intelligent Driving Prototype Car at the recent Tokyo Motor Show. It is managed by an eight camera system as shown in pictures released by the media and the processing is powered by multiple EyeQ3 chips."

Make no mistake, I think Tesla will eventually release an eight-camera system (with a trifocal front camera array, plus radar and ultrasonics for redundancy) running off multiple EyeQ3s and eventually the EyeQ4. I'm only suggesting, based on the comment today, that Nissan may be the one using the eight-camera setup this year.
 
Mobileye's executives claimed in this morning's Q3 earnings conference call that fleet operators have seen a 7-12% INCREASE in fuel efficiency because operators drive more smoothly and carefully with less acceleration and braking (which, they noted, results in a 100% ROI of the cost of the Mobileye hardware in less than a year of use). They said they do not have data yet from people like Tesla to report if Autopilot is also resulting in more efficient driving, but that they strongly suspect it does.

I wonder about this in Tesla's case because AP uses TACC, which holds speed constant. As the road goes up and down hills, TACC puts in considerably more power to hold speed going uphill, and regenerates to hold speed going down. An efficient human would let the speed of the car drop going uphill and rise going down, to smooth power consumption. Or maybe I'm overstating this because, living in the Virginia piedmont, I'm constantly driving up and down hills!
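Out of curiosity, I tried to put rough numbers on that intuition. The sketch below uses made-up mass, hill, speed, and regen-efficiency figures, and it only looks at battery round-trip losses (it ignores how aero and rolling losses change with speed), so treat it as illustrating the argument rather than settling it:

```python
# Crude comparison for the hill question above: hold speed exactly and
# regen on the descent, or let speed sag uphill and build back downhill
# so some of the hill's energy is buffered as kinetic energy instead of
# round-tripping through the battery. All numbers are assumptions, and
# speed-dependent aero/rolling losses are ignored.

M = 2200.0           # vehicle mass, kg (assumed)
G = 9.81             # m/s^2
HILL_HEIGHT = 50.0   # m of climb, followed by the same descent
ETA_RT = 0.70        # assumed battery round-trip efficiency for regen energy

V_SET = 29.0         # ~65 mph in m/s (the TACC set speed)
V_MIN = 26.8         # ~60 mph: how far we let speed sag on the climb

def wh(joules: float) -> float:
    return joules / 3600.0

pe = M * G * HILL_HEIGHT                        # energy to lift the car
ke_buffer = 0.5 * M * (V_SET**2 - V_MIN**2)     # energy we can park as speed

# Constant speed: the battery pays pe on the climb and gets pe back via
# regen on the descent, but only ETA_RT of that is usable again later.
cost_constant = pe * (1 - ETA_RT)

# Speed drift: the kinetic buffer covers part of pe directly, so only the
# remainder has to round-trip through the pack.
buffered = min(ke_buffer, pe)
cost_drift = (pe - buffered) * (1 - ETA_RT)

print(f"hill energy:           {wh(pe):.0f} Wh")
print(f"kinetic buffer:        {wh(buffered):.0f} Wh")
print(f"net cost, constant v:  {wh(cost_constant):.0f} Wh")
print(f"net cost, speed drift: {wh(cost_drift):.0f} Wh")
```

With these assumptions the difference works out to roughly 10 Wh per hill, though it would add up over a long, hilly drive.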
 
So, if I've ordered my MS and it hasn't entered the production queue yet, but I've pushed the delivery out to April of 2016, will it come with whatever the latest hardware is that's being installed on the cars at the time of production? I would think so . . .
 
So, if I've ordered my MS and it hasn't entered the production queue yet, but I've pushed the delivery out to April of 2016, will it come with whatever the latest hardware is that's being installed on the cars at the time of production? I would think so . . .

I was told the following: "Of course, if we were to release a new package leading up to your vehicle build date, you would be able to opt into the new features."

#ToPush or #NotToPush that is the question
 
chriSharek... Seems you are in a similar spot to me. I put the order in and will likely choose to have it delivered in the April/May 2016 timeframe, and I hope the rumors are correct that the additional cameras will be included (or available for purchase) at that time.
 
I wonder about this in Tesla's case because AP uses TACC, which holds speed constant. As the road goes up and down hills, TACC puts in considerably more power to hold speed going uphill, and regenerates to hold speed going down. An efficient human would let the speed of the car drop going uphill and rise going down, to smooth power consumption. Or maybe I'm overstating this because, living in the Virginia piedmont, I'm constantly driving up and down hills!

Sounds to me like there should be an option to either keep the same speed or optimize for best efficiency. Hills are pretty common in the west coast states. I-5, the main N-S artery, has two passes over 4,000 feet, and I've lived in a number of places from Los Angeles to Seattle, always on a hill (and I've never sought out someplace on a hill; it just happened).