Mobileye EyeQ4 - going into the Model S soon? CEO video said Q3 2015 release

So I saw a video by the co-founder of Mobileye earlier this year which claimed that their next system - EyeQ4 - would be released in Q3 2015. I believe he also said they already have a car company on board for it - but he did not state who. One would think Tesla would make sense - they've been shipping cars with EyeQ3 for over a year now.

So - if there is a decent chance that in less than 6 months Elon will say "by the way, we now have EyeQ4 with much more CPU power and more cameras - and this enables us to link the car to GPS and have it negotiate intersections, exit freeways, etc." - then I am going to forget about the referral deadline tomorrow and the $2,500 California EV credit which will disappear on Jan 1, and just wait it out a few more months for a more capable batch of hardware.

I just completed a 600 mile round-trip in a rented Model S with Autopilot and it's amazing.

Oy - these cars are changing faster than cell phones.
 
The EyeQ4 has engineering samples coming out around now, with the first actual test samples (for Tesla to use internally) coming around the beginning/middle of next year. They'll be ready for production in early 2018. Before moving to EyeQ4, Tesla does have the option to add additional EyeQ3s (or even move to the nVidia Drive PX... the nVidia CEO showing up with a Model X Founder's Edition was interesting). There were suggestions one manufacturer was going to have a system using five EyeQ3s in the near future.

The question is, will Tesla continue to max out the current hardware (which still has room for improvement) and keep the Autopilot hardware suite from splintering to make development easier... or will they add incremental improvements, like an extra camera and EyeQ3 for "Autopilot 1.5"?

I do think Tesla's true "Autopilot 2.0" will feature a trifocal camera cluster up front, with radar and a possible scanning laser, running off the EyeQ4. I also expect up to five additional cameras--eight in total--around the car to supplement the ultrasonics and create a more complete environment model.
 
Fascinating question indeed. The video from last year with the co-founder of Mobileye showed some impressive deep learning going on for path prediction - he showed, for example, a snowy road with no visible lane markings - and yet the computer accurately predicted the path. He claimed, I believe, that this improved software in development can run on EyeQ3.

So perhaps the current hardware suite has a lot of room to "grow."

It does seem to me that Tesla has a big lead in semi-autonomous driving now that they have a fleet of customers out there driving the cars and building the map for them. By the time GM launches Super Cruise Tesla will have over a year's lead in building their maps. And I just read that Super Cruise won't even change lanes.

If you're right about Autopilot 2.0 and its gajillion cameras - I'll upgrade to a new Tesla for it. That will be the first time I've taken the depreciation hit and traded in a relatively new car - rather than keeping it a decade and 200,000 miles.
 
Well, the EyeQ4 most definitely supports eight cameras. Whether Tesla plans to use all of them is another question. I do believe a trifocal front camera cluster is highly likely. As you probably saw in the video, having a wide angle (for overall scene view) and a narrower angle (for small road debris detection) is crucial for the next step in safety. We already have a rear-view camera that could eventually be used in a future Autopilot (to help make lane changes safer from fast-approaching vehicles). Add two cameras to the front sides and two to the rear sides and you'd have a complete 360-degree view of the environment. I'd also love for one of the front cameras to be night vision or infrared (like the FLIR One), so the car could see heat signatures of pedestrians and animals at the side of the road.

The beauty is that in 2 years, for the same price as the current cameras, 4K with higher dynamic range will be commonplace.
 
Do we know for sure the Autopilot is not using any data from the rear-view camera? And yes, since higher-resolution sensors and processing chips will be common within 2 years, I do wonder if Tesla built any modularity/upgradeability into this system so that a new camera (or trifocal camera like you mentioned) could be retrofitted for a fee into the first Autopilot cars by swapping out the camera housing.

Also, do you know anything about how accurate (or inaccurate) sonar sensors are? Do they have a lot of false positives or - even worse - failures-to-detect - compared to video cameras and radar?

A Tesla which can pull out from my garage and drive itself through city traffic to my final destination (and park and charge itself automatically along the way) will be the last car I ever buy. I'm annoyed, actually, in a way, that I have to replace my meticulously maintained 200,000 + mile SUV because of these here new-fangled electric motor car thingies and their superior fuel economy.
 
Put a piece of tape over your rear camera and go for a drive. If it complains about "camera obstructed" or something like that, then you'll know it's using the rear camera. If it operates normally, then it's not.

That's not to say that they couldn't start using it in the future, of course.

In regard to the OP, Tesla's style so far has been to drop in new hardware without telling anyone, and then announce it after the fact. I would expect this to continue.
 
Assuming the car uses the rear camera, you don't know how they utilize it. There may not be a notification telling you that the camera is obstructed and AP will just work in a slightly degraded mode.

Think rain: when a water drop gets on the camera, the image looks like crap, so Tesla is smart enough [assuming they use it] to utilize the camera when it can and not utilize it when it can't, without throwing an error message.
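
For what it's worth, checking whether a frame is even usable is a standard computer vision trick. Here's a minimal sketch in Python with OpenCV, assuming direct access to the raw frame (nothing to do with Tesla's actual code); the variance-of-Laplacian test is a common blur heuristic, and the threshold is a made-up number that would need per-camera tuning:

Code:
import cv2

BLUR_THRESHOLD = 100.0  # made-up cutoff; would need tuning per camera

def frame_is_usable(frame):
    """Return False when the image is too smeared/obstructed to trust,
    e.g. a water drop or a piece of tape over the lens."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Variance of the Laplacian: a covered or smeared lens produces few
    # sharp edges, so the variance collapses toward zero.
    return cv2.Laplacian(gray, cv2.CV_64F).var() >= BLUR_THRESHOLD

A fusion layer could then quietly ignore the rear feed and lean on radar/ultrasonics instead, which matches the no-error-message behavior described above.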
 
Yeah, I'm not suggesting they're using the rear camera but if they had foresight with the first Autopilot, they could use it for (1) watching cars approaching before a lane change, (2) rear-collision preconditioning, and (3) verification of road lines to help teach Autopilot, since it's not affected by the same lighting or glare as the front camera.

We do know it's a high quality camera that's wired to the center console. So it's not a stretch to think it can or will eventually be used.
 
Yeah, I'm not suggesting they're using the rear camera but if they had foresight with the first Autopilot, they could use it for (1) watching cars approaching before a lane change, (2) rear-collision preconditioning, and (2) verification of road lines to help teach Autopilot, since it's not affected by the same lighting or glare as the front camera.

We do know it's a high quality camera that's wired to the center console. So it's not a stretch to think it can or will eventually be used.

I originally assumed that getting the upgraded sensor suite would absolutely require buying a new car, but lately I've had a sliver of hope that the next generation can be retrofit. I believe the EyeQ3 is in the windshield camera housing. Perhaps the housing can be swapped for one with the EyeQ4 and the trifocal cluster you mention, which I believe is supposed to include a front-facing fisheye. If the feed from the backup camera fisheye is available to be bussed to the windshield housing where it can be processed -- or if the rear camera can be upgraded with a module that has its own EyeQ3 -- then that would give almost 360-degree coverage. Here's hoping.
 
The triumph of hope over experience! Other than LTE, I don't think Tesla has offered a single hardware retrofit, except in occasional one-off, very expensive cases.

 
I think the 5 EyeQ3 setup is still for testing purposes only. I don't think they have enough data for cross traffic and intersection management. Autopilot 2.0 will follow the same path as Autopilot 1.0, IMO, which would mean that testing would be going on throughout, but a final commercial version would be ready by the time EyeQ4 is released commercially, plus 6 months to 1 year. While a 360 camera display could make a theoretical system with 5 EyeQ3s hardware compatible, the software hasn't even started to be developed. First they'll develop and refine the software on pilot cars.
 
Also, do you know anything about how accurate (or inaccurate) sonar sensors are? Do they have a lot of false positives or - even worse - failures-to-detect - compared to video cameras and radar?

Ultrasonic is vastly inferior to a video camera and radar. It's awesome for parking and small objects, but it's only reliable to about 16 feet. At highway speeds, 16 feet of warning isn't enough to prevent a collision. It'll help mitigate one, but that's it.
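
To put numbers on that - my own back-of-the-envelope math, nothing official:

Code:
# How much warning does 16 ft of ultrasonic range actually buy?
# Worst case: closing on a stationary obstacle at full speed.
FT_PER_MPH = 5280 / 3600  # ~1.47 ft/s per 1 mph

for mph in (25, 45, 65, 75):
    ft_per_s = mph * FT_PER_MPH
    warning_s = 16 / ft_per_s  # seconds from first detection to impact
    print(f"{mph} mph: {ft_per_s:.0f} ft/s -> {warning_s:.2f} s of warning")

At 65 mph that's roughly 0.17 seconds between first detection and impact, against a typical driver reaction time of 1 to 1.5 seconds. So at highway speed, ultrasonics can soften a crash at best.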

Based on my knowledge of the sensors, I believe a camera with supplemental radar would be most reliable. And a camera alone would still be better than ultrasonic. LIDAR creates the most accurate 3D environment map, but I like Mobileye's approach of using trained software with lower-cost video cameras instead.

I'm of the mindset that everything can be done with cameras. As I've stated elsewhere, humans drive with only 2 cameras in a sub-optimal position. Once you have eight high-resolution (4K) cameras watching the road simultaneously in 360 degrees, with a forward radar, a forward wide-angle infrared/night-vision/FLIR camera, and 360-degree ultrasonic sensors for redundancy, you just need better software and faster image processors. But eventually, that alone can get you to full autonomy.

- - - Updated - - -

I totally agree, and I mentioned #1 and the first #2 before. The second #2 is a good point too. It's easy to de-fisheye the lens.
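
To illustrate - here's a sketch using OpenCV's fisheye model, assuming a calibrated camera. The K and D values below are made-up placeholders, not anything from Tesla's actual hardware:

Code:
import numpy as np
import cv2

# Made-up intrinsics; real values come from cv2.fisheye.calibrate()
# run on checkerboard images of the actual camera.
K = np.array([[400.0,   0.0, 640.0],
              [  0.0, 400.0, 360.0],
              [  0.0,   0.0,   1.0]])          # camera matrix
D = np.array([[-0.05], [0.01], [0.0], [0.0]])  # fisheye distortion coeffs

def undistort(frame):
    """Remap a fisheye frame to a rectilinear image."""
    h, w = frame.shape[:2]
    new_K = cv2.fisheye.estimateNewCameraMatrixForUndistortRectify(
        K, D, (w, h), np.eye(3), balance=0.0)
    map1, map2 = cv2.fisheye.initUndistortRectifyMap(
        K, D, np.eye(3), new_K, (w, h), cv2.CV_16SC2)
    return cv2.remap(frame, map1, map2, interpolation=cv2.INTER_LINEAR)

Once remapped, the same lane-finding code that runs on the front camera could in principle run on the rear image.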

Hah... I fixed the second "(2)" to be "(3)." But yes, it might not even be necessary to remove the fisheye. The image processor will know the field of view and adjust its results accordingly. I've seen demos of the EyeQ3 analyzing a wide-angle lens. I think there's a lot to be learned by simultaneously watching the car in front AND behind you. In a way, the following car will instantly validate the decision of the Autopilot. If the rear car does something different, it can be flagged for review/learning. If it does the exact same thing as Autopilot, you've commandeered another brand's "expert driver" to enhance your data.
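
Purely hypothetical sketch of that flagging idea - the function name and the half-meter tolerance are mine, not anything Tesla has described:

Code:
def flag_for_review(ego_lateral_m, follower_lateral_m, tolerance_m=0.5):
    """Compare the lateral path Autopilot took through a spot against the
    path the rear camera saw the following car take through the same spot.
    Agreement quietly validates the decision; disagreement flags the clip
    for review/learning."""
    return abs(ego_lateral_m - follower_lateral_m) > tolerance_m

# e.g. Autopilot hugged the left line 0.8 m more than the car behind did:
print(flag_for_review(0.8, 0.0))  # True -> worth uploading this clip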
 