Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autopilot 2.0 Not Imminent Based On Production Model X Design Studio [Speculation]

The lines disappear, but that doesn't mean it can't identify a stop light or stop sign.

I don't see how that helps the situation. The car needs to follow the lines. If it loses them, it will focus on something else, like the car ahead, and if that car changes lanes, you will too, because AP thinks that's the direction of the road. So how does a stop sign or stop light help? You need one more camera to map the intersection and then updated software to draw in the lines. Simple and done.

Also, what if there's a yield sign (not just a red light, stop sign, or green light)? Does the car keep going if its field of vision can't see far enough to know whether it should? Stop signs and red/yellow/green lights create more problems given the narrow field of vision, such that even with software upgrades, the current hardware will never properly navigate intersections, in my opinion. I know that people still use it through intersections, but the instructions are clear that it is not to be used on roads with intersections (not many of those!), and for very good reason.
 
I'm sorry, this thread has me lost. What is different about this one additional camera that will help the car through an intersection?

The simple answer is field of view.

A future version of Autopilot will likely use a "tri-focal" camera cluster up front. The cameras will be different focal lengths. So, you'll have one that is just looking straight ahead. The left and right edges of the frame are the edges of the road. Then, you'll have another camera with a wide fish-eye lens covering a 180 degree view of the whole forward scene. The wider view will have a lower resolution of the road and vehicles in front of the car, but it will see motion like cross-traffic approaching an intersection. The current camera has tunnel-vision and basically can't turn its head.

Together, a wide angle camera, a road-only camera, and perhaps one in the middle will be able to stitch the scene together. Everything in front of the car will be the highest clarity, able to see a small piece of debris or pothole in the road clearly. The wider cameras will be looking for larger objects, noting their trajectory and also survey for traffic signals and road signs that could be off to the side.
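To put rough numbers on the field-of-view idea (the figures here are purely illustrative assumptions, not actual Autopilot specs), here's a sketch of how a perception stack might pick the narrowest, highest-detail camera that still covers an object at a given bearing:

```python
# Hypothetical horizontal fields of view (degrees) for a tri-focal cluster.
# These numbers are illustrative assumptions, not real camera specs.
CAMERAS = {
    "narrow": 35.0,    # long focal length: high detail straight ahead
    "main": 50.0,      # standard forward view
    "fisheye": 150.0,  # wide view for cross-traffic and signals
}

def best_camera(bearing_deg):
    """Return the narrowest (highest-detail) camera whose FOV still
    covers an object at `bearing_deg` degrees off the car's heading
    (0 = dead ahead), or None if no forward camera sees it."""
    candidates = [(fov, name) for name, fov in CAMERAS.items()
                  if abs(bearing_deg) <= fov / 2]
    return min(candidates)[1] if candidates else None

print(best_camera(5))    # "narrow" -- dead ahead, use the sharpest lens
print(best_camera(30))   # "fisheye" -- only the wide lens covers it
print(best_camera(100))  # None -- outside every forward camera's view
```

Cross-traffic at an intersection typically approaches at 60-90 degrees off-axis, which is exactly the region only the wide lens would cover -- hence the "tunnel vision" complaint about a single narrow camera.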

To be honest, I'm surprised we're not hearing more about two cameras, one at each top corner of the windshield. That much separation would allow the car to "see behind" more objects and counteract glare.
 

Multiple focal lengths or stereo or both?

Sometimes, though, a driver just has to "know" that things all end up aligning (or nearly aligning) on the other side and grin and bear it through the middle, for lack of a better phrase. Especially on the crest of a hill. Is a computer ready for fuzziness like that?

My first priority is still 360 view.
 
Multiple focal lengths or stereo or both?

Sometimes, though, a driver just has to "know" that things all end up aligning (or nearly aligning) on the other side and grin and bear it through the middle, for lack of a better phrase. Especially on the crest of a hill. Is a computer ready for fuzziness like that?

My first priority is still 360 view.

Probably both. The side effect of having multiple focal lengths offset by a few inches is that you're also getting some additional depth information.
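The depth-from-offset point can be made concrete with the standard pinhole stereo relation, depth = focal length × baseline / disparity (the focal length and baseline below are made-up illustrative values):

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Pinhole stereo triangulation: depth = f * B / d.
    A larger baseline or a larger pixel disparity means better
    depth resolution; zero disparity means the object is at infinity."""
    if disparity_px <= 0:
        return float("inf")
    return focal_px * baseline_m / disparity_px

# Cameras offset by a few inches (~0.1 m), 1000 px focal length:
print(stereo_depth_m(1000, 0.1, 5))   # 20.0 m away
print(stereo_depth_m(1000, 0.1, 50))  # 2.0 m away
```

Note that halving the camera separation halves the disparity for the same object, which is why a wider physical spread between cameras buys better long-range depth.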

In terms of a steep hill, the software would likely have the car slow down with enough time to slam the brakes if there was something unexpected on the other side. In most cases, the speed limit in place is already designed to do that. Of course, the computer will have the benefit of (1) detailed maps (2) perfect attention and (3) perfect reaction times.
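That "slow down enough to stop for the unseen" rule is just the classic stopping-distance formula. A minimal sketch (the 0.1 s reaction time and 7 m/s² braking deceleration are assumptions for illustration, not measured figures):

```python
import math

def stopping_distance_m(speed_mps, reaction_s=0.1, decel_mps2=7.0):
    """Reaction distance plus braking distance: v*t + v^2 / (2a)."""
    return speed_mps * reaction_s + speed_mps**2 / (2 * decel_mps2)

def safe_crest_speed_mps(sight_m, reaction_s=0.1, decel_mps2=7.0):
    """Largest speed at which the car can still stop within `sight_m`
    metres of visible road: solve v*t + v^2/(2a) = d for v."""
    t, a = reaction_s, decel_mps2
    return a * (-t + math.sqrt(t * t + 2 * sight_m / a))

# ~97 km/h (27 m/s) needs roughly 55 m of visible road past the crest:
print(round(stopping_distance_m(27.0), 1))  # 54.8
# And the inverse: with ~55 m visible, ~27 m/s is the ceiling.
print(round(safe_crest_speed_mps(54.8), 1))  # 27.0
```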

Do you mean 360 degree view, like the Infiniti overhead view, or for autonomous driving?
 
So, gathering from your views...
1. The current AP (version 1.0) can identify green/yellow/red lights and then cross an intersection, but it cannot decide to turn left or right at an intersection, and the same goes for stop/yield signs. Current AP also cannot auto-steer along a navigation route and cannot do auto valet parking. These limitations exist because it lacks the cameras.
2. Semi-autonomous driving (AP version 2.0), with 8 cameras around the car, will resolve all the issues of version 1.0, but the driver must still be in the seat with the steering wheel and able to engage as needed. AP 2.0 hardware will be installed on new S/X sometime in 2016, but the software will take longer to implement, as we learned with version 1.0.
3. Full autonomy will not come until 2018 or even later because of regulations. At that point, fully autonomous cars will not have a steering wheel...
 
Try closing one eye and driving through an intersection.

Your brain isn't able to calculate depth information from visual flow, though, whereas the single-camera system can above some speed (presumably somewhere near 18 MPH).

Believe it or not, this tech is pretty accurate with one camera while moving in a known direction at a known speed.
 
I'm looking for "overhead view". I have to park very tight in a garage with a side entry (L-shaped driveway) and a very tight turning apron. I'm about 3-4" from the wall when pulling in with a small car (330ci).

Of course someday it may help auto pilot, but that's not the part I'm worried about short term.
 
I think Tesla is wise enough and brave enough to break away from the Mobileye crowd and do something on their own (with Nvidia's help) in order to leap forward and stand out from the crowd. If that's true, we will see AP1.x (maybe with a trifocal upgrade) on Model S and Model X through 2017.

If I remember correctly, Mobileye's EyeQ4 is due for production in 2018, so the time to make this decision is now.
 
Your brain isn't able to calculate depth information from visual flow, though, whereas the single-camera system can above some speed (presumably somewhere near 18 MPH).

Believe it or not, this tech is pretty accurate with one camera while moving in a known direction at a known speed.

Absolutely. What most people don't realize is that forward motion does create a "stereo separation." At 60mph, with 30 frames per second, the separation of each frame is just under 3 feet. It's "simple" math to calculate distance based on the changes. Objects further away (think of the moon on the horizon) grow at a slower rate than close objects (like a stopped car). There's already software that can track handheld camera motion in 3D space.
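The per-frame arithmetic above checks out, and that per-frame travel is exactly the "baseline" motion-parallax depth estimation works from (the speed and frame rate below just reproduce the numbers in the post):

```python
MPH_TO_FTPS = 5280 / 3600  # mph -> feet per second

def frame_baseline_ft(speed_mph, fps=30):
    """How far the car (and its camera) travels between consecutive
    frames. This acts like the baseline of a stereo pair, except both
    'eyes' look from nearly the same direction -- which is why depth
    from motion only works above some minimum speed, and works poorly
    for objects moving with you."""
    return speed_mph * MPH_TO_FTPS / fps

print(round(frame_baseline_ft(60), 2))  # 2.93 ft -- "just under 3 feet"
print(round(frame_baseline_ft(18), 2))  # 0.88 ft -- much less to work with
```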
 
I think Tesla is wise enough and brave enough to break away from the Mobileye crowd and do something on their own (with Nvidia's help) in order to leap forward and stand out from the crowd. If that's true, we will see AP1.x (maybe with a trifocal upgrade) on Model S and Model X through 2017.

If I remember correctly, Mobileye's EyeQ4 is due for production in 2018, so the time to make this decision is now.

The Mobileye EyeQ and Nvidia's Drive PX are hardware processors. It's still Tesla's software. The software and infrastructure will be the most important part here. There's no reason why Tesla couldn't use two EyeQ4s or even one EyeQ and one Drive PX to play up specific strengths of the hardware. They might even help develop some sort of custom coprocessor eventually. But (in my opinion) having Mobileye and nVidia push these chips faster is in Tesla's best interest.

I can see them using custom software and four EyeQs to, for example, process a 4K video image (as opposed to 1080p) by cutting the image into four quadrants. Things like that. There's plenty of ways to break out with software while still using another company's hardware.
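The quadrant idea is simple tiling. Here's a toy sketch of carving a 4K frame into four 1080p-sized tiles, one per chip (a pure-Python list-of-rows stands in for an image buffer):

```python
def split_quadrants(frame):
    """Split a list-of-rows image into four equal tiles:
    top-left, top-right, bottom-left, bottom-right."""
    h, w = len(frame), len(frame[0])
    h2, w2 = h // 2, w // 2
    return [
        [row[:w2] for row in frame[:h2]],  # top-left
        [row[w2:] for row in frame[:h2]],  # top-right
        [row[:w2] for row in frame[h2:]],  # bottom-left
        [row[w2:] for row in frame[h2:]],  # bottom-right
    ]

# A stand-in 4K frame: 2160 rows x 3840 pixels.
frame_4k = [[0] * 3840 for _ in range(2160)]
quads = split_quadrants(frame_4k)
print(len(quads), len(quads[0]), len(quads[0][0]))  # 4 1080 1920
```

In practice you'd overlap the tiles by a margin so an object straddling a seam isn't missed -- a detail omitted here for brevity.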
 
Absolutely. What most people don't realize is that forward motion does create a "stereo separation." At 60mph, with 30 frames per second, the separation of each frame is just under 3 feet. It's "simple" math to calculate distance based on the changes. Objects further away (think of the moon on the horizon) grow at a slower rate than close objects (like a stopped car). There's already software that can track handheld camera motion in 3D space.

Still, the scope of a single camera is too narrow for intersections. But even if it weren't, there's no redundancy for error correction (multiple frames from a single camera allow potential errors to simply be repeated, rather than corrected by a second camera). Hence, you'll never see a single camera approved for use in intersections, in my opinion, especially when adding another camera is inexpensive.
 
Still, the scope of a single camera is too narrow for intersections. But even if it weren't, there's no redundancy for error correction (multiple frames from a single camera allow potential errors to simply be repeated, rather than corrected by a second camera). Hence, you'll never see a single camera approved for use in intersections, in my opinion, especially when adding another camera is inexpensive.

For intersections I really hope they have redundancy implemented with radar sensors.

I know they are going with a mostly optical sensor suite in the present direction they are heading but dust, snow, rain, fog can all interfere with optical sensors and I really hope they have overlapping and redundant sensor coverage with radar and ultrasonic sensors around the car. The Mercedes sensor array uses sideways radar for intersections and cross traffic alerts.
 
Yes, EyeQ (and Drive PX) do allow for radar redundancy. I believe that it would always be used with 360 ultrasonics as well. Don't forget video cameras can eventually include heat (like FLIR) and infrared.

And yes, one of the great things about three forward cameras is redundancy. BUT that's also why I'd prefer to see them spread out more: with all three in a small cluster, snow, mud, or even a large leaf can knock them all out simultaneously.
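The redundancy argument can be sketched as a simple cross-check: independent sensors (a second camera, radar) either agree within tolerance or flag a fault -- something repeated frames from one camera can't give you. A toy version (the 1.0 m tolerance is an arbitrary assumption):

```python
def fused_range_m(readings, max_spread_m=1.0):
    """Cross-check independent range readings (e.g. camera + radar).
    If they agree within `max_spread_m`, return their average;
    otherwise return None to signal a sensor fault or disagreement."""
    if max(readings) - min(readings) > max_spread_m:
        return None  # one sensor is lying -- fall back, alert the driver
    return sum(readings) / len(readings)

print(fused_range_m([24.8, 25.1, 25.0]))  # ~24.97 m, trusted
print(fused_range_m([24.8, 25.1, 40.0]))  # None -- e.g. one camera blinded by glare
```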
 
I apologize if this has been covered on this thread (I'm catching up). But it appears the Model X will have the three camera system. It is in the picture in the Model X section: Model X | Tesla Motors

[Attached image: Model X windshield photo]


I'd imagine the Model X ships with the three-camera system, and the 8-camera system will be available with 5 EyeQ3 chips in 2016 or 2017. I'd imagine Elon did not want to announce this on September 29, 2015, when Autopilot 1.0 was not in production (a year after its introduction).

Moving Closer to Automated Driving, Mobileye Unveils EyeQ4® System-on-Chip with its First Design Win for 2018 - Mobileye

“Today we are already preparing with one of the OEM, a first vehicle based on 8 cameras, one radar and ultrasonic around the vehicle. So this is much wider implementation of the first introduction of semi-autonomous driving and the trifocal is going to be here as we planned, but additional 4 cameras around the vehicle and one camera looking back. The system will run on 5 EyeQ3 chips and all of them will be connected.”

Supplier hints at next generation Autopilot hardware for Tesla as soon as this year | Electrek
 
The Founders series cars don't have that; also, it looks like 2 cameras and the rain sensor for the wipers. And here ya go: Stereoscopic cameras on the front?
Well, it is most assuredly more than one camera as evidenced by the two "landing pads" which are used specifically for cameras.

Based on this image it would appear that Model X will have more than one camera on the windshield. Very exciting and I really hope I can get official confirmation as I'll be throwing down some cash.

RLS (Rain Light Sensor) does not require this and is just a puck (circle) attached flush to the windscreen. https://www.google.com/search?q=rai...jLnJAhUD4D4KHZBQAOMQ_AUICCgC&biw=1440&bih=718
 
Well, it is most assuredly more than one camera as evidenced by the two "landing pads" which are used specifically for cameras.

Which statement are you most assuredly contradicting? The one where I said founders series don't have that:
Stereoscopic cameras on the front? <-- pic of the founders car

Or are you most assuredly contradicting that there is more than 1 camera, when I said there is more than 1 camera?
it looks like 2 cameras and the rain sensor for the wipers



RLS (Rain Light Sensor) does not require this and is just a puck (circle) attached flush to the windscreen. https://www.google.com/search?q=rai...jLnJAhUD4D4KHZBQAOMQ_AUICCgC&biw=1440&bih=718

Yes, that's why I said:
it looks like 2 cameras and the rain sensor for the wipers