Welcome to Tesla Motors Club

What are the chances of Autopilot 3.0?

There is no way they are changing AP2 for at least 2 years. M3 just rolled out with this sensor suite. NOTE: I don't view switching the ECU as changing AP2. I view AP2 as the sensor suite, not what processes/interprets that sensor information.

Elon is terrible with timelines, but I do recall him mentioning hardware improvements every 12-18 months on Twitter when asked about Autopilot. We also know from wiring diagrams and photos that Model S facelift/Model X was slated for an "AP 1.5" with two forward cameras and rear radars, which might have come around 12-18 months after AP1...

So changing AP2 after 12-18 months certainly sounds possible. AP2 would likely continue as a subset of an upgraded system that might add more sensors of some kind.

As for AP2 upgrades, I will point out there is more than just the ECU (which I agree can be upgraded on AP2 cars) and the sensor suite, there are also the adjoining systems (e.g. the much rumoured augmented reality HUD). These might count as upgrades to the AP setup that are not retrofittable.
 
For vision and radar, yes it is - basically all cross-traffic detection and anything that can't be picked up or identified sufficiently by the ultrasonics, which are limited.
You mean cross traffic that's within 3 feet of the car and detected by ultrasonics with an 8 meter range?

I'm not sure why this is so hard to explain. Your image does nothing to counter my points about visual blind spots in the nose, as well as the blind spots created by blocked B-pillar cameras.

AP2 has visual blind spots in certain close-by and urban scenarios, at least in the following areas (as also shown in my previous message). These could be improved in future iterations of the sensor suite, by adding e.g. more cameras and/or cross-traffic radars in the nose and corners of the car...

1) Left and right side of the nose, at bumper level. The B-pillar front-facing cameras cannot see low enough.

2) Front of the nose, at bumper level. The windshield cameras do not have the FoV to see what's immediately in front of the car there (and couldn't see through the bonnet anyway).

3) Front vision is blocked towards the sides when the windshield or B-pillar FoV is blocked. In the below scenario AP2 is blind towards the sides. A nose camera would help with that part.
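A quick bit of geometry shows why points 1 and 2 are inherent to high-mounted cameras. The camera height and FoV angle below are illustrative guesses of mine, not Tesla's numbers:

```python
import math

def ground_blind_distance_m(camera_height_m: float, fov_lower_edge_deg: float) -> float:
    """Horizontal distance from the camera at which the ground first
    becomes visible, given the downward angle of the FoV's lower edge.
    Everything between the car and that point is invisible to the camera."""
    return camera_height_m / math.tan(math.radians(fov_lower_edge_deg))

# Illustrative guesses: windshield camera ~1.4 m up, lower FoV edge ~25 degrees down.
print(f"ground is blind out to ~{ground_blind_distance_m(1.4, 25.0):.1f} m ahead")
```

Whatever the real mounting numbers are, the blind wedge in front of the bumper never goes to zero for a camera behind the windshield.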

In all of these cases, the only redundancy is the ultrasonics, which are useless for cross-traffic detection due to their short range and low speed limit, and which have various downsides that vision and radars do not.

I have marked in red the areas where a Tesla AP2 is vision-blind in this scenario - it cannot see there outside of the ultrasonics. The ultrasonics might pick up the strollers, but not any faster cross-traffic, and most importantly there is no visual contingency for AP2 to act on if the B-pillar cameras are blinded by obstacles...

[Attachment 238241: AP2 nose blind spots diagram]

1) Cameras don't need to; the area is covered by ultrasonics, and anything tall enough will also be caught by a camera (such as a car, a person, or even a child who is tall enough).

2) See #1. Bumper level is covered by ultrasonics.

3) Yes, as would human vision. You can't see when you cover your eyes.

Ultrasonics would not be needed for cross traffic detection for actual vehicle traffic. It would all be caught on camera. If the camera is obscured just as if human vision is obscured, then it's probably not safe to make a turn. No worries.
 
Looks like it worked and didn't kill the kid. Plus the new ones are upgraded.

Wait, did you miss it hitting the doll and the plastic can, and only stopping when they were placed at the radar's height?

Are we gonna ignore the numerous auto-park accidents that rely solely on ultrasonics? Especially those involving poles?




Or have you forgotten this video?

Side objects and objects below hood height WILL be run over. So if your Level 5 Tesla is parked and someone later puts a bike on the ground in front of it, when you summon it to pick you up from work, it WILL run over the bike and damage your car.

 
Just to be clear, @JeffK, I agree the AP2 suite is sufficient for self-driving in certain optimal conditions at least, so that is not what I'm arguing here. I am discussing potential areas for improvement where I think Tesla might consider adding new sensors in future iterations of AP2.

You don't think the AP2 suite is the final revision, do you?

You mean cross traffic that's within 3 feet of the car and detected by ultrasonics with an 8 meter range?

I mean cross-traffic that is too fast for, or outside the range of, the ultrasonics. Many cars nowadays have cross-traffic alert systems that are based on corner radars and can see where humans cannot. Ultrasonics are not just limited by range; they are also limited by the relative speed of the other object compared to the car (which is why, for AP1/current AP2 blind spot monitoring, some moving objects are invisible even when they are within range).
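To put rough numbers on that range limitation (a back-of-the-envelope sketch; the 8 m figure is from this thread, the speeds are my own illustrative assumptions, not Tesla specs):

```python
# How much warning does an 8 m ultrasonic range give against cross-traffic?
ULTRASONIC_RANGE_M = 8.0  # max range claimed in this thread

def warning_time_s(crossing_speed_kmh: float) -> float:
    """Seconds between an object first entering ultrasonic range and
    reaching the car, assuming it heads straight for the sensor."""
    speed_ms = crossing_speed_kmh / 3.6
    return ULTRASONIC_RANGE_M / speed_ms

for speed in (10, 30, 50):
    print(f"{speed:>2} km/h cross-traffic: {warning_time_s(speed):.2f} s of warning")
```

Even at a modest 50 km/h that works out to roughly half a second of warning, versus several seconds for a corner radar with tens of metres of range.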

1) Cameras don't need to; the area is covered by ultrasonics, and anything tall enough will also be caught by a camera (such as a car, a person, or even a child who is tall enough).

Of course, but these are still blind spots for the visual system, so they are areas for potential improvement - that is my point. For example, if the ultrasonics are "seeing" something, visual confirmation could help an FSD system.

3) Yes, as would human vision. You can't see when you cover your eyes.

Yes, but humans can get out of the car and go check when need be. They can also move their head etc. But more importantly: technology can become safer and better than humans. Perhaps for liability reasons it also needs to. Tesla adding more sensors to their suite seems perfectly plausible from this perspective.

Ultrasonics would not be needed for cross traffic detection for actual vehicle traffic. It would all be caught on camera. If the camera is obscured just as if human vision is obscured, then it's probably not safe to make a turn. No worries.

I guess my thinking is this: what would an FSD system do in the below scenario if the ultrasonics are showing something but it can't see what? Or if the ultrasonics are showing nothing but the system also realizes the cameras are blocked... what would it do? In the latter scenario it would probably roll out real slow, but that is hardly optimal...

The tight space out of the carpark through the house means that visibility from the cameras is terrible. Yet if it had a nose camera, things would suddenly become much easier. Front corner radars that see towards the sides would help too.

[Image: ap2_nose_blindspots.jpg (attachment 238241)]
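As a thought experiment, the fallback logic I'm wondering about could be as crude as this toy policy (purely hypothetical on my part, nothing to do with Tesla's actual software):

```python
def fallback_action(ultrasonic_hit: bool, cameras_have_clear_view: bool) -> str:
    """Toy decision policy for the blocked-view scenario above.
    Hypothetical sketch -- not Tesla's actual logic."""
    if ultrasonic_hit and not cameras_have_clear_view:
        # Something is close by, but vision can't identify it: stop.
        return "stop"
    if not cameras_have_clear_view:
        # Nothing close by, but cross-traffic is unknown: creep forward.
        return "creep"
    # Vision has a clear view of the surroundings: drive normally.
    return "proceed"

# The "roll out real slow" case from the scenario above:
print(fallback_action(ultrasonic_hit=False, cameras_have_clear_view=False))
```

A nose camera or corner radar would let the system land in the "proceed" branch far more often instead of creeping blind.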
 
Wait, did you miss it hitting the doll and the plastic can, and only stopping when they were placed at the radar's height?

Are we gonna ignore the numerous auto-park accidents that rely solely on ultrasonics? Especially those involving poles?




Or have you forgotten this video?

Side objects and objects below hood height WILL be run over. So if your Level 5 Tesla is parked and someone later puts a bike on the ground in front of it, when you summon it to pick you up from work, it WILL run over the bike and damage your car.

Everything you've posted about the ultrasonics was from AP1 vehicles. That said, a human might see the bike, but even a human will miss the duffel bag.

Remember we are looking for it to eventually be at least twice as good as a human on average, not foolproof.
 
Everything you've posted about the ultrasonics was from AP1 vehicles. That said, a human might see the bike, but even a human will miss the duffel bag.

Remember we are looking for it to eventually be at least twice as good as a human on average, not foolproof.

I think you are missing the point. This thread is about a potential AP3.

Nobody that I see is arguing AP2 can't become twice as good as a human (at least in optimal conditions). We are discussing what might be in an AP3, and why. No?

Why wouldn't Tesla aim for an AP3 that is, say, 10 times better than a human in all conditions?
 
Everything you've posted about the ultrasonics was from AP1 vehicles. That said, a human might see the bike, but even a human will miss the duffel bag.

Remember we are looking for it to eventually be at least twice as good as a human on average, not foolproof.

No, they won't. A human will walk towards their car to get into it and will see any obstacles in their drive path.

A computer can't walk towards the car to survey the environment before driving it out.

A car with the AP2 sensor suite would therefore run into any and all obstacles in front of it below its front cameras' FoV. Lastly, the AP2 ultrasonics don't eliminate the blind spots; they only increase the detection distance.

That isn't twice as good as a human; it's not even half as good as a human.
 
What everyone seems to miss when it comes to redundancy is that redundancy is NOT a question of camera sensors failing, but that AP2 is a camera (vision) only system. Camera (vision) only.

The ultrasonics are useless and the radar has a FoV of about 30°, eliminating it from 95% of all driving situations.

Meaning the system is camera-only, which means the computer vision has to be 100% accurate at all times. Even a split second of inaccuracy = a fatal accident.

There's no backup, which is why others in the industry oppose Tesla's views.

This is why people use lidar, because lidar can back up camera and camera can backup lidar.
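For what it's worth, the FoV figure is easy to sanity-check. Taking the ~30° horizontal FoV claimed in this thread at face value (it's a forum number, not a spec sheet):

```python
RADAR_FOV_DEG = 30.0  # approximate horizontal FoV claimed in this thread

def in_radar_fov(bearing_deg: float) -> bool:
    """True if an object at the given bearing (0 = straight ahead)
    falls inside the single forward radar's cone."""
    b = (bearing_deg + 180.0) % 360.0 - 180.0  # normalize to [-180, 180)
    return abs(b) <= RADAR_FOV_DEG / 2

# A single forward radar covers only a thin slice of the bearings around the car:
print(f"covered: {RADAR_FOV_DEG / 360:.0%} of all bearings")
```

30° out of 360° is about 8% of the bearings around the car, which is at least in the same ballpark as the "95% of situations" figure.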
 
Meaning the system is camera-only, which means the computer vision has to be 100% accurate at all times. Even a split second of inaccuracy = a fatal accident.

There's no backup, which is why others in the industry oppose Tesla's views.

This is why people use lidar, because lidar can back up camera and camera can backup lidar.

I'm leaving this out intentionally in this thread, though. I completely agree with you that up-to-360-degree camera + radar + lidar + ultrasonic fusion is the way of the rest of the industry. Personally I was thinking more along the lines of: if Tesla sticks to their vision-mostly system, what kind of changes might we see in an AP3?

I think more cameras are the obvious answer to that. The nose being one specific area. Visual redundancy towards the rear being another. Driver monitoring the third. I have also not counted out rear corner radars or more front radars.

Lidar... I guess that would take a serious about-face by Tesla. I have a hard time seeing it in AP3. AP4 might be anyone's guess, though...

We shall see how Tesla's offering continues to compare to the rest of the industry.

For some odd reason @JeffK seems to suggest no new sensors are needed or coming. I guess I just don't understand his point.
 
This is why people use lidar, because lidar can back up camera and camera can backup lidar.
Both rely on light and both can be obscured by dirt... The forward driving path is using vision and radar which is better than just camera and lidar. You'd need camera, lidar, AND radar.

Which is why others in the industry oppose Tesla's views.
You mean like MobilEye that has promoted a vision only system?

A computer cant walk towards the car to survey the environment before driving it out.
Humans don't generally walk all the way around their car.

Which means the computer vision has to be 100% accurate at all times.
Humans aren't...

That isnt twice as good as a human, its not even half as good as human.
But humans can't see at all directly in front of the hood, whereas a Tesla can see much but possibly not everything.
 
I really think the first upgrades will involve the CPU/GPU setup but not the sensors, unless a flaw is discovered. At least in the next two years.

I believe that is true and that will happen starting today with the Model 3 launching with 2.5 hardware...

However, the point was AP 3.0. What kind of new sensors do you think that might contain?
 
Both rely on light and both can be obscured by dirt... The forward driving path is using vision and radar which is better than just camera and lidar. You'd need camera, lidar, AND radar.

Which is why many are going towards all three... or all four including ultrasonics.

You mean like MobilEye that has promoted a vision only system?

I guess the more relevant reference is the integrators, aka the car makers.

Humans don't generally walk all the way around their car.

I think @Bladerskb's point is - as was mine - that humans can and do examine the surroundings of their car both unintentionally (when walking up to it) and intentionally (stepping out when needed). Just yesterday I was parking in a tight spot and stepped out to see what was going on...

Humans aren't...

For liability and adoption reasons self-driving will be under more attention and better results are likely expected from it than from human drivers.

But humans can't see at all directly in front of the hood, whereas a Tesla can see much but possibly not everything.

Again, humans can get out and look.

However, this is still beside the point. Tesla is not seeing as much around the nose as it quite reasonably could. The suite has obvious gaps in near-field vision around the nose and in nose-area cross-traffic vision, as well as in rear redundancy, which is mostly covered by single cameras. This could certainly be improved by a better suite of sensors.
 
Cameras in the nose cone would be helpful only in theoretically perfect conditions. The nose area gets hit with bugs and dirt all of the time. No way to clean them without stopping the car and manually cleaning them off. Rain will also render them useless as soon as it starts (BTW the rear camera needs some mod to help keep water from distorting the image too). The three cameras in the windshield can be cleaned via the wipers. The side cameras might get dirty but bug shmutz will not take them down.

The system that is in place now is designed to be as good as, if not better than, a human (most likely because it would not get distracted). No human drives with their head sticking out of the nose cone so they can be sure nothing is directly in front of the car. :) Ultrasonics are used for that very purpose and are very effective at low speed, i.e. in parking lots etc.

I think 2.5 will be an upgrade to the computer and Elon already hinted that it might be needed for FSD. We'll see. I paid the full price for everything and am very eager to see how this plays out. So far, I think I am at parity with 1.0 .... I think....
 
Cameras in the nose cone would be helpful only in theoretically perfect conditions. The nose area gets hit with bugs and dirt all of the time. No way to clean them without stopping the car and manually cleaning them off. Rain will also render them useless as soon as it starts (BTW the rear camera needs some mod to help keep water from distorting the image too). The three cameras in the windshield can be cleaned via the wipers. The side cameras might get dirty but bug shmutz will not take them down.

This is trivial to fix, though, as Audi has done with their front night vision: integrate a water spray into the nose camera.

But thank you for bringing up a good point. Tesla completely lacks camera cleaning (outside of heating) for four of its AP2 cameras. As you say, the rear camera especially is very susceptible to getting dirty. This will be an impediment for FSD in some conditions...

So a bit of wishful thinking here: A future AP revision thus might include better cleaning for the cameras, e.g. water sprays or mini wipers or something...

So far, I think I am at parity with 1.0 .... I think....

You're not, though: #1 #2... Neither am I, if it helps any. :D
 
The system that is in place now is designed to be as good as, if not better than, a human (most likely because it would not get distracted).

Frankly, I think AP2 is designed to include the bare minimum of hardware for good-weather FSD. Tesla wanted the platform out there as early as possible, and that is already a costly exercise. Shipping more sensors than the absolute minimum does not fit with that idea and would need an even beefier CPU/GPU to handle...

I'm actually thinking Tesla might even have left out the radar and the ultrasonics had they not needed those for AP1-based continuity (after all, the ECU originally had a place for the MobilEye chip too). So it is a mix of needed legacy items and a bare minimum of sensors for vision-based FSD, so that Tesla can get a head start. It may not tell much about any future suite as such.

The AP2 suite probably is, thus, more an accident of history than the optimal way to do FSD in Tesla's opinion.
 
Both rely on light and both can be obscured by dirt... The forward driving path is using vision and radar which is better than just camera and lidar. You'd need camera, lidar, AND radar.

No, camera + lidar is FAR superior to camera + radar. It's not even comparable.

You mean like MobilEye that has promoted a vision only system?

Mobileye never promoted a vision-only system. They offered vision-only software but have always maintained that a Level 3+ car would need triple redundancy.

In fact, they called Tesla's AP2 sensor configuration a beefed-up Level 2.


Humans don't generally walk all the way around their car.

Wait, how in the world do you not understand? Do you drive? Do you park and unpark? Serious question.

Because if you do, you would know that when walking to your car, if a bike or a bag is in front of your drive path, you would see it and remove it.

It doesn't require you to walk around your car.
But humans can't see at all directly in front of the hood, whereas a Tesla can see much but possibly not everything.

They can; it's called approaching your car in the parking lot. You must not drive. It's the only explanation.
 
Corner radars for sure. :)

I was very surprised AP2 didn't have this from the beginning, but it's probably not necessary for the 2x-a-human thing. For 10x a human, it's a given.

This is where you guys kill me. You say that there is no proof that you need lidar for FSD cars, yet claim at the same time that Tesla can get it done with cameras only. Talk about a major contradiction. And this with Tesla having 0% self-driving software at the time of the AP2 unveiling.