
Predictions on when we will see AP 3.5 or 4.0 hardware

Fixed it for you!

You are 100% wrong. It is ridiculously absurd to suggest that companies like Waymo have no clue how to get useful info from cameras. They also use camera vision and get plenty of good information out of cameras. They don't use LIDAR to replace cameras. It is not either or. They use LIDAR WITH cameras, because cameras + lidar is better than just cameras.

Please don't "rewrite" my quote to make it say the opposite of what I actually said. It could confuse people into thinking I said it. If you disagree, then quote me and post your reasons in your reply, but don't twist my quote to say something different from what I said, especially when you are making it look like I said something so absurdly false.
 
It is called "sunk cost fallacy" and it is real. They've tossed hundreds of millions of dollars at it, kinda hard to just drop it...

All those hundreds of millions of dollars, and lidar, got Waymo a much better FSD system than Tesla's. They have true robotaxis, while Tesla has yet to even give us stopping at a red light.
 
Here is a great table that illustrates the strengths and weaknesses of different sensors compared to humans. It shows why "camera only" is not good enough for safe FSD: you are left with too many limitations, like only "fair" object detection and "fair" distance estimation. To do safe FSD, you need sensor fusion that includes camera + radar + lidar. Then you have a system that rates "good" in every category.

[Image: Sensor-Fusion.png]
 
To do safe FSD, you need sensor fusion that includes camera + radar + lidar.
No, this is just sh!t people say, either because they listened to some podcast or just to make themselves feel good. (shiny lidar)
Others use it to slow down those who are ahead of them! (in the case of legacy OEMs saying lidar is a must)

To be safe, FSD needs to show that it is better than humans in 99.999% of situations.
To show that FSD is better than humans, they need to run it in supervised mode, a la Autopilot with NoA, until enough miles are driven without incident.
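For a sense of what "enough miles" means statistically: under a zero-failure ("rule of three" style) bound, you need roughly three times the reciprocal of the target incident rate in incident-free miles before you can claim the system is at least that good. A minimal sketch; the function name and the rates are purely illustrative assumptions (rough public ballpark figures, not Tesla or Waymo data):

```python
import math

def miles_needed(target_rate_per_mile, confidence=0.95):
    """Incident-free miles needed to show, at the given confidence,
    that the true incident rate is no worse than target_rate_per_mile.
    Uses the exact zero-failure bound: (1 - p)**n <= 1 - confidence."""
    return math.log(1.0 - confidence) / math.log(1.0 - target_rate_per_mile)

# Illustrative assumptions: ~1 police-reported crash per 500k miles and
# ~1 fatality per 100M miles for human drivers (rough ballpark figures only).
for label, rate in [("crash-level", 1 / 500_000), ("fatality-level", 1 / 100_000_000)]:
    print(f"{label}: ~{miles_needed(rate):,.0f} incident-free miles")
```

With those placeholder rates, this works out to roughly 1.5 million and 300 million incident-free miles respectively, which is why "enough miles" is a much bigger number than most people assume.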
 
The fact remains that Tesla cannot even deliver simple things like auto wipers, auto high beams, self park, etc. when we were promised level 4 autonomy by now.

Some Tesla cars were bought over three years ago based on those lies. We were supposed to be able to drive coast-to-coast with FSD by now. I could go on, but the unfulfilled promises of Tesla's approach are getting farcical. Meanwhile, Waymo and others have Level 4 running.

If folks put down the Teslaquila (AKA Tesla Kool-Aid), it is clear we were bamboozled.

BTW: My auto wipers turned on randomly at the start of my drive yesterday on a clear Denver morning. No dew or ice. Thinking it would stop, I ignored the gremlins. It didn't. I had to navigate their menus to figure out how to disable the wipers. Deep learning?
 
I am a firm believer that Tesla will need to add more sensors at some point, including lidar, in order to achieve safe autonomous driving where driver supervision can be removed. But Tesla does seem to change hardware incrementally and only on a strictly as-needed basis.

So, I suspect that Tesla will wait to see how AP3 is working once "feature complete" is released to the public. Tesla will study what areas are still lacking in the hardware or software and make the bare minimum of changes to address the issues.

I could see the Model Y having "AP3.5" that just has better resolution cameras or something minor, not a fundamental change to the sensor layout. But I think 2-3 years from now, we will probably see "AP4" with the new AP4 chip and (I hope) additional sensors like lidar.
From my observation, I think that Tesla will need to add some cameras on the very front of the car, maybe integrated into the high-beam housings, so the car can see perpendicularly to the front right and front left for oncoming pedestrians or cars at a 'T' intersection, or when exiting a garage and trying to merge with traffic.

Currently FSD relies on the cameras located at the top of the windshield, which is about where a driver's eyes are when looking left and right by leaning the head a little forward over the steering wheel.

But even so, it is often difficult to see oncoming traffic when I try to exit my driveway, because the side view is obstructed when a car is parked very close to the driveway exit.

In that case, I need to look through the windshield of the parked car to see the oncoming traffic without having to nose the front of my car into the street.

But try teaching FSD to look through a parked car's windshield to learn this trick, which won't be possible anyway if the other car has a sunshade on its windshield.

Also, I sometimes put myself in a difficult situation by spotting, only at the very last moment, one of those electric bicycles that ride very fast and are hard to notice.


Note: In the following picture, you can see the sensors located just above the front axle and on each side of this autonomous car, which is where additional cameras would be needed to provide peripheral vision.

[Image: chevrolet-bolt-lidar-min.png]
 
LOL --- if you take the Lidar column out of your graphic, you still get a "Good" rating across all items. Your own "proof" makes Lidar obsolete!
o_O;)
btw, Teslas have NEVER been "camera only"

Radar does not work well for stopped vehicles. Also, you will not have any redundancy if the cameras fail. Take your two columns and temporarily remove the camera column to simulate a camera failure; your system won't be "good" in all categories anymore.

No, this is just sh!t people say, either because they listened to some podcast or just to make themselves feel good. (shiny lidar)
Others use it to slow down those who are ahead of them! (in the case of legacy OEMs saying lidar is a must)

To be safe, FSD needs to show that it is better than humans in 99.999% of situations.
To show that FSD is better than humans, they need to run it in supervised mode, a la Autopilot with NoA, until enough miles are driven without incident.

Yes, that is how you validate the software as safe, but how are you going to get to 99.999% with cameras if they only have a "fair" rating in object detection, distance estimation, visibility range, and dark or low-illumination performance, and a "poor" rating in poor weather? You can't.
 
Radar does not work well for stopped vehicles. Also, you will not have any redundancy if the cameras fail.
To get "good" in all categories, they just combined the rating across the different columns, that means they are used a unified system - my point was simply that based on that graphic, you can get "good" rating in ALL categories WITHOUT lidar.
Radar alone is never used, radar with vision is used to get good object detection.
In Tesla's case there are 3 cameras pointed forward (yes different field of view) but they give redundancy to safely bring the vehicle to the stop.
But, just as I said up-thread, I can see updates to all existing sensors including the housing to help in inclement weather.
I even believe they could have dual radars in the 2 front corners of the front bumper, both for redundancy and for help with lateral perception.

My argument is not against improving sensors only that Tesla will not see Lidar on its cars in this decade.
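To make the "combine the columns" reading concrete, here is a toy sketch that just takes the best rating any remaining sensor achieves in each category. The sensor names, categories, and ratings below are illustrative placeholders loosely based on how the table has been described in this thread, not the actual graphic, so treat the output as a demonstration of the mechanism rather than of the real numbers:

```python
# Toy "best rating per category" fusion over a sensor/rating table.
# All ratings here are ILLUSTRATIVE placeholders, not the actual graphic.
RANK = {"poor": 0, "fair": 1, "good": 2}

RATINGS = {
    "camera": {"object detection": "fair", "object classification": "good",
               "distance estimation": "fair", "poor weather": "poor",
               "low illumination": "fair"},
    "radar":  {"object detection": "good", "object classification": "poor",
               "distance estimation": "good", "poor weather": "good",
               "low illumination": "good"},
    "lidar":  {"object detection": "good", "object classification": "fair",
               "distance estimation": "good", "poor weather": "fair",
               "low illumination": "good"},
}

def fused(sensors):
    """Best rating achieved by any of the chosen sensors, per category."""
    categories = RATINGS[sensors[0]].keys()
    return {c: max((RATINGS[s][c] for s in sensors), key=RANK.__getitem__)
            for c in categories}

print(fused(["camera", "radar", "lidar"]))  # full suite
print(fused(["camera", "radar"]))           # drop the lidar column
print(fused(["radar"]))                     # simulate a camera failure
```

With these placeholder numbers, camera + radar comes out "good" everywhere while radar alone does not, which is roughly the shape of the disagreement here; the actual conclusion depends entirely on the ratings in the original table.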
 
My argument is not against improving sensors, only that we will not see Lidar on Tesla's cars in this decade.

EM will fire a thousand doubters on Tesla's self-driving development team before reversing his stance on Lidar.

Won't matter, because EM will be on Mars before anyone has a chance to take action. The Martian court system will not get its act together before he is long gone. Plus, I think he will be the feudal king of Mars and will ban all Lidar forever under his magnificent rule.

Long live the martian king, EM! :rolleyes:;)
 
To get "good" in all categories, they just combined the rating across the different columns, that means they are used a unified system - my point was simply that based on that graphic, you can get "good" rating in ALL categories WITHOUT lidar.
Radar alone is never used, radar with vision is used to get good object detection.
In Tesla's case there are 3 cameras pointed forward (yes different field of view) but they give redundancy to safely bring the vehicle to the stop.
But, just as I said up-thread, I can see updates to all existing sensors including the housing to help in inclement weather.
I even believe they could have dual radars in the 2 front corners of the front bumper, both for redundancy and for help with lateral perception.

My argument is not against improving sensors only that Tesla will not see Lidar on its cars in this decade.

Yes, Tesla has 3 front cameras and a front radar, but they obviously don't give enough redundancy to always bring the car to a safe stop, since we've had multiple accidents where the Tesla still crashed into stopped vehicles on the highway. Now, hopefully, Tesla will improve their camera vision and solve that problem. But in the meantime, we've had several crashes that could have been avoided with a $500 lidar sensor.

Heck, I lost all Autopilot features for about 10 minutes the other day because cold weather caused some temporary condensation on the inside of the front camera housing. So if a little rain and cold can temporarily disable AP features, clearly the hardware does not have enough redundancy.

I get your position and I am glad that you want some changes to the hardware. I know Elon will most likely never ever change his stance on lidar. I just think it is a mistake. Lidar does offer some advantages. Elon is basically rejecting a useful tool just because he is stubbornly determined to accomplish FSD without it.