
Major FSD camera suite oversight

The car can shift between forward and reverse way faster than a human. Jump out five feet to look and then jump back if there's a car coming. No problem. :p

Interesting point, though. I remember in the FSD demo video Tesla posted recently, at a stop sign off the highway, the car maneuvered itself in place so it could see the crossing cars better. Perhaps it can do something similar in this case, but it seems like a hack rather than a real solution.
 
I have a Model 3 with FSD and have been thinking about this a lot lately. Tesla should add cameras up by the front headlights that look left and right, like on the side of the car right up at the front. That way, when the car noses into an intersection, it can see cars coming that the driver cannot if there is a parked car in the way.
 
I'll always argue for more sensors, redundant ones, overlapping ones, different vendors and tech (diversity).

I don't care if I pay an extra $1k for my car; I want all the view angles I can get if this car is going to take any active role in driving.

I'm convinced the Model 3 does not have enough sensors, period. It will never achieve L4 or L5 on its current hardware.

Tesla deep down knows that, and I suspect they are just buying time at this point.
 
I have a Model 3 with FSD and have been thinking about this a lot lately. Tesla should add cameras up by the front headlights that look left and right, like on the side of the car right up at the front. That way, when the car noses into an intersection, it can see cars coming that the driver cannot if there is a parked car in the way.

Yep. Waymo already has this. They are called perimeter sensors. Waymo has both lidar and cameras located at the front of the car above the headlights, pointing perpendicular to the car, as you can see in this picture. The purpose is exactly what you mention: to allow the car to peek "around corners".

[Image: Waymo vehicle showing the perimeter sensors above the front headlights]
 
I'll always argue for more sensors, redundant ones, overlapping ones, different vendors and tech (diversity).

I don't care if I pay an extra $1k for my car; I want all the view angles I can get if this car is going to take any active role in driving.

I'm convinced the Model 3 does not have enough sensors, period. It will never achieve L4 or L5 on its current hardware.

Tesla deep down knows that, and I suspect they are just buying time at this point.

Karpathy even alluded to it in his most recent talk. He says that the nature of the problem allows them to avoid situations that they aren’t confident in handling. They just need to be fully aware of the situations they can’t handle. It seems like they are encountering limitations in either the machine learning or sensor suite.
 
And again, people such as myself keep bringing up the fact that none of the body-mounted cameras, except the cluster on the windshield, are self-cleaning or self-defogging.

I'm willing to bet the early designs (probably S) had redundant and self-cleaning cameras but that they nixed it due to cost.

It's not acceptable to lose a camera because of the time of day and the direction you are traveling (sunlight directly hitting a side camera, for example). Those cameras are there as necessary components, not frivolous extras. If you go to the effort of putting camera systems on a car's exterior, it seems silly to go 90% and not the full 100%, with redundancy and enough angles so that normal sunlight during an AM or PM commute won't blind any single *zone* camera set.

That's just one thing that Tesla is missing, and it's a show-stopper for L4/L5 IMO.
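The zone-redundancy idea above can be sketched as a toy coverage check: each camera covers an angular sector around the car, and a viewing angle stays covered as long as at least one un-blinded camera's sector includes it. The camera names and angles below are made up for illustration, not Tesla's real layout.

```python
# Toy camera layout: each camera sees an angular sector,
# in degrees, with 0 = straight ahead (all values assumed).
CAMERA_SECTORS = {
    "front_main":   (-35, 35),
    "left_pillar":  (20, 110),
    "right_pillar": (-110, -20),
}

def angle_covered(angle_deg: float, blinded: set) -> bool:
    """True if some camera not in `blinded` sees this angle."""
    return any(
        lo <= angle_deg <= hi
        for name, (lo, hi) in CAMERA_SECTORS.items()
        if name not in blinded
    )
```

With overlapping sectors, sun-blinding "front_main" still leaves a 30-degree viewing angle covered by "left_pillar"; without that overlap, a single blinded camera would open a hole in coverage, which is the point being argued.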
 
And again, people such as myself keep bringing up the fact that none of the body-mounted cameras, except the cluster on the windshield, are self-cleaning or self-defogging.

I'm willing to bet the early designs (probably S) had redundant and self-cleaning cameras but that they nixed it due to cost.

Needed very much. I was told it was a software issue. :mad:
 
I've been puzzling a bit on why there aren't cameras on the extreme left and right upper windshield. Seems like logical places to see lane markings, see around the car in front of you, and would address the scenario you describe.

Such cameras could be added presumably. I’m speculating that such positioning would not be reachable by the wipers and therefore subject to poor imaging from dirty windshields.
 
A smart driver will be aware of areas where current self driving technology will be challenged and take over manual control.
Currently, Autopilot is a driver's aid, not something that is capable of handling poorly designed intersections. Even humans have thousands of crashes daily.
The combination of an intelligent and aware driver, assisted by autopilot for the easy stuff is the best situation.

Nobody has yet designed an Autopilot that will be capable of 100% self driving in every conceivable circumstance.
 
The Tesla Cybertruck has been spotted with 2 rear cameras spaced apart for better stereo perception:


Tesla Cybertruck prototype is equipped with a cryptic stereo rear camera setup

Speculation is that it could be used for depth perception of objects attached to the back of the truck, or as a second rear view mirror when the tonneau is closed. It could be unique to the Cybertruck.

But I imagine that it could also be used for better depth perception in general.

In any case, it is intriguing. It is further indication that Tesla could be exploring adding extra sensors.
 
I still don't understand the concern here regarding the lack of perimeter cameras. Tesla's goal isn't to make a self-driving car with superhuman abilities. It doesn't need to travel at 100+ MPH at night in the rain or fly through intersections without stopping. It just needs to function as well as a human in terms of capability. If it has that, it will then automatically be safer, as it will never get distracted or incapacitated. So as it stands now, the sensor suite is already better than a human's, as I can't see 360 degrees around me at all times, nor do I have forward-facing radar, yet I can still navigate this world. When it comes to blind intersections, I don't get out of the car to get the view that my bumper has. AP should be no different in that it can slowly creep out until it has the view it needs to safely pull through, and maybe make a note to avoid routing through that intersection in the future if possible (just like we do now).

My only concern is the viability of true L5 with the current sensors. The pillar cams fog up and have no way to clear themselves. The fender cams get water over the lenses in heavy rain, which results in almost no visibility. If a human driver gets their view blocked, they will wipe the window clear to see the side mirrors, but these cameras can't do that. I can see issues where a camera loses visibility and the car has to go into some sort of limp mode, where it limits what it can do and then safely pulls over for the rider to clear the camera. Thankfully this isn't an issue in most conditions, but it still happens.
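The "limp mode" fallback described here could look something like the sketch below: if required side cameras lose visibility, restrict the driving domain, and if too many are lost, pull over. All camera names and thresholds are illustrative assumptions, not Tesla's actual software.

```python
from enum import Enum, auto

class DriveMode(Enum):
    FULL = auto()
    DEGRADED = auto()   # e.g. reduced speed, no lane changes
    PULL_OVER = auto()

# Side cameras the post says can fog up or get water-blocked
# (hypothetical names for illustration).
SIDE_CAMERAS = {"pillar_left", "pillar_right",
                "fender_left", "fender_right"}

def select_mode(blocked_cameras) -> DriveMode:
    """Choose a driving mode from the set of blocked cameras."""
    blocked = set(blocked_cameras) & SIDE_CAMERAS
    if not blocked:
        return DriveMode.FULL
    if len(blocked) == 1:
        # Overlap from neighboring cameras may cover one loss.
        return DriveMode.DEGRADED
    return DriveMode.PULL_OVER
```

The single-loss "degraded" tier only makes sense if the camera fields of view actually overlap; with no redundancy, any blocked camera would force a pull-over.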
 
I still don't understand the concern here regarding the lack of perimeter cameras. Tesla's goal isn't to make a self-driving car with superhuman abilities. It doesn't need to travel at 100+ MPH at night in the rain or fly through intersections without stopping. It just needs to function as well as a human in terms of capability. If it has that, it will then automatically be safer, as it will never get distracted or incapacitated. So as it stands now, the sensor suite is already better than a human's, as I can't see 360 degrees around me at all times, nor do I have forward-facing radar, yet I can still navigate this world. When it comes to blind intersections, I don't get out of the car to get the view that my bumper has. AP should be no different in that it can slowly creep out until it has the view it needs to safely pull through, and maybe make a note to avoid routing through that intersection in the future if possible (just like we do now).

First, driving at 100+ mph at night in the rain is a strawman. Unless you are designing a race car, you don't need to worry about that.

Sure, without perimeter cameras the car could still figure out ways to get around the limitation, like creeping forward, but perimeter cameras would let the car handle those situations better. Why not remove that limitation? Humans are limited to just two eyes and a brain, but autonomous cars are not limited in that way. Autonomous cars can get extra sensors like perimeter sensors. Think of it like side mirrors or rear view mirrors: we add them to make it easier for humans to drive. Likewise, perimeter cameras would make it easier and safer for autonomous cars.

Which is safer/better?

1) Have the car creep forward when it can't see, or avoid certain intersections where it can't see. Sure, that might work most of the time, but is it the safest way? No.

2) Add perimeter cameras so the car is able to see around corners and can smoothly handle these situations without needing to creep forward. This is clearly better.

Sure, there are some constraints like cost. We can't just put 100 cameras, 100 lidars, and 100 radars on a car. But perimeter cameras are cheap and a no-brainer to help the car handle situations better and more safely.
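To put rough numbers on the creep-forward comparison: in a simplified occlusion model, the camera sees cross traffic only once it physically passes the edge of the blocking parked car, so the distance the nose must protrude into the intersection is just that camera's setback from the front bumper. All distances below are assumed for illustration, not measured from any real car.

```python
def protrusion_needed(camera_setback_m: float) -> float:
    """Metres the bumper must stick past the occluding car's
    edge before the camera clears it (toy model: the camera
    sees cross traffic once it passes the edge)."""
    return max(camera_setback_m, 0.0)

# Hypothetical mounting positions:
windshield = protrusion_needed(2.0)  # cluster ~2 m behind bumper
perimeter = protrusion_needed(0.1)   # camera near the headlight
```

Under these assumptions, the windshield-mounted camera forces roughly 2 m of the car into cross traffic before it can see, while a perimeter camera needs barely any protrusion, which is the whole argument for option 2.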

My only concern is the viability of true L5 with the current sensors. The pillar cams fog up and have no way to clear themselves. The fender cams get water over the lenses in heavy rain, which results in almost no visibility. If a human driver gets their view blocked, they will wipe the window clear to see the side mirrors, but these cameras can't do that. I can see issues where a camera loses visibility and the car has to go into some sort of limp mode, where it limits what it can do and then safely pulls over for the rider to clear the camera. Thankfully this isn't an issue in most conditions, but it still happens.

I've definitely experienced these issues in heavy rain or winter conditions. So yeah, I think the current sensors will definitely have to go into "limp mode" in these cases. Depending on where you live and what your local weather is like, it might be rare or it might happen pretty often.
 
Yes I mentioned that. I said in my post that Tesla could be using the lidar to help train their camera vision.

That would make sense. Accurate 3D point cloud data could be used to improve distance estimation in their vision system or in the Pseudo-Lidar system. That said, the raised pillar cameras make me think this isn't being developed for any current vehicle's NN.