- The cameras aren't eyes. They actually have better resolution, but have been deliberately de-rated, because higher resolution isn't always better. Go do a little research and see how small the area is in which the eyes have good resolution. No, the side cameras don't have wipers, but the front ones do. They don't need to rotate; they can see everything at once. BTW, people get confused at low sun angles. That's why many cities see sunrise and sundown slowdowns in traffic.
Well, I guess we have to disagree on that. Estimates of the resolution of the human eye by Dr Roger Clark (a recognised expert in vision systems) put it at around 576 Mp (see Clarkvision Photography - Resolution of the Human Eye). That figure takes into account the ability of the eye/brain to rapidly scan an area to build a highly detailed mental model of some area of interest. In the driving context, we are all capable of seeing the direction a car's front wheels are pointing when at a T-junction, or where the driver is looking and the expression on his/her face. I very much doubt the Tesla cameras are even close to doing that.
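For what it's worth, Clark's headline number falls out of a simple back-of-envelope calculation. This is my reconstruction of it, assuming roughly 0.3 arc-minute acuity over a ~120° × 120° field (the specific figures are assumptions about his method, not quoted from his article):

```python
# Back-of-envelope estimate of "eye resolution" in megapixels.
# Assumed inputs: ~0.3 arc-minute resolvable detail, ~120 deg field per axis.
acuity_arcmin = 0.3      # resolvable detail, in arc-minutes
field_deg = 120          # field of view per axis, in degrees

# Convert the field to arc-minutes and divide by the acuity
# to get an equivalent pixel count per axis.
pixels_per_axis = field_deg * 60 / acuity_arcmin   # 24,000 "pixels"

megapixels = pixels_per_axis ** 2 / 1e6
print(megapixels)  # 576.0
```

Of course the eye only achieves that acuity in the fovea; the huge effective total comes from the rapid scanning and integration mentioned above.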
- And just how do you think your brain does it? How do you know what a "car" is? A "building", a "road"? That's because you've had years of learning, called childhood, to learn it. And trust me, humans do equally poorly in unknown (and even known) situations. That's where wrecks and slow cars come from.
This is obviously a contentious point - but I just don't accept that what the computer is doing is even close to what our brains are doing. Computer image recognition is based on a probabilistic algorithm by which a target image is compared against a library of human-annotated images. A particular target needs to be presented over and over in different configurations, lighting, orientations etc. for the pattern matching to work. Sometimes odd false recognitions are seen which a human would never even think were close - simply because the algorithm is just computing a correlation score that somehow manages to identify a Skoda as a horse, or some such.
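To make that point concrete, here's a deliberately toy sketch of classification as "highest correlation score wins". The feature vectors and labels are entirely made up for illustration (this is nothing like Tesla's actual networks), but it shows how a blind score comparison can confidently pick an absurd label:

```python
import math

def cosine(a, b):
    """Correlation-style similarity score between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# "Library" of human-annotated examples: label -> crude feature vector
# (hypothetical numbers chosen purely for the illustration).
library = {
    "car":   [0.9, 0.1, 0.2],
    "horse": [0.2, 0.8, 0.5],
}

# A target whose extracted features happen to line up better with "horse".
target = [0.3, 0.7, 0.6]

scores = {label: cosine(target, feat) for label, feat in library.items()}
best = max(scores, key=scores.get)
print(best)  # "horse" - the classifier reports whichever label merely scores highest
```

The algorithm has no notion that the answer is silly; it only knows which score is bigger.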
With enough relevant data, I agree that it can get very close to recognising things pretty accurately within a particular narrow scope. Backed by a sufficiently rich model of road traffic behaviour, I'd even accept that it can do some interesting self-driving party tricks.
However, the computer lacks any real "understanding" of what's going on, and it certainly can't infer anything if it's confronted by a situation it's not been trained or programmed to see. The self-driving system is a closed loop with a finite number of situations it can deal with - but there are always going to be edge cases it's not seen before, and without any understanding of what's going on, it's going to get them wrong. Anyone who drives a Tesla today on AP knows this. It's an interesting party trick, but it's a long way from FSD. A great example is how the current autosteer completely fails to deal with the markings on UK bus stops - the car thinks the lines are the edge of a lane and then goes on to steer the car into oncoming traffic. Not even an eight-year-old would fail to appreciate that the bus stop markings are there to stop people parking - not as a lane guide. It's common sense - but the computer has no common sense.
Now, I'm not saying that one day AI won't be able to do human-like things, but it seems to me that it's still a long way off. A human's effective compute power, especially for image processing and building a dynamic model of future events, is still several orders of magnitude ahead of even fancy dedicated neural net processors like those Tesla are using.
So my point isn't that the system Tesla are working on isn't capable of doing interesting things - just that making it work reliably in all the ways it needs to is such a huge step that it's doomed at current levels of tech. The marketing of "FSD feature complete by end 2020" doesn't resonate at all with what we all experience today.
I'd much prefer it if they concentrated on getting a more limited set of use cases (e.g. basic TACC) working in a robust manner. I applaud Tesla's innovation, but the marketing is misleading IMHO.
So, are you saying that if you were in a control room with a set of 360 degree monitors, that you couldn't drive the car remotely?
Let's turn the tables: seeing that you seem to be a LIDAR pundit, if I put you in that same control room, could you drive?
I do reckon it would be a lot harder to drive a car in a control room with 360 deg cameras - for the simple reason that the monitors would not be as good as my eyes at flicking around the scene rapidly, so my brain would be slower to build a model of what was going on. Then the feedback to my actions would be limited - no accelerative forces, no feel from the wheel, limited peripheral vision etc.
As to LIDAR - I'm not claiming to be a pundit at all (or even a proponent of it). However, I do think a simpler tech based on radar or lidar would be much more likely to succeed at delivering basic TACC, since using Tesla's tech to do it is basically a sledgehammer to crack a nut - and a sledgehammer with a billion working parts at that.
People can drive with a single eye. In that case we know it's all simple perception, with distance being extremely hard to determine. Tesla has stereoscopic imaging that can determine distance, and RADAR to give an even better determination. With cameras, you also get cues, such as color, that aren't available with LIDAR.
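For reference, the textbook relation behind stereoscopic depth is Z = f·B/d: depth from focal length, camera baseline, and pixel disparity between the two views. A minimal sketch (the focal length, baseline and disparity below are illustrative assumptions, not Tesla's actual camera geometry):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth Z in metres from focal length (pixels), camera baseline (metres),
    and the pixel disparity of a feature between the two views: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# e.g. a 1000 px focal length, 0.3 m baseline, and 6 px disparity
# put the object 50 m away.
print(depth_from_disparity(1000, 0.3, 6))  # 50.0
```

Note how depth resolution degrades as disparity shrinks: distant objects produce tiny disparities, so small pixel errors translate into large range errors.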
After the basic image recognition, whether from cameras or LIDAR, it's the same set of hard problems to solve. If you take a look at any of the raw images with interpretations shown, it's pretty obvious that Tesla has already got 99+% of the image recognition problem solved.
The question is whether they are at 99% or 9% - which of course depends on what you're measuring. Perhaps they are recognising 99% of typical road vehicles and the road itself and its "furniture", but expand the question to whether they are delivering a dynamic model to control the car safely and reliably, and I'd put it nearer the 9% mark.