Welcome to Tesla Motors Club

Autonomous Day Questions

I watched Tesla's Autonomy Day presentation and was very impressed. However, I have a couple of questions that will help me decide whether or not I should get the Autopilot option as a Tesla customer.
Dr. Andrej Karpathy seems to put a lot of confidence in deep learning and neural-net algorithms. He said that instead of looking at individual driving situations and programming them in by hand, the machine will learn to handle all of the situations from visual training data, and the more data the better. I'm a bit worried about over-reliance on neural nets because there may never be enough data to cover all situations. They can't generalize like humans, so the system will not be able to handle the infinite number of new situations it hasn't seen before. For instance, if a human wearing a garbage bag or a big paper-bag costume walks across the street, will it think it's just a paper bag and drive through it? Can it understand subtle body language and gestures from a pedestrian or police officer?
Maybe it will achieve Level 4, but I doubt it will get to Level 5 without a new approach in addition to neural nets alone. It perhaps needs some fail-safe heuristics programmed in, so the car can pull over and alert the driver to take over if it encounters a situation it cannot handle.
I'm pretty sure Tesla's Autopilot AI will be able to learn all the driving behavior and perfect it, but I'm not so sure it can handle the human element of negotiating city traffic well enough to get regulatory approval for full autonomy.
 
there may never be enough data to cover all situations

1. Karpathy believes he will have more essentially free fleet data to lob at the NNs than scarce money to pay programmers to write heuristics for every conceivable L5 scenario, and I am sure he is not wrong about that. Whether his hunch will pay off in the longer term remains to be seen, and maybe HW4 coming down the pipe will be needed for the final shove. However, I expect HW3 should at least get us to a reliable L3 candidate by the middle of next year, even though it will still retain the L2 nags for legal-conformity reasons until higher-level regulatory approval is granted.

2. As regards the sensor suite itself, as Karpathy partially explained, there is the prospect that the traditional Tesla perception gaps [e.g. not braking for stationary objects from >50 mph, since Doppler radar only detects moving vehicles] will be patched up with the HW3 upgrade, which enables running some clever, compute-intensive software that was beyond the tapped-out HW2.5, such as the advanced algorithms described in these recent papers:

A. Learning to See in the Dark (night vision with neural networks)
https://arxiv.org/pdf/1805.01934

B. Depth from Videos in the Wild: Unsupervised Monocular Depth Learning from Unknown Cameras

C. Pseudo-LiDAR from Visual Depth Estimation: Bridging the Gap in 3D Object Detection for Autonomous Driving

3. With the on-board compute resources Tesla's HW3 supplies, A provides dramatically enhanced night vision, and B & C emulate high-resolution LiDAR for real-time 3D mapping, hence the sensor-suite problems are, to quote Karpathy, quite tractable.
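The core trick in paper C is simple enough to sketch: back-project each pixel of a predicted depth map into a 3D point cloud using the camera intrinsics, then hand that cloud to a LiDAR-style 3D detector. A minimal sketch of the back-projection step (the intrinsics and the flat-wall depth map below are made-up toy values, not anything from Tesla or the paper):

```python
import numpy as np

def depth_to_pseudo_lidar(depth, fx, fy, cx, cy):
    """Back-project a per-pixel depth map into a 3D point cloud.

    For each pixel (u, v) with depth z, the pinhole model gives
    x = (u - cx) * z / fx and y = (v - cy) * z / fy.
    Returns an (H*W, 3) array of [x, y, z] points in camera coordinates.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Toy example: a 4x4 depth map of a flat wall 10 m ahead of the camera.
depth = np.full((4, 4), 10.0)
cloud = depth_to_pseudo_lidar(depth, fx=2.0, fy=2.0, cx=1.5, cy=1.5)
```

The quality of the resulting "point cloud" is of course only as good as the depth network feeding it, which is why B (self-supervised depth from fleet video) and C fit together.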

4. This does not detract from the fact that Waymo and others build a very safe AV system relying on LiDAR, but it does illustrate that Tesla has now revealed a convincingly feasible technological path to the same L5 goal, probably at much lower unit expense. Only time will tell which bet pays off more handsomely, but I think both will probably do very well at leaving the established auto OEMs in the dust.

5. It is very exciting what can now be achieved with minimal inputs and maximal data-crunching, but whether or not you want to pay to be part of this experiment is up to you. Probably, though, if you are buying new with HW3 already fitted, it will bring many advantages over HW2.5 even if you never pay for AP/FSD, such as faster, more fluid emergency reactions and H.265 dashcam feeds.

For instance, if a human wearing a garbage bag or a big paper-bag costume walks across the street, will it think it's just a paper bag and drive through it? Can it understand subtle body language and gestures from a pedestrian or police officer?

6. For the former, probably yes, as even the current version stops for pedestrians on a zebra crossing. For the latter, definitely not at the moment, but presumably that will have to be in the final FSD candidate for >=L4 approval.
 
I watched Tesla's Autonomy Day presentation and was very impressed. However, I have a couple of questions that will help me decide whether or not I should get the Autopilot option as a Tesla customer.
Dr. Andrej Karpathy seems to put a lot of confidence in deep learning and neural-net algorithms. He said that instead of looking at individual driving situations and programming them in by hand, the machine will learn to handle all of the situations from visual training data, and the more data the better. I'm a bit worried about over-reliance on neural nets because there may never be enough data to cover all situations. They can't generalize like humans, so the system will not be able to handle the infinite number of new situations it hasn't seen before. For instance, if a human wearing a garbage bag or a big paper-bag costume walks across the street, will it think it's just a paper bag and drive through it? Can it understand subtle body language and gestures from a pedestrian or police officer?
Maybe it will achieve Level 4, but I doubt it will get to Level 5 without a new approach in addition to neural nets alone. It perhaps needs some fail-safe heuristics programmed in, so the car can pull over and alert the driver to take over if it encounters a situation it cannot handle.
I'm pretty sure Tesla's Autopilot AI will be able to learn all the driving behavior and perfect it, but I'm not so sure it can handle the human element of negotiating city traffic well enough to get regulatory approval for full autonomy.

The more training, the better.

But don't judge humans as the pinnacle of safe driving. In Georgia we've had over 1,000 fatal accidents so far this year. Now that I think of it, humans are probably the worst thing to compare it to.
 
6. For the former, probably yes, as even the current version stops for pedestrians on a zebra crossing. For the latter, definitely not at the moment, but presumably that will have to be in the final FSD candidate for >=L4 approval.
Actually, in the Q&A section of the presentation, someone asked Elon whether the system will be able to handle police directing traffic. Elon said yes, it's easy, so he seems pretty confident it'll happen. As for stopping for obstacles, I assume any solid object big enough will cause Autopilot to stop. But what about the ability to distinguish between a flying plastic bag and something more solid sitting on the road? You wouldn't want the car to stop for a newspaper on the road, but definitely for a cardboard box or a tire, or at least to try to avoid it.
 
Actually, in the Q&A section of the presentation, someone asked Elon whether the system will be able to handle police directing traffic. Elon said yes, it's easy, so he seems pretty confident it'll happen. As for stopping for obstacles, I assume any solid object big enough will cause Autopilot to stop. But what about the ability to distinguish between a flying plastic bag and something more solid sitting on the road? You wouldn't want the car to stop for a newspaper on the road, but definitely for a cardboard box or a tire, or at least to try to avoid it.

Stopping for crossing pedestrians currently works from 50 km/h on HW2.5 if they are visible from about 100 m away, but I wouldn't want to test it at 150 km/h. Solid stationary objects do not cause AP to stop if the car is doing more than 80 km/h, or at least not reliably. This is where I expect the pseudo-LiDAR mentioned above to fill the perception gap. A small piece of newspaper blowing across the road can, I suppose, safely be ignored, but if it were scrunched up and stationary I would hope to see the AV slow down and drive around it rather than over it. Still, I think this last sort of capability is at least a year away, and on FSD only.
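To see why ~100 m of pedestrian visibility is comfortable at 50 km/h but not at 150 km/h, here is some back-of-the-envelope stopping-distance arithmetic. The 1 s reaction time and 0.7 friction coefficient below are illustrative assumptions on my part, not Tesla figures:

```python
# Total stopping distance = reaction distance + braking distance,
# where braking distance = v^2 / (2 * mu * g) for friction coefficient mu.

def stopping_distance_m(speed_kmh, reaction_s=1.0, mu=0.7, g=9.81):
    v = speed_kmh / 3.6                      # convert km/h to m/s
    return v * reaction_s + v * v / (2 * mu * g)

for kmh in (50, 80, 150):
    print(f"{kmh} km/h -> {stopping_distance_m(kmh):.0f} m")
# 50 km/h needs roughly 28 m, 80 km/h roughly 58 m,
# but 150 km/h roughly 168 m -- well beyond 100 m of visibility.
```

So under these assumptions a pedestrian first seen at 100 m is no problem at 50 or even 80 km/h, but at 150 km/h the car simply cannot stop in time, whatever the perception stack does.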