I think the capability of the HW3 NNs (the software, not the hardware) is significantly farther along than we have seen publicly. However, Tesla has a real incentive NOT to let the cat out of the bag, or the rent suddenly comes due on all the HW2.5 (and worse, HW2/MCU1) cars out there whose owners have already paid for FSD.
Spotting a stop sign, which is always mounted about six feet above the ground, generally on the right, and facing the vehicle with a distinctive shape and color, has got to be easier and more important than telling apart a cone, a fire hydrant, and an orange flag on a concrete divider.
Has anyone asked whether the current hardware is good enough for NoA on city streets? It's one thing to follow lane markings and detect cars going the same direction, but it's another to see cars coming out of driveways and shopping-plaza exits, bicyclists, and pedestrians. How confident is Tesla that it won't hit a pedestrian?
How confident are you that humans don't hit pedestrians? I'm always extra careful when crossing streets, etc.
That's my concern. I am not sure the cameras have the proper field of view and depth of view for detection and avoidance on city streets. When turning onto a street, you want to spot objects that aren't directly in your path and determine their intentions from as far away as possible. This is especially true when approaching an intersection with pedestrians. We humans can predict people's behavior, but I am not sure a machine is quite capable of doing that yet, especially with the hardware Tesla has. A good test case would be driving around Disneyland on a busy day. Tesla needs to address all of these difficult cases unless they are going to severely geofence city NoA.
Does anyone know how to become a "beta" tester for FSD? I would really enjoy trying it out and doing some testing!
I think detecting cars and pedestrians is a pretty basic problem at this point, perhaps considerably easier than detecting drivable surfaces. Have you seen the demo from Nvidia doing "Pixel Perfect Perception"? Unless I misunderstood, the hardware they're using in the demo is not even as powerful as Tesla's HW3. Arguably Tesla has been working on the underlying software for these tasks longer and has a larger data set to train from, though both of those are assumptions on my part.
Panoptic Segmentation Helps Autonomous Vehicles See Outside the Box | NVIDIA Blog
The ability to read intention, or "path prediction," is probably the far harder problem here: where can we expect that car to be over the next 10-50 frames? While not necessarily a requirement, it seems it would make things far more efficient overall. As long as the car can interpret motion and direction, even before an object is in range of other sensors like radar or ultrasonics, it can act to avoid it. Tesla's claim was that it could process 2,300 frames per second through its HW3 SoC, so this would seem to provide plenty of headroom to act in the moment, even without competent path prediction.
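To make the "where will that car be over the next 10-50 frames" idea concrete, here is a toy sketch of the simplest possible path prediction: constant-velocity extrapolation from the last two observed positions. This is purely illustrative; the function name, coordinates, and numbers are made up, and nothing here reflects Tesla's actual pipeline, which is not public.

```python
# Toy constant-velocity path prediction (illustrative only, not Tesla's method).
# Positions are hypothetical (x, y) coordinates in metres in the ego frame.

def predict_path(p_prev, p_curr, n_frames):
    """Extrapolate future positions assuming the per-frame velocity
    observed between the last two frames stays constant."""
    vx = p_curr[0] - p_prev[0]  # displacement per frame in x
    vy = p_curr[1] - p_prev[1]  # displacement per frame in y
    return [(p_curr[0] + vx * k, p_curr[1] + vy * k)
            for k in range(1, n_frames + 1)]

# A car directly ahead at 20 m, closing by 0.5 m every frame:
path = predict_path((0.0, 20.5), (0.0, 20.0), 50)
print(path[-1])  # 50 frames out: (0.0, -5.0), i.e. a collision course
```

Real path prediction is far harder than this, of course: vehicles turn, brake, and react to each other, which is why learned models rather than straight-line extrapolation are needed for anything beyond the next few frames.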
I’m fairly sure most people are just not careful enough; I see enough cars go right in front of pedestrians trying to cross the street.

That's the thing, though. Most human drivers know to be careful when approaching an intersection. A machine will as well, but there is a fine line between too careful and not careful enough, and I am just not sure our cars can act appropriately on it yet.
This is concrete proof that Tesla is moving closer to releasing "automatic city driving".