
What will happen within the next 6 1/2 weeks?

Which new FSD features will be released by end of year and to whom?

  • None - on Jan 1 'later this year' will simply become end of 2020!

    Votes: 106 55.5%
  • One or more major features (stop lights and/or turns) to small number of EAP HW 3.0 vehicles.

    Votes: 55 28.8%
  • One or more major features (stop lights and/or turns) to small number of EAP HW 2.x/3.0 vehicles.

    Votes: 7 3.7%
  • One or more major features (stop lights and/or turns) to all HW 3.0 FSD owners!

    Votes: 8 4.2%
  • One or more major features (stop lights and/or turns) to all FSD owners!

    Votes: 15 7.9%

  • Total voters
    191
As much as everyone wants these FSD features to be completed by the promised deadline, I would much rather they get it right. Like some have mentioned, it's one thing to have Smart Summon fail and mess up in a parking lot. It's another when the car doesn't recognize the appropriate traffic light signal and causes a major accident. NoA for City needs to be as bulletproof as possible, including ways to make sure drivers are always paying attention. If you think it's bad currently with senators and others calling to ban/stop Autopilot, wait until we get a few dozen serious accidents on city streets because NoA on City fails to see signals, other cars, and pedestrians...
 
Elon Musk on Twitter

Let's not forget, as of October 12 we were to expect an update to Smart Summon in the coming weeks. It was to be "smooth as silk". That update hasn't appeared, and it has been over 6 weeks now. I'll grant that every week in the future is technically a coming week, but I feel like the intent was to imply something shorter than 6 weeks.

The odds of FSD "feature complete" in any meaningful way by EOY are zero. The odds of having it deployed on a member of the public's vehicle, even as early access, are also zero. They can't make Navigate on Autopilot follow lanes properly on the highway, so I have very little expectation that they'll be handling general surface streets any time soon.

I'm always willing to be surprised, but the surprise usually comes in the form of someone posting images of their Model 3 driving forward off a ledge and getting beached when Smart Summon should have backed out of the parking space. That's what you'd expect when your system relies on map data of varying quality, something that Elon himself wisely pointed out many months ago.
 
I'm not expecting any miracles by the end of the year. However, here's my theory/speculation. I think the capability of the HW3 NNs (the software, not the hardware) is significantly farther along than we have seen publicly. However, they have a real incentive NOT to let the cat out of the bag, or the rent suddenly becomes due on all the HW2.5 (and worse, HW2/MCU1) cars out there whose owners have paid for FSD.

I think we're seeing the confluence of two roads now - they're dabbling with the hardware replacements, which for a HW2.5 car really aren't that expensive for them, particularly if they shave the time down by optimizing the software installation. And, at the same time, they're dabbling with the first (trivial) software difference or enhancement for HW3 - traffic cone detection - which is fantastic, if worthless. I pass interstate construction every day, with tons of concrete barriers, cones of different dimensions, etc. It spots them all, across 5 lanes of traffic, between cars, etc., and I haven't seen a false positive or a missed cone. A far cry from the "traditional" Autopilot visualizations. And it makes little sense to waste development time on HW3 spotting traffic cones if they didn't already have more fundamental stuff close to complete. Spotting a stop sign, which is always located ~6 feet above ground, generally on the right, and facing the vehicle with a clear shape and color, has got to be easier and more important than spotting the difference between a cone, a fire hydrant, and an orange flag on a concrete divider.

Until they're prepared to deliver the hardware updates to early adopters, or at least the first chunk of them with HW2.5, there is little incentive to release any additional FSD features - only a tiny portion of the unrecognized revenue is from actual HW3 cars. So I wouldn't be surprised to see significant progress soon, for HW3 only, right after we start to see the hardware upgrades going more smoothly and at a bit more scale.

Again, all just my wild-ass-guess, but I think the pieces fit.
 
I think the capability of the HW3 NNs (the software, not the hardware) is significantly farther along than we have seen publicly. However, they have a real incentive NOT to let the cat out of the bag, or the rent suddenly becomes due on all the HW2.5 (and worse, HW2/MCU1) cars out there whose owners have paid for FSD.

Perhaps. But they've already promised FSD computer upgrades by this year, so honestly they're way behind already. We're just used to the delays at this point. But, conversely, consider the massive increase in share price and in consumer and investor confidence if Tesla actually showed they were steadily making real progress. It's something Tesla is heavily banking on for their future, so showing the general public that there are major improvements and huge wins could really be helpful to them.

Even more importantly, consider all of the FSD sales it would cause. Tons of people are skipping the purchase of FSD and instead waiting until they see something worth spending on. The coast-to-coast drive that was promised for 2017 would be a pretty big win here. Hell, even showing a Model 3 driving from SF to LA including city streets would be a massive shift that would have a pretty serious impact on sales.

Spotting a stop sign, which is always located ~6 feet above ground, generally on the right, and facing the vehicle with a clear shape and color, has got to be easier and more important than spotting the difference between a cone, a fire hydrant, and an orange flag on a concrete divider.

The NN does all of the heavy lifting here, so it's not really a problem of writing software to detect all of these things.
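
To make "the NN does the heavy lifting" concrete, here's a minimal sketch using an off-the-shelf pretrained detector (torchvision's COCO Faster R-CNN, torchvision >= 0.13 assumed) as a stand-in - this is emphatically not Tesla's stack, and the file name and confidence threshold are made up - just to show that the surrounding "software" is little more than running inference and filtering the boxes the network returns:

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Generic pretrained detector as a stand-in (not Tesla's actual networks).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# A single dashcam-style frame (hypothetical file name).
frame = to_tensor(Image.open("frame.jpg").convert("RGB"))

with torch.no_grad():
    # The network returns boxes, class labels, and confidence scores;
    # the application code just consumes them.
    detections = model([frame])[0]

STOP_SIGN = 13  # "stop sign" class id in the COCO label map used by torchvision
for box, label, score in zip(detections["boxes"],
                             detections["labels"],
                             detections["scores"]):
    if label.item() == STOP_SIGN and score > 0.5:
        print("stop sign at", box.tolist(), "confidence", round(float(score), 2))
```

All of the hard part - telling a stop sign from a cone or a fire hydrant - lives inside the trained weights, not in this loop.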
 
Has anyone asked if the current set of hardware is good enough for NoA on city streets? It's one thing to follow lane markings and detect cars going the same direction, but it's another to see cars coming out of driveways/shopping plaza exits, bicyclists, and pedestrians. How confident is Tesla in not hitting a pedestrian?
 
Has anyone asked if the current set of hardware is good enough for NoA on city streets? It's one thing to follow lane markings and detect cars going the same direction, but it's another to see cars coming out of driveways/shopping plaza exits, bicyclists, and pedestrians. How confident is Tesla in not hitting a pedestrian?

I don't think anyone has a definitive answer to those questions yet, since we don't have city NOA to test for ourselves. Tesla obviously believes that the hardware is good enough. Tesla seems very confident in their software too. Presumably, the HW3 NN that Tesla has in development is much better than what we currently have in our cars and is good enough to detect all vehicles, bicyclists and pedestrians reliably. We shall see. Of course, others will argue that the hardware is not good enough because they don't think the cameras cover enough angles or because they think FSD requires LIDAR.
 
That's my concern. I am not sure the cameras have the proper field of view and depth of view for proper detect-and-avoid capability on city streets. When turning onto a street, you would want to detect and determine the intention of objects outside your direction of travel from as far away as possible. This is especially true when approaching an intersection with pedestrians. We humans can predict people's behaviors, but I am not sure a machine is quite capable of doing that, especially with the hardware Tesla has. A good use case would be driving around Disneyland on a busy day. Tesla needs to address all of the difficult cases unless they are going to severely geofence city NoA.
 
Has anyone asked if the current set of hardware is good enough for NoA on city streets? It's one thing to follow lane markings and detect cars going the same direction, but it's another to see cars coming out of driveways/shopping plaza exits, bicyclists, and pedestrians. How confident is Tesla in not hitting a pedestrian?
How confident are you that humans don’t hit pedestrians? I’m always extra careful when crossing streets etc.
 
That's my concern. I am not sure the cameras have the proper field of view and depth of view for proper detect-and-avoid capability on city streets. When turning onto a street, you would want to detect and determine the intention of objects outside your direction of travel from as far away as possible. This is especially true when approaching an intersection with pedestrians. We humans can predict people's behaviors, but I am not sure a machine is quite capable of doing that, especially with the hardware Tesla has. A good use case would be driving around Disneyland on a busy day. Tesla needs to address all of the difficult cases unless they are going to severely geofence city NoA.
I think detecting cars and pedestrians is a pretty well-solved problem at this point, perhaps considerably easier than detecting driveable surfaces. Have you seen the demo from Nvidia doing "Pixel Perfect Perception"? Unless I misunderstood, the hardware they're using in the demo is not even as powerful as Tesla's HW3. Arguably Tesla has been working on the underlying software for these tasks longer and has a larger data set to train from - both assumptions on my part though.
Panoptic Segmentation Helps Autonomous Vehicles See Outside the Box | NVIDIA Blog

The ability to read intention or "path prediction" is probably the far harder problem here: where can we expect that car to be over the next 10-50 frames? While not necessarily a requirement, it seems it would make things far more efficient overall. As long as the car can interpret motion and direction, even before an object is in range of other sensors like radar or ultrasonics, it can act to avoid it. Tesla's claim was that it could process 2,300 frames per second through its HW3 SoC, so this would seem to provide plenty of headroom to act in the moment, even without competent path prediction.
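
As a toy illustration of that "where will it be in 10-50 frames" question, here's the simplest possible form of path prediction - constant-velocity extrapolation from two consecutive detections. The frame rate, positions, and horizon are all made-up numbers, and real prediction would be far more sophisticated:

```python
def predict_path(p_prev, p_curr, dt, horizon_frames):
    """Constant-velocity extrapolation of a tracked object's (x, y) position.

    p_prev, p_curr: positions in meters from two consecutive frames.
    dt: time between frames in seconds.
    horizon_frames: how many future frames to predict.
    """
    vx = (p_curr[0] - p_prev[0]) / dt
    vy = (p_curr[1] - p_prev[1]) / dt
    return [(p_curr[0] + vx * dt * k, p_curr[1] + vy * dt * k)
            for k in range(1, horizon_frames + 1)]

# Hypothetical example: a car tracked across two frames at an assumed 36 fps.
dt = 1 / 36
future = predict_path((10.0, 5.0), (10.0, 4.5), dt, horizon_frames=50)
print("expected position in 10 frames:", future[9])
print("expected position in 50 frames:", future[49])

# Back-of-the-envelope headroom, using assumed numbers: 8 cameras at 36 fps
# is 288 frames/s of input, well under the claimed 2,300 frames/s capacity.
print(8 * 36, "fps of input vs ~2300 fps claimed processing capacity")
```

Even this naive version gives a planner something to steer around; the genuinely hard part is handling acceleration, turns, and occluded objects.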
 
Does anyone know how to become a "beta" tester for FSD? I would really enjoy trying it out and doing some testing!

In a sense, we are all beta testers for FSD right now since NOA, Smart Summon etc are labelled as FSD features and are still considered beta after they are released to the public.

But if you mean, a beta tester for the full FSD that is still in development and not released yet, that is only to "Early Access" members. "Early Access" is by invitation only, meaning Tesla has to pick you. And as far as we know, Tesla is not inviting any new members into the early access program at this time.
 
How confident are you that humans don’t hit pedestrians? I’m always extra careful when crossing streets etc.

That's the thing though. Most human drivers know to be careful when approaching an intersection. A machine will as well, but there is a fine line of acting appropriately between too careful and not careful enough. I am just not sure our car can do that yet.
 
I think detecting cars and pedestrians is a pretty well-solved problem at this point, perhaps considerably easier than detecting driveable surfaces. Have you seen the demo from Nvidia doing "Pixel Perfect Perception"? Unless I misunderstood, the hardware they're using in the demo is not even as powerful as Tesla's HW3. Arguably Tesla has been working on the underlying software for these tasks longer and has a larger data set to train from - both assumptions on my part though.
Panoptic Segmentation Helps Autonomous Vehicles See Outside the Box | NVIDIA Blog

The ability to read intention or "path prediction" is probably the far harder problem here: where can we expect that car to be over the next 10-50 frames? While not necessarily a requirement, it seems it would make things far more efficient overall. As long as the car can interpret motion and direction, even before an object is in range of other sensors like radar or ultrasonics, it can act to avoid it. Tesla's claim was that it could process 2,300 frames per second through its HW3 SoC, so this would seem to provide plenty of headroom to act in the moment, even without competent path prediction.

I am actually not worried about the processing power, though that's very important too. I am just wondering if our cars even have the right camera angles, depth of view, and resolution to detect objects far enough away, with enough detail, for the computer to interpret the path and intention of those objects. This is very critical on the street, as objects can come from anywhere at any speed, and not all obey the written rules.
 
That's the thing though. Most human drivers know to be careful when approaching an intersection. A machine will as well, but there is a fine line of acting appropriately between too careful and not careful enough. I am just not sure our car can do that yet.
I'm fairly sure most people are just not careful enough; I see enough cars go right in front of pedestrians trying to cross the street.

Usually the problem happens on right turns.

Coming back to city NOA, I don't think Tesla is at a stage yet where they are even trying to determine the intention of pedestrians. Otherwise we'd hear Musk call that out as the most difficult problem. Currently they are still at the level of crazy traffic lights.
 
Not sure you can have proper city NoA without the ability to turn at intersections. Even just going straight through intersections, I have seen pedestrians crossing against red lights because they either weren't paying attention or they think they can do whatever they want. And then there are jaywalkers. I just don't think you can safely navigate on city streets without a very robust pedestrian detection and avoidance system; otherwise we'll see a repeat of the Uber accident.