This morning my car received the 2021.36.5.2 firmware, and with it the latest FSD beta. Today I drove 38 miles, most of that with FSD engaged. My conclusion is that this software is extremely dangerous. I've seen lots of posts from people who have been eagerly awaiting it, noting how many thousands of dollars they paid for it. My conclusion: This ain't it! The software is extremely flaky. I've never taught a human being to drive, so I'm a little uncertain in the comparison, but it seems to me that it is, at best, at the level of a teenager who's driving a car for the first time. It's dangerously inconsistent -- taking turns at high speed sometimes and too slowly at other times. It braked hard to avoid colliding with a line of stopped cars that had been clearly visible far enough away for a much more sedate braking maneuver. It frequently can't decide which lane to be in, and sometimes weaves back and forth unnecessarily. It once planted itself firmly in a right-turn-only lane when it had to go straight, then jerked the wheel back and forth as if uncertain whether to turn or try to change lanes, long after it would have been safe to do either. (On this occasion, I had to take control, and was rewarded with an FCW alert because of the car that was turning left from the other direction.)
It's best at highway speeds, but that's just plain old Autopilot. Make no mistake, though -- although both Autopilot and FSD are labeled as "beta," they're nowhere near the same level of beta. I suspect that Tesla called Autopilot "beta" as a CYA tactic, but that now creates the problem that, as something being tested outside of Tesla, FSD is technically beta -- but really it's at an alpha-test level of refinement. (I work in the computer industry, and the level of bugs I've seen in my 38 miles of driving far exceeds what I'd expect from beta software, particularly when the physical danger of software driving a car is taken into consideration.)
I was also fairly impressed at how the car handled a low-speed road around a local park, which was packed with pedestrians. It kept a safe distance from the pedestrians and maneuvered with reasonable speed. This was a task that's similar to the Smart Summon feature, which has been tested extensively for a year or so.
To be clear, I'm sure that FSD will improve with time, and testing on real roads is required for this improvement to occur. I am not trying to be critical of Tesla or even of Tesla's FSD development strategy; I don't see a way around putting poor self-driving software in cars on real roads at some point, if we're to develop self-driving capabilities at all. I am, however, saying that using the software in its current form is an extra job. It is not fun to use, it does not reduce the stress of driving, and it is not worth the $X thousand dollars (or even $200/month) that we've paid for it. It is, at best, early beta-test software (really more like late alpha-quality), and it needs to be treated very cautiously. So if you want to beta-test it, and if you feel up to the task of monitoring a machine with less driving skill than the average teenager who is learning to drive, then do so -- but do so with caution and extra alertness. You're still legally responsible for the car, and if the software drives off a road, slams into another vehicle, or kills a pedestrian, YOU are responsible. Remember that, every second, you're supervising this Flaky Student Driver.