I interpret it the other way: FSD doesn't prevent the driver from exceeding the speed limit.
It's a designed-in issue that it doesn't slow down with FSD but works great with AP.
AP has had the 5mph over limit for years, but FSD threw it away.
For good reason, of course. FSD appears to be much more aggressive about speed limit sign detection, and it tends to get the signs wrong with alarming regularity. If it hadn't thrown that limit away, we'd all be driving with our foot on the gas pedal because the AI hallucinated that it was driving a truck or pulling a trailer.
One of my biggest complaints about FSD beta is that it ignores school speed zones. That alone should trigger a NHTSA recall.
That is a nearly unsolvable problem. How do you handle signs that say "Speed limit when children are present"? Nope. Not going to happen any time soon. Nor are cars realistically going to be able to handle "Speed limit from 7:30 to 8:30 and 2:30 to 3:30 on school days" signs, because the car has no idea if school is in session or off for the holidays/summer/weekend/*.
School zones are one of those situations where, if cities want autonomous cars to follow those restrictions reliably, they're going to have to upgrade the signs with flashing yellow lights indicating that the speed limit is active. And really, such requirements ought to become federal road standards, with states, counties, and cities strongly encouraged (read "threatened with defunding") to follow them.
Musk has an uncontrollable propensity for exaggeration. FSD and Santa Claus have always been in the same league: myths. FSD has struggled -- and failed -- to achieve L3 autonomous driving. It will never attain L4 or L5 autonomy -- or "FSD" as Musk has peddled it. First, the entire traffic infrastructure (roads, traffic lights, etc.) will need to be updated to support geofencing, which in turn will enable vehicles to reliably communicate with each other -- and with the road.
Umm... no. Even ignoring the fact that GPS is more than adequate for geofencing (with the exception of tunnels, for which you kind of need
other solutions), there is absolutely no benefit whatsoever from cars communicating with one another or with the road in the general case. The reason that the industry has repeatedly shot down proposals for such communication is that the whole idea is fundamentally contrary to safety and security.
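To make the geofencing aside concrete: ordinary GPS fixes are enough to test "is the car inside this zone?" with a plain distance check, no roadside infrastructure required. Here's a minimal sketch; the zone coordinates, radius, and function names are all illustrative, not anything Tesla actually uses.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_geofence(lat, lon, center_lat, center_lon, radius_m):
    """True if the GPS fix falls within radius_m of the zone center."""
    return haversine_m(lat, lon, center_lat, center_lon) <= radius_m

# A fix roughly 100 m from the zone center is inside a 150 m fence:
print(in_geofence(37.4220, -122.0841, 37.4229, -122.0841, 150))  # True
```

Consumer GPS error is typically a few meters, far smaller than any sensible zone radius, which is why tunnels (no signal) are the main exception mentioned above.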
If you actually want any level of safety at all, cars have to be able to drive safely using
only what they can see or otherwise perceive directly. Almost without exception, relying on any sort of communication with other cars or road infrastructure breaks that assumption and creates an unfixable security hole.
If you have vehicle-to-infrastructure communication, any bozo who could steal one of the boxes would be able to send out a fake signal that says "The speed limit is 90 miles per hour" right before a school, and cars would follow it. If you have vehicle-to-vehicle communication, any bozo could easily obtain one of those boxes and send out a signal that says "I'm braking hard," and cause traffic to pile up on the freeway. And there is absolutely no feasible way to prevent those sorts of tampering for precisely the same reason that unbreakable DRM is impossible.
The closest you will ever see to communication between vehicles would be pushing notifications about potholes, ice, and other problems up to a server, such that if a large enough number of vehicles report similar behavior, the server can recognize that the reports are probably legitimate and tell cars to slow down a little bit. And even then, it will involve
heavy server-side filtering to require adequate concurrence between devices that show evidence of moving independently. Otherwise, you'll have incidents like that guy
pulling a wagon full of cell phones who caused fake traffic jams on Google Maps.
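The filtering described above can be sketched in a few lines. This is a toy illustration of the idea, not any real service's logic; the thresholds, field names, and the crude "distinct motion states" independence check are all made up for the example.

```python
from collections import defaultdict

MIN_INDEPENDENT_REPORTS = 5   # how many concurring devices confirm a hazard
MIN_SPEED_MPS = 3.0           # a wagon full of phones all moves at walking speed

def confirmed_hazards(reports):
    """reports: iterable of (segment_id, device_id, speed_mps, heading_deg).

    A hazard on a road segment is confirmed only when enough distinct
    devices report it AND those devices show distinct motion states --
    i.e., evidence of moving independently rather than being one box
    (or one wagon) replaying the same signal.
    """
    by_segment = defaultdict(dict)
    for segment, device, speed, heading in reports:
        if speed < MIN_SPEED_MPS:   # drop devices not plausibly in vehicles
            continue
        by_segment[segment][device] = (speed, heading)

    confirmed = set()
    for segment, devices in by_segment.items():
        motion_states = {(round(s, 1), round(h)) for s, h in devices.values()}
        if (len(devices) >= MIN_INDEPENDENT_REPORTS
                and len(motion_states) >= MIN_INDEPENDENT_REPORTS):
            confirmed.add(segment)
    return confirmed

# Five independently-moving cars confirm segment "A"; ten phones in a
# wagon on segment "B" are rejected because they all report walking speed.
cars = [("A", f"d{i}", 10.0 + i, 90.0) for i in range(5)]
wagon = [("B", f"w{i}", 1.0, 0.0) for i in range(10)]
print(confirmed_hazards(cars + wagon))  # {'A'}
```

Note that even this sketch only ever downgrades trust (slow down a little); it never pushes a higher speed limit, which is exactly why it avoids the spoofing problem of direct V2V/V2I signaling.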
So, don't be so gullible when Musk makes claims like "Robotaxi will launch by the end of 2020," etc., etc., etc.
This part I agree with. Any attempt to nail down an exact date when things will be good enough is pretty much guaranteed to be wrong.
Tesla did do a recall to fix the rolling-stop behavior.
Tesla proactively issued a recall for certain US and Canada vehicles that received software version 2020.40.4.10 or newer that contained the Full Self-Driving (Beta) feature. This recall affects only those FSD Beta participants who installed this software. The release software update introduced...
However, that seems to be another delay tactic, because rolling stops are still an issue in this latest recall.
It isn't a general problem, but there are specific stop signs that mine tends not to stop for, usually right after turning onto the road in question, e.g. turning out of a row of cars into a road-like area in a parking lot and immediately encountering a stop sign before leaving the lot for an actual street. My guess is that, because of where those signs are positioned, it incorrectly concludes they don't apply to it.