> This is nonsense. Tesla didn't report non-highway numbers or any FSD-beta metrics/KPIs. The main reason the City Streets beta isn't in even more accidents is that it's too unreliable so far. It seems stuck at <10 miles per disengagement and <2 miles per intervention. If this improves 10x, accidents will likely skyrocket due to automation complacency and overtrust.

That's so different from my experience. I can't remember the last time FSD disengaged. I drive 95% on FSD, of which 80% is on surface streets, with very few issues over the vast majority of my miles.
Non-FSD: I drive by the same sign on AP, where it goes from 65 to 50 all the time. If a truck is on my right and blocks the sign, it stays at 65. If it sees the sign, it drops to 50. Just sayin'.
(and it shows the sign if it sees it, not if it doesn't)
> Does the MPH drop like this while using TACC, and does the car adjust speed?

Mine will slam on the brakes for the new speed limit. Whether it's reading the sign or using map data I don't know, but it slows down abruptly. This is on TACC or regular basic Autopilot.
My Feb 2022 MSLR (does not have FSD and is running 2023.2.12) does not acknowledge a different speed limit on a sign when TACC is on. However, with TACC off, the new sign is read and acknowledged. I have to reset TACC engagement (turn it off then back on) for the car to set the new speed.
> Mine will slam on the brakes for the new speed limit. Whether it's reading the sign or using map data I don't know, but it slows down abruptly.

Is this on FSD?
> Mine will slam on the brakes for the new speed limit. Whether it's reading the sign or using map data I don't know, but it slows down abruptly.

Agreed. Mine changes for speed limits, slows down in curves, and slows for caution (yellow) signs.
> Is this on FSD?

TACC or basic Autopilot.
> TACC or basic Autopilot.

What version of software is your car on?
> What version of software is your car on?

2023.2.12. It has done it in past software versions since I've had the car.
> This is nonsense. Tesla didn't report non-highway numbers or any FSD-beta metrics/KPIs. The main reason the City Streets beta isn't in even more accidents is that it's too unreliable so far. It seems stuck at <10 miles per disengagement and <2 miles per intervention. If this improves 10x, accidents will likely skyrocket due to automation complacency and overtrust.

Tesla have been reporting the total miles driven by FSD beta; if I recall, it's now over 4 million (may be higher, I'm quoting from memory). Where are all the accidents? You can be sure they would be loudly reported in the press.
> Yes, FSDb. Just because the car passes a thousand speed limit signs and changes the speed limit in the car every time does NOT mean it "read" the sign. That data is not good enough when it is also known that the car uses map data, because now you have to be able to separate those two conditions.

Yes, people have deliberately placed speed limit signs at locations and the car has responded. As for the two sources of limits (signs and maps), I don't have data on how the car resolves these, but I would guess it chooses the slower of the two. And no, my theory isn't thrown into "chaos" by it misreading one sign. Did I (or anyone else) claim the sign reading was perfect?
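The "chooses the slower of the two" guess above can be sketched as a toy resolver. To be clear, this is pure speculation about how any such system might arbitrate between a camera-read sign and map data; the function and parameter names are made up, and nothing here reflects Tesla's actual logic:

```python
from typing import Optional

def resolve_speed_limit(sign_mph: Optional[int],
                        map_mph: Optional[int]) -> Optional[int]:
    """Hypothetical arbitration: if both a camera-read sign and map
    data are available, take the more conservative (lower) value;
    otherwise fall back to whichever source exists."""
    candidates = [v for v in (sign_mph, map_mph) if v is not None]
    return min(candidates) if candidates else None

# A truck blocking the sign (sign_mph=None) leaves only map data,
# matching the "stays at 65" behavior described earlier in the thread.
print(resolve_speed_limit(50, 65))    # 50
print(resolve_speed_limit(None, 65))  # 65
```

Under this model, a thousand correct speed changes really can't distinguish sign reading from map lookups wherever the two sources agree, which is the separation problem the quoted post is pointing at.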
Yeah, and when were those tests done? As far as I remember, that was as soon as Tesla applied visual sign reading to the Model S using the Mobileye AP1 hardware. Show me a current test, something in the last 3 years, on a Model 3.
Your theory can be thrown into chaos by passing just one sign where the car doesn't change the speed limit accordingly. That doesn't mean there wasn't some other reason, but now you have to figure that out.
And let's look at the NHTSA "recall": after all the excitement and the "they will remove it" and "they will shut it down" shouting, all they actually did was ask for a few minor changes (that beta testers had been asking for anyway), and that was all. And that's from the NHTSA, which is incredibly safety conscious. Do you think they would have asked for only that if the car had been crashing or dangerous in all their testing?
> This is nonsense. Tesla didn't report non-highway numbers or any FSD-beta metrics/KPIs. The main reason the City Streets beta isn't in even more accidents is that it's too unreliable so far. It seems stuck at <10 miles per disengagement and <2 miles per intervention. If this improves 10x, accidents will likely skyrocket due to automation complacency and overtrust.

I'm also curious: as you are from Europe, have you any personal experience of the FSD beta?
> Yes, people have deliberately placed speed limit signs at locations and the car has responded. As for the two sources of limits (signs and maps), I don't have data on how the car resolves these, but I would guess it chooses the slower of the two. And no, my theory isn't thrown into "chaos" by it misreading one sign. Did I (or anyone else) claim the sign reading was perfect?

So, yes, the car reads speed limit signs. Nothing to see here.
> Tesla have been reporting the total miles driven by FSD beta; if I recall, it's now over 4 million (may be higher, I'm quoting from memory). Where are all the accidents? You can be sure they would be loudly reported in the press.

There are many, if you trust the NHTSA reports (which are public). I disagree with the press being loud wrt FSD. Quite the contrary, given the level of this bait-and-switch scam.
> So no, it's not nonsense. However, I do agree that there is a danger of complacent drivers "zoning out" as they put more trust than is warranted in FSD. This was one of the factors that drove Waymo to skip L2/L3 and go direct to L4/L5. But the same argument can be made for ANY driver assist, even dumb cruise control from decades ago.

Yes, it is nonsense, since your statement was "Yet the early numbers for FSD beta also look good". Now you're talking about miles, but that's irrelevant given the undisclosed number of incidents, which companies with autonomous ambitions typically must report to the DMV by law. But not Tesla.
> And let's look at the NHTSA "recall": after all the excitement and the "they will remove it" and "they will shut it down" shouting, all they actually did was ask for a few minor changes (that beta testers had been asking for anyway), and that was all. And that's from the NHTSA, which is incredibly safety conscious. Do you think they would have asked for only that if the car had been crashing or dangerous in all their testing?

I don't think NHTSA is done. They have decided on a strategy, which seems to be "pick on all the unsafe operations". It's a smart strategy, and really the only way forward as I see it. I think NHTSA will keep doing this until Tesla promises that they've fixed it. Since it's impossible to provide such guarantees in an unbounded ODD, NHTSA will get their chance to strike down on this experiment if they choose to. Given Tesla's track record of non-improvement in terms of reliability, this likely won't take too long... <8 miles per disengagement for over 20 months is laughably bad.
> I'm also curious: as you are from Europe, have you any personal experience of the FSD beta?

No, I haven't had a chance to get it delivered to my car, and Tesla won't be able to either in the coming 2-3 years. This is something I learned by researching the UNECE R79 amendments and drafts. My car will be six years old by then, and it doesn't even have NoA - and Tesla won't refund. They keep selling FSD in the EU to more sheep.
With regards to reliability, I look at the data. At first I looked at the reports by Dirty Tesla, and more recently at the Community Tracker: it's an alarmingly flat miles-per-disengagement curve, 4-8 miles for 30 months of public beta. Not bullish.
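For context, the miles-per-disengagement figure argued about in this thread is just total miles divided by total disengagements, aggregated across drives. A minimal sketch of that calculation, assuming a made-up input format of `(miles, disengagements)` tuples (not the community tracker's actual schema):

```python
def miles_per_disengagement(drives):
    """Aggregate miles / DE across a list of drives, where each drive
    is a (miles, disengagements) tuple -- a hypothetical format, not
    the tracker's real schema."""
    total_miles = sum(m for m, _ in drives)
    total_de = sum(d for _, d in drives)
    # No disengagements at all means the metric is unbounded.
    return total_miles / total_de if total_de else float("inf")

# 120 miles with 18 disengagements comes to about 6.7 mi/DE,
# inside the 4-8 mile band quoted above.
print(round(miles_per_disengagement([(40, 6), (50, 7), (30, 5)]), 1))
```

Note that aggregating this way weights long drives more heavily than averaging each drive's own ratio would, which is one reason different trackers can report different numbers from similar data.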
> Obviously you are an expert on it because you read the Internet!

As compared with you, a sample size of one? YouTube anecdotes? Do you have a better data source? What are your miles per disengagement and miles per intervention? How are they trending?
The community tracker definitely should be considered a source of accurate data; after all, it has awesome penetration with

[attachment 912262]

That's 3 percent of 1 percent of the drivers with it.
> As compared with you, a sample size of one? YouTube anecdotes? Do you have a better data source? What are your miles per disengagement and miles per intervention? How are they trending?

I use it every day. Been doing so for quite a while.
We all wish Tesla would report these KPIs, and we all know why they don't and won't.
> So you just choose to ignore reports of signs not being read, displayed, or reacted to?

Of course not. But it's been clearly shown that the car DOES respond to speed limit signs. That doesn't mean it is 100% guaranteed to respond to ALL signs in ALL conditions. Do you know of any system that does provide such a guarantee?