Quote: "I'm sure you've been frustrated getting stuck behind a slow driver before. How would you feel if every other car on the road was driving through intersections the way FSD currently does? It's NOT acceptable and not human-like either. I would like to blend in with the rest of traffic, not be a public nuisance and inspire contempt for autonomous driving."

What is not acceptable is this human behavior of anger and frustration. Do I feel it? Yes. Should I be feeling it, given how small a problem this incident created? Absolutely not; it's asinine to feel this much anger over something so inconsequential.
Yeah...I find that the overwhelming majority of the times I disengage or intervene in v12.3.6, it's not because I had to. It's because the car was taking too long at a stop sign, driving too slow (or occasionally too fast), or because it stayed behind another slow driver when I wanted to go faster and pass them.

I'm sure you've been frustrated getting stuck behind a slow driver before. How would you feel if every other car on the road was driving through intersections the way FSD currently does? It's NOT acceptable and not human-like either. I would like to blend in with the rest of traffic, not be a public nuisance and inspire contempt for autonomous driving.
Quote: "What is not acceptable is this human behavior of anger and frustration. Do I feel it? Yes. Should I be feeling it, given how small a problem this incident created? Absolutely not; it's borderline stupid to feel this much anger over something so inconsequential."

You are absolutely right. The difference of a few seconds is pretty meaningless in the grand scheme of things.
V12.3.6 almost completely eliminated the FSD Wiggle for me.

Wobble: I see the exact same lane-change hesitation and have heard it mentioned before.
Quote: "What is not acceptable is this human behavior of anger and frustration. Do I feel it? Yes. Should I be feeling it, given how small a problem this incident created? Absolutely not; it's asinine to feel this much anger over something so inconsequential."

It's human nature. It's not changing anytime soon, and it needs to be factored in. Besides, from a tech point of view, I want the car to drive efficiently and not pass up perfectly safe opportunities to advance. Whether or not they are disengagement/intervention worthy, they are still mistakes.
Quote: "What is not acceptable is this human behavior of anger and frustration. Do I feel it? Yes. Should I be feeling it, given how small a problem this incident created? Absolutely not; it's asinine to feel this much anger over something so inconsequential."

Yes! Why are we acting like humans around here? We must take our medicine. When FSD craps on us, be thankful, and ask for more.
Quote: "That's not to say there aren't a lot of issues to fix still. But I think calling it useless or worthless is a pretty big stretch. It's actually a pretty miraculous system that's better than anything else offered by any other OEM."

Useless is a harsh word, but from a utility point of view, with autonomous driving it's really all or nothing. If it has even a modest chance of making a critical mistake, the rider must stay engaged in the driving process and thus cannot use that time to be productive. From a tech point of view, I agree 100% that FSD is a marvel that accomplishes things no other system even attempts.
Quote: "IMO Tesla's FSD performance has surpassed Waymo's, as it is forced to handle much more challenging scenarios. This includes all the unreasonable 6-lane blind UPLs, needing to go 15% over the speed limit at all times, navigating through parking garages and extremely busy parking lots, etc. Many places Waymo doesn't even touch."

It does a terrible job in busy parking lots.
Quote: "Yes! Why are we acting like humans around here? We must take our medicine. When FSD craps on us, be thankful, and ask for more."

Humans are flawed. A robot will wait until there's zero chance of collision. A human will wait until there's an acceptable risk, due to a lack of patience. Right now humans want the robot to take similar acceptable risks, then demonize it if something bad happens, but also want a crash-free future.
Quote: "Another week, and still no critical safety disengagements in a long time. Some regular disengagements, of course."

You're right, it would probably make it 99% of the time. So do absolutely terrible human drivers, even drunk ones. The 99% statistic doesn't make FSD a good or safe driver.
Got me thinking: if I never disengaged or used the accelerator pedal, what percentage of my drives would "successfully" reach the destination? Sure, that would mean I'd annoy other drivers, and when FSD missed a turn or took a wrong one it would have to reroute. I think the biggest problem would be coming up to a construction zone where the officer or road-crew person wanted me to stop, since FSD doesn't respond to hand gestures. Except for that type of disengagement, I find it hard to think the percentage wouldn't be over 99%. What do others think?
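A back-of-the-envelope way to sanity-check a guess like "over 99%" (the per-mile rate and the independence assumption here are purely illustrative, not real FSD data): if necessary interventions happen at a roughly constant per-mile rate, the chance a whole drive finishes without one shrinks exponentially with drive length, so the same system can look near-perfect on short errands and much worse on road trips.

```python
# Toy model (illustrative assumption, not measured data): treat necessary
# interventions as independent events with a constant per-mile probability.
# The chance a drive needs zero of them then decays with distance.

def drive_success_rate(interventions_per_mile: float, drive_miles: float) -> float:
    """Probability that a drive of `drive_miles` needs no intervention,
    assuming independent per-mile intervention events."""
    return (1.0 - interventions_per_mile) ** drive_miles

# Hypothetical rate of one necessary intervention per 500 miles:
print(drive_success_rate(1 / 500, 10))   # short errand: ~0.98
print(drive_success_rate(1 / 500, 100))  # long road trip: ~0.82
```

The point of the sketch is that "99% of drives succeed" is only meaningful relative to drive length: a fixed per-mile failure rate compounds, so short trips dominate any per-drive success statistic.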
Quote: "For me, 12.3.6 has the same problem I've had the past few releases. In my home county it's mostly 2-lane rural roads. At many points where there is a turnoff, a bypass lane has been added so you don't have to wait for left-turning traffic. FSD dives into these bypass lanes and comes out again every time it sees one. Really frustrating, and it makes you look like an idiot to other drivers. I don't know what the thing is thinking, as you can see it pick the whole route on the screen, so it can see the new lane is only a few feet long and there is no point entering it only to be immediately forced to merge back."

I've experienced the same thing, for both v12 on city streets and v11 on highways. It seems to strongly prefer the leftmost and rightmost lanes, leading it to frequently make such mistakes (for both left and right turnoffs), when arguably the default bias when a lane splits should be to move toward the middle lanes, away from the leftmost/rightmost lanes, unless making or about to make a turn. It sometimes does this even while the nav visualization / path planner still shows the correct lane/path.
Quote: "A robot will wait until there's zero chance of collision."

Evidence needed. Extraordinary claims require extraordinary evidence.
Quote: "Humans are flawed. A robot will wait until there's zero chance of collision."

That's a flawed robot.
Quote: "Humans are flawed. A robot will wait until there's zero chance of collision. A human will wait until there's an acceptable risk, due to a lack of patience. Right now humans want the robot to take similar acceptable risks, then demonize it if something bad happens, but also want a crash-free future."

The robot will do whatever it's programmed (or, for a neural network, trained) to do. The question of how to appropriately balance risk with reward is a highly nontrivial one, for both humans and robots. An AV (or human) trained for zero risk would never leave the driveway. So no, we don't expect AVs to never make mistakes. But we do expect them to never make stupid, pointless mistakes.
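The "zero risk never leaves the driveway" point can be shown with a toy decision rule (the threshold and risk numbers are invented for illustration; no real planner is anywhere near this simple): a gate that proceeds only when estimated risk fits within an acceptable-risk budget refuses everything when that budget is exactly zero.

```python
# Toy illustration of the risk/reward tradeoff discussed above. The risk
# estimates and thresholds are invented; real planners are far more complex.

def should_proceed(estimated_risk: float, acceptable_risk: float) -> bool:
    """Proceed with a maneuver only if its estimated risk is within budget."""
    return estimated_risk <= acceptable_risk

# Even a vanishingly small risk fails a strictly-zero budget:
print(should_proceed(1e-9, acceptable_risk=0.0))   # False: never leaves the driveway
# A small nonzero budget accepts the same maneuver:
print(should_proceed(1e-9, acceptable_risk=1e-6))  # True
```

The whole design question is where to set `acceptable_risk`: zero is paralysis, and a human-like value means occasionally being wrong, which is exactly the tension the quoted post describes.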
1 in 20 miles? I haven't disengaged for weeks.
Quote: "So no, we don't expect AVs to never make mistakes. But we do expect them to never make stupid, pointless mistakes."

There may be brilliance in exquisitely trained NNs that we cannot conceive of.
Quote: "It does a terrible job in busy parking lots."

Actually, it does a good job in parking lots considering it hasn't been trained on them yet. Parking-lot support (aka ASS & Banish) is coming soon but is NOT here yet. In parking lots it is "figuring out" how to drive based on city-street training data.
In a parking garage, Navigation thought it was driving on the street outside the garage. I don't know what it would have done if I had enabled FSD, because it seemed like a bad idea to try. This was exiting after an event, so traffic was bumper-to-bumper going down the ramps with no room for error.