What’s the rollout for .2.3 looking like? Still no luck here getting it.
> Check out 9:43

Maybe I missed something, but it just looks like a normal attempted veer into traffic next to him. This happens all the time. Not a regression.
> Maybe I missed something but just looks like a normal attempted veer into traffic next to him. This happens all the time.

It literally tells you it will do this.

I love this forum's perception of safe, great technology. Hahahaha.
> It literally tells you it will do this.

I did not say anything about it being good. I just said it was normal and not a regression (context!).
> Also, this guy seems to be a very cautious tester, which is good, but it means you cannot just point out his disengagements as failures of the system. Right after this disengagement, there's a red light, the car is coasting to a stop, a pedestrian starts to cross, she gets highlighted on the screen in blue... and then he disengages anyway. The dude's welcome to disengage whenever he feels uncomfortable, but I wouldn't have disengaged in that situation.

I mean, that was pretty poor. Not sure why it didn't slow earlier and harder. Either that or go through on the yellow! Choose one or the other!
DrChaos, this could be hardware limitations. The cameras are only 1280x960. It just doesn't understand what's happening down the road the way humans do. Human foveas are much better and sit on smart double gimbals.
I don't think FSD Beta currently considers road markings. In a lot of videos, people will say that the car "sees" that it's a right- or left-turn lane (in the visualization) but makes a dumb decision despite the road markings.
Right now, Tesla is likely training their lanes network in more of an "end-to-end" approach, so the network consumes the entire scene along with all road markings and then makes predictions about the lane semantics. So the car doesn't actually "see" lane markings in the way humans do. That's why we still experience dumb lane decisions despite the road markings showing up on the visualization.
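To make the "end-to-end" point above concrete, here's a toy sketch (all names and shapes are made up; this is not Tesla's actual network). The key property it illustrates: the model never receives an explicit "road markings" input. It consumes the whole scene as pixels and emits lane-semantics scores directly, so a marking can be rendered in the visualization yet still be outweighed by everything else the network learned.

```python
import numpy as np

# Hypothetical lane-semantics classes for illustration only.
LANE_CLASSES = ["straight", "right_turn", "left_turn", "bike"]

def predict_lane_semantics(scene, weights, bias):
    """Map an entire scene image directly to lane-class probabilities.

    A single linear layer stands in for the real network; the point is
    that the input is the raw scene, not a hand-extracted marking feature.
    """
    x = scene.ravel()                  # whole scene, flattened
    logits = weights @ x + bias
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return dict(zip(LANE_CLASSES, e / e.sum()))

rng = np.random.default_rng(0)
scene = rng.random((8, 8))             # tiny stand-in for a camera frame
weights = rng.standard_normal((len(LANE_CLASSES), scene.size))
bias = np.zeros(len(LANE_CLASSES))

probs = predict_lane_semantics(scene, weights, bias)
```

Under this framing, the visualization drawing a turn arrow and the planner choosing the lane are separate predictions, which is consistent with the "sees the arrow but decides badly" behavior people report.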
> The "FSD Visualization Preview" you get on HW3 without FSDb clearly identifies and displays turn lane markings

The preview visualizations are still there, although FSD Beta seems to also draw white driveable and gray non-driveable surfaces on top of them. So I agree the arrows seem to be missing most of the time, but they can appear, such as this right turn and bike lane from a recent 10.69.2.3 video:
> I'm still on 10.69.2.2, but wondering if anyone else has seen something like this with the visuals?

Yeah, 10.69 Occupancy Network visualizations show things that are in driveable space, and AI Day showed gates and garages as brown boxes:
> I'm rather disappointed that I'm still much better at judging a "regen only" stop than FSDb is.

Yep, it's ironic how little regen seems to be used.
Oh well.
> I still doubt they prioritize snapshots. I see so many people snapping preferential / comfort / silly issues constantly. If I were Tesla, I'd prioritize hard brakes or steering disengagements (more likely safety issues).

Dirty Tesla provided some insights after chatting with the Autopilot team at AI Day:
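The triage idea being debated here can be sketched as a simple severity-ranked queue. Everything below is hypothetical for illustration — the event names and weights are invented and say nothing about Tesla's actual data pipeline; it just shows how likely-safety events (hard brakes, steering disengagements) could be reviewed before voluntary snapshot taps.

```python
from dataclasses import dataclass

# Invented severity weights: forced disengagements outrank button taps.
SEVERITY = {
    "steering_disengage": 3.0,  # driver took the wheel back
    "hard_brake": 2.5,          # abrupt braking / pedal disengagement
    "snapshot_button": 0.5,     # voluntary tap, often comfort/preference
}

@dataclass
class Event:
    kind: str
    decel_g: float = 0.0  # peak deceleration nudges the score upward

def triage_score(e: Event) -> float:
    """Base severity for the event kind plus a deceleration bonus."""
    return SEVERITY.get(e.kind, 0.0) + e.decel_g

events = [
    Event("snapshot_button"),
    Event("hard_brake", decel_g=0.4),
    Event("steering_disengage", decel_g=0.1),
]
# Highest-scoring events go to the front of the review queue.
queue = sorted(events, key=triage_score, reverse=True)
```

With these made-up weights, the steering disengagement (3.1) is reviewed first, then the hard brake (2.9), and the button tap (0.5) last.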
> Yep, it's ironic how little regen seems to be used.

It's also ironic that I routinely get 230 Wh/mi when using FSD. Not sure how to reconcile these two.
> It's also ironic that I routinely get 230 Wh/mi when using FSD. Not sure how to reconcile these two.

Are you driving an SR+? I get 300 Wh/mi on FSD Beta. Significantly worse than when I drive myself.
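For what it's worth, the gap between the two figures quoted above is large enough that car model and route probably matter more than FSD Beta itself. Quick arithmetic on the reported numbers:

```python
# Relative gap between the two reported consumption figures.
fsd_reported_high = 300  # Wh/mi, one poster on FSD Beta (SR+ question open)
fsd_reported_low = 230   # Wh/mi, the other poster on FSD

gap = (fsd_reported_high - fsd_reported_low) / fsd_reported_low
print(f"{gap:.1%}")  # -> 30.4%
```

A ~30% spread between two drivers is hard to pin on the software alone without knowing the cars and conditions involved.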