Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
What humans do a better job at is intuitively figuring out when a larger margin is appropriate.
Humans also just do a better job of keeping a consistent following distance. I don't know why, but I can't argue with the results. Somehow they can just judge distances, keep them consistent, and smoothly adjust them, effortlessly (assuming they're paying attention).

And pretty significant as the first claim of a v12 accident, however minor.
Certainly significant. It is just a matter of time, of course. But the reality is there will be few accidents which we can actually verify as being caused by FSD Beta. We just know they will happen.
Has the guy even posted a picture of v12 installed on his car as proof?

That, combined with the small rollout of v12, leads me to think he doesn't have v12 and is trying to get some "air time."

Seems reasonable to post a picture of that as evidence. It is definitely possible this guy doesn't know what version of the software he has or whether he received an update. He said this collision left damage to his car that was "barely noticeable to the naked eye." 🤷‍♂️ That being said, I think this sort of behavior turning into a spot would be inconsistent with any v11 behavior. So if it was engaged it seems like it was likely v12.

IMG_0353.jpeg


Kinda amusing that WholeMarsBlog and others are going on the offensive. Late last year the guy was receiving warnings when using FSD Beta, and I think he was suspended at one point.

I’m trying to understand how the driver failed to intervene in something as obvious as an imminent collision.

Whole Mars & Co do have a point. The level of negligence or incompetence needed to get even a single strike is incredible (this likely explains the accident). And this driver got at least two, according to his own post! Many have managed to navigate years of FSD Beta use with buggy DMS software giving premature strikes, and dealt with camera failures, without a single strike.

That said, ultimately FSD needs to work when used by negligent and incompetent drivers. On the other hand, for this rollout Tesla had a choice, and they decided to roll it out to known-to-be-incompetent and/or negligent drivers, so it's kind of an own goal on their part, assuming it was a collision on FSD Beta v12 (which I have no reason to doubt).
 
It's too bad Tesla didn't release V12 to all the beta testers; they lost out on all the data that would have been collected to feed back into the system. Beta testers are capable of managing V11 and its idiosyncrasies, so why not V12?

It's like "Pub nite". It's all about how fast you can pour vs. how fast you can swallow. The goal is to not fall off your chair.
 
Interesting case. Technically, FSD had the right of way, so it handled that turn correctly. But I think the human driver thought that they had the right of way, so they felt like FSD cut them off and got annoyed. To be fair, it does look like FSD was being a jerk even though it had the right of way. FSD probably should have yielded anyway, just to be safer and avoid any possible collision. After all, FSD could have waited for a bigger gap so as not to cut off the other driver like that. That would have been safer IMO.

Left turn yields unless there is a green arrow for left turn.
 
Left turn yields unless there is a green arrow for left turn.
True, but if the opposite direction has a green arrow (ie protected left turn), the FSD vehicle would not have had a green light.

Edit: I interpreted this as if you were suggesting that in this case it was possible the oncoming traffic had a green arrow. On rereading it you're not necessarily implying that, so ignore...I don't think I'm contradicting you or suggesting anything that you don't already know :).
 
I think FSD should have ceded that right of way in order to avoid a potential collision.
Do you really think there was significantly higher potential of collision here? The honking vehicle did not even enter the intersection by the time 12.x was already exiting the intersection. Even if there were no inner-left turn vehicles to wait for, I would think the impatient vehicle would still have honked given their road rage behavior.

12.2.1 right green.jpg


If instead the Tesla had been at a similar stage of completing an outer-left turn as the vehicle directly in front completing the inner-left turn, are you saying there would be a potential collision at a distance of two cars ahead of the honking outer-left-turn vehicle that was last to enter the intersection? If that's your threshold for a potential collision, then stopping at any red light is a potential collision for anybody behind who doesn't stop in time, but FSD shouldn't have to cede right of way there.
 
Really my biggest annoyance (and sounds like other v12 testers too) is the car's tendency to be a granny and take longer to reach my speed overage setpoint, let alone speed limit, on city streets. I messed with toggling to Assertive mode (from Average) for a bit, and all it did was make takeoffs from lights/stops more forceful, but still hesitated to reach and/or maintain faster speeds.
Did you try out Automatic Set Speed Offset? Or was it already too slow for what you would want anyway? Sounds like the driving profile might have some effect on max acceleration, while set speed should affect max velocity. Although I wonder if end-to-end takes those as inputs, versus some control wrapper limiting allowed acceleration and velocity?
 
Even with 11.4.9 I can only use it for 30 seconds before I have to disengage… V12 feels like you have your own Uber driver… Also, when you arrive home it will pull into your driveway.
How much longer would you say you keep 12.x engaged compared to 30 seconds with 11.x? Sounds like you kept it on through the destination, so do you think it's comfortable and safe enough for you to generally keep it on for say 300 seconds or more now?
 
Do you really think there was significantly higher potential of collision here? The honking vehicle did not even enter the intersection by the time 12.x was already exiting the intersection. Even if there were no inner-left turn vehicles to wait for, I would think the impatient vehicle would still have honked given their road rage behavior.

View attachment 1022251

If instead the Tesla had been at a similar stage of completing an outer-left turn as the vehicle directly in front completing the inner-left turn, are you saying there would be a potential collision at a distance of two cars ahead of the honking outer-left-turn vehicle that was last to enter the intersection? If that's your threshold for a potential collision, then stopping at any red light is a potential collision for anybody behind who doesn't stop in time, but FSD shouldn't have to cede right of way there.

I am talking about a few seconds later here. You can see the Tesla on FSD is in the path of the other vehicle. Luckily, the other driver was paying attention, honked, and slowed down, or yes, there could have been a collision.

UbwOzUC.png
 
12.2.1 shortcomings:

1) Speed stuttering with no lead car
2) Parking lot hesitations, can't find a spot, wheel turning back and forth
3) Micro-stuttery creep on unprotected left from stop sign with obstructions on sides
Hopefully these moments of confusion are generally detectable from inconsistent speed and steering. Seems like a shadow-mode trigger that finds examples of stutter which end up producing the same behavior without the stutter could send clips back from the wider fleet for 12.3 training to smooth things out. Do we know if the "Minor Fixes" 2024.2.7 update, newly going out around the time of 12.2.1, could be running some of 12.x in the background to collect data?
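For what it's worth, a trigger like that could be as simple as flagging clips where acceleration keeps reversing direction within a short window. Here's a purely hypothetical sketch; nothing is known about the actual trigger logic or thresholds in Tesla's stack, and the function names and numbers below are made up for illustration:

```python
# Hypothetical stutter detector: flag a speed trace whose finite-difference
# acceleration flips sign repeatedly, suggesting stop-and-go creep.
def count_sign_flips(speeds, dt=0.1):
    # Finite-difference acceleration between consecutive speed samples.
    accels = [(b - a) / dt for a, b in zip(speeds, speeds[1:])]
    flips = 0
    for a, b in zip(accels, accels[1:]):
        if a * b < 0:  # acceleration reversed direction between samples
            flips += 1
    return flips

def is_stuttery(speeds, dt=0.1, flip_threshold=4):
    # Threshold is invented for illustration; a real trigger would be tuned
    # against fleet data and probably also look at steering rate.
    return count_sign_flips(speeds, dt) >= flip_threshold

smooth = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]   # steady acceleration
stutter = [0.0, 0.4, 0.2, 0.5, 0.3, 0.6, 0.4]  # creep with oscillation
print(is_stuttery(smooth))   # False: acceleration never reverses
print(is_stuttery(stutter))  # True: acceleration flips every sample
```

The same idea would apply to the micro-stuttery creep at stop signs: the clip gets flagged, and a human or auto-labeler keeps only the examples where the smooth version of the maneuver was the right call.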
 
Day 2 12.2.1:

1) Still great overall
2) A bit clunkier accelerating / stopping than first drive, I'm fairly confident some of the training videos are of V11
3) Had to accel push twice at a stop sign / UPL out of driveway because it was taking a couple seconds too long and I had places to be
4) Unprotected rights are fairly aggressive on "aggressive" setting, it would try to go asap
5) Impressive parking lot navigating performance, still poor parking spot finding / decisions
6) The driving in general is totally different from V11; it's a whole new unpredictable beast with a whole different set of behaviors
 
I am talking about a few seconds later here. You can see the Tesla on FSD is in the path of the other vehicle. Luckily, the other driver was paying attention, honked, and slowed down, or yes, there could have been a collision.

UbwOzUC.png
FSD should absolutely never force another driver to slow down so FSD can take its right of way. That's asking for trouble.

(Actually, no one ever should; you never know if the other driver is looking, or if their foot will get stuck, or if they will confuse the gas for the brake. It's just safer to never depend on another driver to prevent a crash, if you can help it.)
 
FSD should absolutely never force another driver to slow down so FSD can take its right of way. That's asking for trouble.

(Actually, no one ever should; you never know if the other driver is looking, or if their foot will get stuck, or if they will confuse the gas for the brake. It's just safer to never depend on another driver to prevent a crash, if you can help it.)
You say "prevent," but IMO it should be "not cause." The other driver clearly was not giving the right of way. That not surrendering its right of way in NYC is among the worst bugs we can find says a lot, IMO, about the level of this beta release.
 
FSD should absolutely never force another driver to slow down so FSD can take its right of way
And yet that's what happens when FSD slows down forcing others behind to slow down even when these vehicles want to keep going:

This can happen at stop signs too, where people behind expect you to enter sooner or roll through, resulting in honks. Given that Tesla has been compute-limited for training, could Tesla have addressed more urgent safety issues, or avoided some of the issues people have experienced with 12.2.1, if not for other required training?
 