Did you try out Automatic Set Speed Offset? Or was it already too slow for what you'd want anyway? Sounds like the driving profile might have some effect on max acceleration, while set speed should affect max velocity. Although I wonder whether the end-to-end network takes those as inputs, versus some control wrapper limiting the allowed acceleration and velocity?
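To make the two possibilities concrete, here's a toy sketch (every name and number is hypothetical; nothing here reflects Tesla's actual stack):

```python
# Toy illustration of the two architectures being speculated about above.
# Everything here is made up; none of it reflects Tesla's actual code.

def end_to_end_net(observation, max_speed=None, max_accel=None):
    """Stand-in for the real network: returns a (speed, accel) command."""
    return 35.0, 4.0  # m/s and m/s^2 -- dummy output

MAX_SPEED = 29.0  # m/s (~65 mph), e.g., derived from the set speed
MAX_ACCEL = 3.0   # m/s^2, e.g., derived from the driving profile

def plan_limits_as_inputs(observation):
    # Option A: the network sees the limits and is trained to respect them.
    return end_to_end_net(observation, MAX_SPEED, MAX_ACCEL)

def plan_limits_via_wrapper(observation):
    # Option B: the network plans freely; a control wrapper clamps the output.
    speed, accel = end_to_end_net(observation)
    return min(speed, MAX_SPEED), min(accel, MAX_ACCEL)

print(plan_limits_as_inputs(None))    # whatever the net outputs (trusted)
print(plan_limits_via_wrapper(None))  # (29.0, 3.0) -- clamped after the fact
```

The observable difference would be that Option B hard-clips at the limits, while Option A could learn to approach them smoothly.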
I did try out Auto Speed Offset for a few days, but for the routes I drive, it's just too slow. It'll pace surrounding cars just fine, but once I catch an empty stretch of road, the car will hover closer to the speed limit, then ramp up as other cars catch up and/or overtake. I find that even using the old setpoint offset limit, the car isn't zooming up to the setpoint like I'm used to v11.x doing, so this might be somewhat intended v12 behavior that I just have to get accustomed to.
 
  • Informative
Reactions: rlsd
That makes no sense. If a human tries to do something aggressive, are they also hallucinating?

Hallucination in this context is like swerving around a non-existent car or obstacle--seeing a non-existent situation and responding to it.
Point taken: not a hallucination, just a bad decision. It ignored two strong material facts it should have recognized: the double yellow line and no visibility of oncoming traffic. Somehow it has training saying it's OK to go around a stopped car, but not enough training to cover all the exceptions. It seems to severely lack the situational awareness to veto itself.

It raises the question: how many other actions does it allow itself to take while lacking the ability to know when an "OK" action is actually safe? I think this is a fundamental flaw of the NN approach until every possible scenario is somehow validated, which is impossible.

Someone said above that Tesla is training-compute limited? If true, that's insane. Train the models on all available historic (good driving) video, then test in-house with paid, trained drivers.

Letting a version out into the wild that allows a car to careen across a double yellow into head-on traffic is like shipping an airplane missing the bolts that hold a door on. It's a quality breach that should result in a grounding of the code: turn it off until you can prove it's safe. There's not really any grey area here.
 
  • Like
Reactions: powertoold
But it was a right-turn-only lane (going straight would have sent you up the off-ramp of the freeway). My interpretation is that, regardless of the law, there is a social understanding that both directions will turn at the same time, and there is some subtlety to that negotiation that is not captured in the video.
I'm just saying in general.

California law does *not* differentiate between a right-turn only scenario and other unprotected lefts. I transcribed the California code a few posts above:

The driver making the left turn must *always* yield to any oncoming driver making the right turn if the turn is not protected (which it can't be in this case since the FSD car had a green).

You are right that in this case, since it's a right-turn-only lane, there isn't really a risk of a head-on collision, and so the oncoming drivers were probably more likely to go for it than at a typical 4-way intersection. But the law is very clear that there is no protection for the driver turning left, and no allowance for them to turn at the same time just because everyone stays in their own lane.

And in general it's just a really bad idea to trust that someone turning right is going to stay in their lane or that someone turning left is going to stay in theirs--just as it's a bad idea to assume that when someone's turn signal is on that they're actually going to turn.

There was an earlier post indicating that some states allow an unprotected left at the same time that oncoming traffic is turning right, provided the drivers stay in their own lane. The only place I can think of where this makes sense is if the car turning right turns into its own lane (usually there's a protective median of some sort there). I may be naive but I highly doubt that simultaneous right turns and unprotected left turns onto a normal road are permitted by law--I'd like to see the letter of that law. If that's the case, those states have incompetent legislators and probably high insurance and vehicular manslaughter rates!
 
It didn't try to pass, and it wasn't a blind curve.

I couldn't tell if V12 was just going around the parked car right before the end of the stopped line of cars, or if the last car in line coming off its brakes without moving much made V12 think it was a stopped car it should go around. Once the brakes came back on, V12 dropped in behind it.

The steering wheel didn't give off any indication that it was confused. The movements were entirely fluid.

The blue path shows it attempted to get into the oncoming lane. The speed indicates it wasn't planning to tuck in behind the lead vehicle. It successfully corrected itself at the end, but it looked like an uncomfortably stiff brake (we could easily calculate the g-force). And looking ahead you can see the line of vehicles bending around the curved roadway.
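Since the g-force really is easy to calculate, here's the arithmetic with made-up numbers (the real speed change and timing would have to be read off the video):

```python
# Back-of-the-envelope deceleration estimate. The 25 mph and 1.5 s figures
# are guesses for illustration, not measurements from the clip.
MPH_TO_MS = 0.44704
G = 9.81                    # m/s^2

delta_v = 25 * MPH_TO_MS    # speed shed during the stop, in m/s
duration = 1.5              # seconds of braking
decel = delta_v / duration  # m/s^2
print(f"{decel / G:.2f} g") # ~0.76 g: a hard stop, though short of emergency braking
```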
 
  • Like
Reactions: JB47394
The blue path shows it attempted to get into the oncoming lane.
Thanks for pointing that out.

The noodling matches up with the car's brake lights. The brake lights go out and V12 decides that it's a stopped car that is double-parked (typical in downtown areas), so it plans to go around (despite a line of cars ahead of it). The brake lights come back on and V12 decides that it's traffic in its lane, so it plans to stop.

More training is needed.
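If that read is right, the behavior reduces to something like the toy rule below. To be clear, this is purely speculative; V12 is end-to-end, so no explicit rule like this exists anywhere in the code:

```python
def classify_lead_car(brake_lights_on: bool, moving: bool) -> str:
    # Speculative reconstruction of the behavior seen in the clip, nothing more.
    if brake_lights_on or moving:
        return "traffic in lane: stop behind it"
    return "double-parked car: plan to go around"

print(classify_lead_car(brake_lights_on=False, moving=False))  # go around
print(classify_lead_car(brake_lights_on=True, moving=False))   # stop behind
```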
 
  • Like
Reactions: kabin
Thanks for pointing that out.

The noodling matches up with the car's brake lights. The brake lights go out and V12 decides that it's a stopped car that is double-parked (typical in downtown areas), so it plans to go around (despite a line of cars ahead of it). The brake lights come back on and V12 decides that it's traffic in its lane, so it plans to stop.

More training is needed.
Yes, maybe FSD thought the cars were parked. Very similar to those narrow residential Berkeley Hills streets?
 
  • Like
Reactions: JB47394
There was an earlier post indicating that some states allow an unprotected left at the same time that oncoming traffic is turning right, provided the drivers stay in their own lane. The only place I can think of where this makes sense is if the car turning right turns into its own lane (usually there's a protective median of some sort there). I may be naive but I highly doubt that simultaneous right turns and unprotected left turns onto a normal road are permitted by law--I'd like to see the letter of that law. If that's the case, those states have incompetent legislators and probably high insurance and vehicular manslaughter rates!
For Michigan, right turns must go into the closest lane. Left turns are only restricted in the case of a one-way road onto a one-way road.
This allows simultaneous left and right turns onto a 4-lane road.
However, a double left and a single right into two lanes? That's cray cray, and I can't find a right-vs-left yield priority rule beyond just yielding to vehicles in the intersection.

[Attachment: SmartSelect_20240227_115427_Firefox.jpg]
 
I definitely get that impression from what I have seen of V12 so far. It can do some things really well--even impressively--and yet do other things really badly.
We can't, because V12 isn't telling us what its perception of reality is. We'd need to compare that to our own to know if it was hallucinating. The V11 visualization lets us spot perception hallucinations, but stuff like phantom braking remains a mystery in certain cases because V11 doesn't tell us why it's braking. It might be perception issues, but it might also be reasoning problems (derivative reactions to perception). V12 tells us nothing except what we can infer from its behavior, and that's a tough row to hoe.
I've read recently that models are getting better at answering the "why" question. Think how much better our own decision-making gets when we understand the "why."
 
  • Like
Reactions: JB47394
Whole Mars & Co do have a point. The level of negligence or incompetence it takes to get even a single strike is incredible (and likely explains the accident). And this driver got at least two, according to his own post! Many have managed to navigate years of FSD Beta use, with buggy DMS software giving premature strikes, and to deal with camera failures, without a single strike.

That said, ultimately FSD needs to work even when used by negligent and incompetent drivers. On the other hand, for this rollout Tesla had a choice, and they decided to roll it out to known-to-be-incompetent and/or negligent drivers, so it's kind of an own goal on their part, assuming it was a collision on FSD Beta v12 (which I have no reason to doubt).
Yup, it's mostly amusing because they should instead be going after Tesla's FSD processes and standards that allow stuff like this to happen; this is how you end up with the regulator stepping in to tighten things up further.
 
Point taken, not a hallucination, just a bad decision.
I think in the terminology of AI and LLMs, this sort of behavior potentially could be hallucination. It hallucinated that this was the correct path to take even though it perceived everything correctly--just a "solution" it came up with. In this case maybe it's something much more mundane; it's impossible for us to know what happened.

Hallucination can just mean making up an incorrect solution, as I understand it. It doesn’t have to mean it saw something that did not exist.
And in general it's just a really bad idea to trust that someone turning right is going to stay in their lane
In California you can usually turn into whichever lane you want (there are obvious exceptions). In this case it looks like going directly to the far lane (not quite what FSD did which was part of the problem) was legal.
 