Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Guess v11 is broken for you. No idea why it would have so many problems in that use case.

So you see this eyes on the road AND with both hands on the wheel (at 9 and 3, not at the bottom, in your lap)?
For example, this shows none of that. The nags are infrequent, though as I said they need to work on driving down false positives (perhaps one here).
It isn’t always like that - I’ve actually noticed significant variability in the nagging with no clear reason why. On the instance I described I was sitting, looking straight forward with my hands on my knees next to the wheel, positioned so I could reach up with my fingers and tug the wheel whenever it started to nag. I’ve tried holding my hands on the wheel and there was no change in the nag. I actually tried covering the camera with my thumb and shortly got a blaring “malfunction! Take over immediately!” alarm that kept going off even though I disengaged FSD completely. (It wasn’t a time out because I could reengage as soon as I uncovered the camera.)

There's less stop sign confusion but even that wasn't perfect
Except humans routinely get confused at stop signs. How often do you see multiple cars at a 4 way stop and the drivers are unsure of who should go first, with 2 cars starting and stopping then one driver waving at another to go?
This doesn't track. They are obviously training on individual areas, as we see test drivers testing Chuck's UPL.

It's way too early to say "they did it". We are still seeing a level 2 ADAS that makes mistakes on curated drives. I don't see anything that suggests robotaxi is imminent. TBH, it appears to be another small step from 11.4.9 comparing his videos, but hopefully the new backend will make substantial changes easier, as progress with FSD has been incredibly slow.
Testing and training in specific areas is not the same as specifically programming heuristics for individual sites or situations. The former is generalizable; the latter is not.
 
No, it's not. You are making leaps from what they have said to what you believe.

Tesla absolutely hired test drivers in the 2nd half of last year. The positions were posted here. We've seen them on the roads in videos.

Never once has Tesla said they have employed drivers out there providing training data for V12, lemme know if you have other evidence

At this point, I'm more correct

Also, I already told you that testing is different than training
 
Never once has Tesla said they have drivers out there providing training data for V12, lemme know if you have other evidence

At this point, I'm more correct

Also, I already told you that testing is different than training

Tesla has been hiring drivers since July to help FSD. We obviously see those drivers are still working as of December.
 
There's no more concept of overfitting wrt V12; overfitting was about the heuristics in V11
There are different usages of "overfitting", and I would guess Elon Musk saying it comes from the machine learning technical meaning (actually mathematics) as opposed to how non-machine-learning people might use that word. Overfitting can happen when a neural network is "too big" and training results in various neurons picking up on noisy signals that aren't actually relevant to the desired behavior. One way to avoid overfitting is to have more diverse training data.

V12 neural networks might actually make the potential for overfitting worse, but Tesla's training process, with many millions of driving videos from around the world, probably helps reduce overfitting.
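For anyone who hasn't seen the ML meaning in action, here is a minimal sketch (plain NumPy curve fitting, purely illustrative and nothing to do with Tesla's actual training stack) of both points above: an over-parameterized model trained on too little data memorizes noise, and more training data reins it in:

```python
import numpy as np

rng = np.random.default_rng(0)

def poly_errors(n_train, degree=9):
    """Fit a degree-`degree` polynomial to n_train noisy samples of sin(x);
    return (training MSE, held-out test MSE)."""
    x_tr = rng.uniform(0.0, np.pi, n_train)
    y_tr = np.sin(x_tr) + rng.normal(0.0, 0.1, n_train)  # noisy labels
    coeffs = np.polyfit(x_tr, y_tr, degree)
    x_te = np.linspace(0.0, np.pi, 200)
    train_mse = float(np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2))
    test_mse = float(np.mean((np.polyval(coeffs, x_te) - np.sin(x_te)) ** 2))
    return train_mse, test_mse

# 12 samples for 10 free parameters: the fit chases the noise ("overfitting").
small_train, small_test = poly_errors(n_train=12)
# Same model size, 500 samples: the noise averages out ("more diverse data").
large_train, large_test = poly_errors(n_train=500)

# The overfitting signature: near-perfect on the training points,
# much worse on unseen data; the data-rich fit closes most of that gap.
print(small_train, small_test, large_test)
```

The degree-9 polynomial has 10 free parameters, so 12 samples is barely enough to do more than memorize them, noise included.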
 
There are different usages of "overfitting", and I would guess Elon Musk saying it comes from the machine learning technical meaning (actually mathematics) as opposed to how non-machine-learning people might use that word. Overfitting can happen when a neural network is "too big" and training results in various neurons picking up on noisy signals that aren't actually relevant to the desired behavior. One way to avoid overfitting is to have more diverse training data.

V12 neural networks might actually make the potential for overfitting worse, but Tesla's training process, with many millions of driving videos from around the world, probably helps reduce overfitting.

When the idea of overfitting to the Bay Area first came up (I think in an Elon post), I thought it was regarding the heuristics and road geometries familiar to the Palo Alto engineers working on FSD.

But ya, there's also a different kind of overfitting that relates to NNs
 
Except humans routinely get confused at stop signs. How often do you see multiple cars at a 4 way stop and the drivers are unsure of who should go first, with 2 cars starting and stopping then one driver waving at another to go?
If there were one question on the driver test asking which car at a four-way stop has the right of way, most humans would fail. FSDb should pass this test with 100%. BTW, the person on the left must yield to the person on the right ... i had to look that up 😅
 
If there were one question on the driver test asking which car at a four-way stop has the right of way, most humans would fail. FSDb should pass this test with 100%. BTW, the person on the left must yield to the person on the right ... i had to look that up 😅
The car arriving first has the right of way. If two cars arrive at the same time the car on the right has the right of way. If 4 cars arrive at the same time the drivers need to get out and do Rock-Paper-Scissors to decide who goes first.

The real problem arises in that it’s often not completely clear who arrived first, especially since humans usually don’t fully stop. In these situations I’ve developed the habit of intentionally stopping a bit harder, so the car jerks enough that the other driver can see I stopped later and assumes they have the right of way. Usually works, but not always.
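Joking aside, the two rules above (first to arrive goes first; on a simultaneous arrival, yield to the car on your right) are simple enough to encode directly. A toy Python sketch, with invented names and a deliberately simplified two-car model:

```python
from dataclasses import dataclass

# Which approach is immediately to a car's right, keyed by the compass side
# the car arrives FROM: a car arriving from the south faces north, so the
# approach on its right is the east one, and so on around the intersection.
RIGHT_OF = {"S": "E", "E": "N", "N": "W", "W": "S"}

@dataclass
class Car:
    approach: str     # compass side the car arrives from: "N"/"E"/"S"/"W"
    arrived_at: float  # arrival time at the stop line, in seconds

def goes_first(a: Car, b: Car) -> Car:
    """Right-of-way between two cars at a four-way stop:
    earlier arrival wins; on a tie, yield to the car on your right."""
    if a.arrived_at != b.arrived_at:
        return a if a.arrived_at < b.arrived_at else b
    # Simultaneous arrival: if b is on a's right, a must yield.
    return b if RIGHT_OF[a.approach] == b.approach else a

# First arrival wins regardless of position.
print(goes_first(Car("N", 0.0), Car("E", 1.0)).approach)  # -> N
# Tie: the car from the south has the car from the east on its right.
print(goes_first(Car("S", 2.0), Car("E", 2.0)).approach)  # -> E
```

Of course, the hard part the post describes (deciding who actually arrived first when nobody fully stops) is exactly what this toy model assumes away.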
 
Never once has Tesla said they have drivers out there providing training data for V12, lemme know if you have other evidence
Ok, Tesla hasn't explicitly said it, but they are hiring for that position now...
What to Expect
We are looking for a highly motivated self-starter to join our vehicle data collection team. The Vehicle Operator role is responsible for capturing high-quality data that will contribute to the improvement of our vehicles' performance. As a Vehicle Operator, you will be driving an engineering vehicle capable of dynamic audio and camera data collection for testing and training purposes. Access to the data collected is limited to the applicable development team.
 
The problem is the nag-o-matic becomes disruptive. I've complained about this before - if they are doing gaze tracking and it can tell I'm looking at the road, why does it nag me every 10 seconds? (Yes, I've timed it.) I've had it nag me when I'm looking down to change the climate settings. I've had it nag me when I look down to see what the alert that just popped up was (the alert said "pay attention to the road" 🙄). I've had it nag me when I was looking at the nav screen to make sure I would have enough battery to make it home.

The latest changes really have made it more of a chore to use FSD without improving my focus (and at times disrupting my focus.) As such I view them as a failure.

Obviously keep eyes on the road or wear sunglasses, and now additionally I hang my wrist on the steering wheel
Prior to 12, this works for a very long time
Wrist hurts after a long while
 
Obviously keep eyes on the road or wear sunglasses, and now additionally I hang my wrist on the steering wheel
Prior to 12, this works for a very long time
Wrist hurts after a long while
I’ve tried. I haven’t found a way to hang my wrist on the wheel that prevents the nag, doesn’t disengage and is comfortable for me.

Again - if they can detect gaze, why do they need anything else?
 
Again - if they can detect gaze, why do they need anything else?
Because hands on the wheel are critical for controlling the car. It can make very sudden moves, and having a secure hold of the wheel on the spokes ensures this will result in near-immediate disengagement. There’s no way around this. You cannot just be looking at the road. Not holding the wheel also makes disengagement slower and less likely if anything unusual occurs, which can be dangerous.

Anyway I tried today with hands off in my lap (a very bad place for them) and everything seemed fine. Occasional prompts to torque the wheel. Just can’t look at the screen - it can directly warn you to pay attention and I think it may increase likelihood of wheel torque input requirement (not sure about that).

And I didn’t see anything unusual in the v12 videos either. Maybe it nags a bit more but hard to say with such a small sample size.

Just put hands at 9 and 3. It is super comfortable and makes you feel like you are in full control (which you are). At least try it for a while to see whether the issues are reduced.
 
I think what Mistral AI is doing is important. They're addressing the "why" question in LLMs. Helps the model understand context and reasoning, relationships and meaning. More than just memorizing patterns. It's called "concept-based learning."

Concept based learning breaks down text into core concepts and then teaches the model to understand the relationships between the concepts. Ipso facto logic and reasoning.

Mistral is also working on "causal inference." Helps the model understand cause-and-effect between actions and events.

To me, this is the bridge between LLMs and FSD. Baby AGI.

I'd be very surprised if Tesla doesn't have similar capability.
 
Ok, Tesla hasn't explicitly said it, but they are hiring for that position now...

Well, the point of contention is here:

Is Tesla employing drivers to create "good" driver videos to add to the V12 dataset

Or

Is Tesla curating the good driver data from cars in the fleet

Perhaps it is both, but we don't have any evidence of the former. Elon and Ashok talked about curating videos of good drivers, along with the difficulty of finding drivers who actually stop at 0 mph. They never mentioned hiring people to go out there and stop at 0 to get videos.

Your job posting could be related to anything. We already know that Tesla has data collection cars out there with lidars attached, so that job posting may have nothing to do with employing drivers to add to the V12 dataset.
 
True, there is a clear improvement in comfort and driving style. But the issue is that the Tesla faithful don't see it as that. As usual, on the eve of every new version they proclaim it a "Technological breakthrough on a scale never seen before. Game, set, match!"
So by "Tesla faithful", you mean YouTubers shilling for views?

I'm a hard core follower of FSD progress and I've never thought any one release was a breakthrough.

However, I do see V12 as the most important release yet. With V12, I think we have a release that is far less likely to hit another local maximum. If it does, it will probably be due to hardware limitations. I've never been able to say that about any previous release.