Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

FSD Beta 10.69

After Tesla releases FSD beta to all paying customers what metric can we use to measure capability?

Does the graph show miles driven by FSD Beta participants? Or miles driven with FSD Beta active? Because if it's the latter, and you hold the number of participants constant, then an increase in miles driven is a decent proxy for the participants finding FSD useful (all else equal, more use indicates it's more useful).
 
It's miles driven using FSD Beta. It's about 3 miles a day per user. The slope appears directly proportional to the number of FSD Beta users, showing approximately no change in miles driven per user. Maybe if you squint you can see that in January, after the huge increase in users, the rate was slightly higher, then it dropped off a little during the first half of this year when no users were added.
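As a rough sketch, that per-user figure can be read off the cumulative-miles graph by dividing the slope over a period by the tester count. The numbers below are made up for illustration, not Tesla's actual data:

```python
def miles_per_user_per_day(miles_start, miles_end, days, users):
    """Average daily FSD Beta miles per tester over a period,
    from two points on the cumulative-miles curve."""
    return (miles_end - miles_start) / (days * users)

# Hypothetical example: cumulative miles growing from 9M to 18M
# over 100 days with 30,000 testers.
rate = miles_per_user_per_day(9_000_000, 18_000_000, 100, 30_000)
print(round(rate, 1))  # → 3.0
```

This is why a steeper slope alone doesn't prove the software got more useful: if the user count grew proportionally, the per-user rate stays flat.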
 
For all we know safety is mostly a function of the capability of the driver monitoring system and that's why Tesla felt comfortable releasing it to more people.
Good point. Might be the plan for wide release by end of year - add more capabilities, detect defeat devices. Monitoring is definitely more rigorous than it was when we started the wider FSD Beta program.

Hopefully they can keep on tightening it up and really make it inconvenient with the wide release (especially after any inattention).

To be clear: I don’t see wide release by end of year. It’s possible they’ll make it an optional wide release (with lots of strict monitoring and a low strike limit still in place), I guess?
 
Mine stays on and the car goes straight then turns it off 😅😅
Is this at a fork in the road, by any chance?

I have had a long-term issue, pre-FSD even, on an exit ramp. My car would get in the service lane to exit. It should go straight, as shown by GPS, but it would take the wrong exit and turn right onto the wrong ramp. The service lane is very short, and the car thinks that after exiting the highway it also needs to make an immediate right when it should not.

With later versions of FSD, I come to the critical point where it must commit and go straight. The car now blinks the left turn signal once or twice, then proceeds straight. It doesn't turn left back into traffic, but it no longer screws up and turns right. I just chuckle and go with it, assuming the computer is thinking out loud to make the proper maneuver. Prior to that update, I would personally hit the left turn signal to make it go straight (same as overriding the car when it jumps into right-hand turn lanes; I still have to do this near my office).
 
I'd like to see them try. I use a sock with some ball bearings in it; they bounce around a bit, so the torque applied is uneven.

I highly doubt the company that can't figure out how to make reliable auto wipers will be able to defeat this.

Btw I only use this on straight freeways on long trips where I am paying attention.
You and @2101Guy are a pair. Let's see: There's stuff in the release notes with the beta that state, "The car will do the wrong thing at the wrong time."

You go, more or less: "I can make my own decisions! I'm immortal! Screw everyone else!"

I wonder, do you wander around without a driver's license and insurance? Do you put counterfeit license plates on the car, so you can save on the registration costs? Are traffic laws Up To You and You Do What You Want?

The people who designed the car said, "Hands on at all times. It's dangerous."

You say, "La-de-da! I can do what I want!"

Y'know, people have reaction times. We're not infinitely fast. Having one's hands on the wheel means that one doesn't have to do that therblig of getting the hands to the wheel. The people who designed the FSD-b know about reaction times and, for safety's sake, they said, "Keep a hand on the wheel, dammit!" You took that design information and threw it out the window.

Now, if you were just going to have fun immolating yourself around that next tree, that'd be fine. Go drive on empty roads, fine. But roads aren't empty: they're full of other people. Some of those are rough, tough, hatin' people, maybe, but the next minivan over is full of tiny babies. The next car over has an 80-year-old grandmother with her five-year-old grandkids and Mom in the back. She doesn't have great reaction time, but it's safe for her to drive, because nobody's expecting an out-of-control FSD-b Tesla whose driver is fumbling for the steering wheel to smack her car on the right, sending her over the guardrail and rolling down the hill.

You're a blinkin' hazard to those around you. If there were any justice in the world, you'd be out of the FSD-b program, out of a driver's license, and getting around on public transportation. Even a bicycle on the road would be too dangerous for others with you on it.

Idiot.
 
I do wish Tesla would incorporate anonymized location-based collective driving experience into its AI model.

.2 is smoother, but even today it tried to get into the left lane about 150 feet before a right turn - in traffic, and for no reason. There was no car to pass, traffic was just breezing along.

Clearly vision needs to be the primary decision maker, as road conditions can change. But somewhere, knowing that 100% of people are in the right lane before making a right turn needs to influence its decision-making model. This wasn't a complex situation with a special turning lane. Just a 4-lane road, upcoming right turn.

I did use the camera report.
 

Cry me a river.

I still get nags with my hands on the wheel on straight stretches of road. You realize throwing a sock over the wheel only works on straight roads, right? There is also no law, that I know of, saying I can't have a sock on the wheel.

Contrast this with FSDb YouTubers who only put their hands on the wheel to remove the nags when they come up.
 

Yes. My guesstimate is that over 50% of issues with FSDb are due to bad lane selection. It seems like an easier thing to tackle compared to perception, ULTs, etc, so it's strange that they aren't really addressing it. Maybe there is some underlying issue with maps or something that they cannot resolve. Wonder why.
 
WRONG on all 4 points. Why? For starters, you have zero evidence to support #1. Let's start there.
For 1-4? As indicated in every T&C, document, form, manual, and instruction... you name it: the DRIVER of the vehicle is always 100% accountable for anything that happens with the vehicle, whether the car is on AP, FSD, FSD(b), or manual mode. Whatever the case, the DRIVER is always responsible, under the law of every state in the USA (and probably every other country as well).

If a person runs over a pedestrian in a Tesla and kills them, the driver of the Tesla is 100% accountable. There is no getting out of a claim under any circumstance with the excuse that "the Tesla was on FSD at the time of the crash," with or without a weight. The law also cares not one bit whether the car was on FSD or AP, or whether something was or was not on the steering wheel. The driver will get charged with manslaughter or murder (depending on the state) and will be sued (as they should be) REGARDLESS.

Uh… everything you wrote argues my points. 🤷‍♂️

Using a device that lets you bypass the safety system which tries to ensure you ARE in control of the vehicle, so that you can operate the vehicle with LESS control, is 💯 increasing the chance of a crash.

And absolutely will land solely on you. Hard. Through every legal means.

This is the purpose of the dice and its intended use, according to the operator's own words bragging about it.
 
You have so many fundamentally flawed points that I don't even know WHERE to start. 🤣

But let's start here: are you even aware that Tesla has programmed the AP/FSD software to let you clear the nag by simply adjusting the stereo volume? Meaning no, your hands are NOT needed on the wheel. Are you also aware that, unlike other carmakers who build sensors into the steering wheel to detect the presence of hands, Tesla intentionally did not put those into their wheels/yokes?

I won't even get into the apples/oranges of comparing things that are required by law (license/registration) to something that is 100% legal to use in every single US state/territory. :D
 
And absolutely will land solely on you. Hard. Through every legal means.
That's all you had to type, because it's what I said: regardless of a counterweight on the wheel or NO counterweight on the wheel, THE DRIVER is going to be held 100% accountable for anything that happens with the car. Period. There is no question mark, no asterisk. There is no "Well, you will get hit with additional charges if you kill someone with a Tesla and you have a weight on the wheel."

No..same charges.

Next.
 
All the graph shows is that Tesla released FSD beta to more people between Oct 2021 and Dec 2021 and then again in May.
Actually what it shows is that more miles were driven.

We 💯 agree on what the graph shows. Please stop arguing points on which we agree.

But the context of a graph reveals more information behind it. (Imagine a graph showing gender birthrate in China a while back which shows 55% boys and 45% girls. That graph shows only birthrate but in the context of what we know about China during a certain period, the graph ILLUSTRATES a whole lot more than it shows.)

So the miles on FSD graph illustrates a WHOLE lot more than it shows. If you understand the context of those miles.

To be able to correlate that to capability of the system you need to be able to understand the context as to who has been driving, where, and with how many accidents.

You could NEVER have seen that many miles in that short a time on the first FSD Beta, on those different roads, with testers of those varied abilities (they wouldn't have been driving it much because it was so unpleasant). And if you did have them try that many miles, there is NO WAY there wouldn't have been more accidents.

The fact is that the better and more capable it gets at handling various driving situations and roadways the more each tester uses it. And the more testers it is released to who aren’t hyper aware and hyper focused.

This rollout is VERY risky. One bad crash will be a media and Tesla-hater feeding frenzy. (For good or bad.) Geeze, just look at the hay they made over the bollard boop. It could set the program back years in regulatory hell.

Tesla understands this. Everything they are doing with the Beta rollout is tied to this concept. They only widen it or release a major change to the most careful and trusted testers.

Before they roll it out wider they need to be sure it is going to be capable of safely handling all the varied roads and traffic situations thrown at it by a less-and-less careful/trustworthy pool of testers.

The fact that it IS being rolled out much wider and IS being used way more is illustrated by the graph, which shows only the accelerating pace of miles driven.
 

Uh this is completely wrong.

Drivers kill people every day through their own fault. If they were demonstrating full care and attention, they do not lose their insurance coverage for that accident, they do not (usually) get sued into oblivion for negligence (thanks to lower damages and insurance coverage), and they do not go to jail for negligent homicide.

All of that is WAAAAAAAAAY more likely if you are found with a device in the car that is made to let you drive with less care and attention, especially if they find you bragging on the internet that you use the device so you can exercise less care and attention.

You need to look up:

- the terms in your insurance policy
- the civil tort definition of liability
- the criminal law criteria when an accident becomes criminal manslaughter
- the criminal law criteria when an accident becomes negligent homicide
 
Nope. Tell you what: let's use real-world examples vs. hypotheticals, considering how many thousands of these devices have been sold, how many Teslas are on the road, and how many miles they have driven.

Let's start here:
Please provide evidence... any at all. From any country, any locale, any case, any article. Anything at all, I'll accept it. Anything from ~2014 to today in 2022 that supports your claim that a wheel weight does this (as you say it does):

1. Increases the chance you kill yourself and/or others.

Just one confirmed case. Just one. From any country.


Ready..set..go!
 

Uhmm, no. There are some cars from other manufacturers that don't even require you to keep your hands on the wheel, and it's perfectly legal.
 

What exactly is risky, and why?

People have been killed on Autopilot before. This didn't stop Tesla from marketing or selling AP. The driver assumes the risks, not Tesla.
 
A couple of "minor" issues can make a version annoying to use. My two most annoying issues:

1) accel and decel in certain situations is too jarring / slightly nauseating
2) unnecessary braking in random spots, difficult to predict or anticipate
Hmmm, I'm definitely not seeing such things in my area. In fact, it's far better in .2 than in .1. On an 8-mile stretch of surface streets and neighborhood roads that used to give me issues in 9 different locations, 7 of them are now corrected in .2, though it introduced 1 new issue in a traffic circle. So much better that my wife thought I was driving instead of the car. But sure, I expect it isn't going to work well for everyone, and I would like to see them fix problems in some areas out near the gigafactory.
 
All of that is WAAAAAAAAAY more likely if you are found with a device in the car that is made to let you drive with less care and attention, especially if they find you bragging on the internet that you use the device so you can exercise less care and attention.
Especially when NHTSA has weighed in and said that those devices should not be sold: NHTSA issues cease and desist for Tesla Autopilot Buddy

But most are now marketed for a different, "fake" purpose from what they are really intended to do.
 
Had a weird situation with 10.69.2 build. We have a side by side garage, that's about 100ft down the driveway. I tend to back our cars into the garage, and last night when taking the Model 3 out for a drive the Lane Departure Alarm went off as I was driving forward out of my garage at <5mph. Then as I straightened out the wheel to drive down the driveway, it went off again.

Hadn't seen that on 10.12.2 nor 10.69.1.1, so I'll try to see if it happens again tonight.
Help me out on this one; I am not understanding. Are you putting the car into FSD at your garage? If so, I wonder if you will soon be posting one of those videos where it ends up in your living room. :):eek: BUT while on the point, I tried something with FSD to see what would happen: at the end of our carport, where it meets the neighborhood road, I put it into FSD, and it actually pulled out of the driveway and onto the neighborhood road. Yeah, I know, not a big deal. I just thought it was interesting. Nothing significant, though.
 