Welcome to Tesla Motors Club

FSD Beta 10.69

And the FSD-equipped AI bot demo seemed to have similar Parkinsonian tremors. It's probably just a coincidence, though. :)
The robot is a piece of junk. It was the equivalent of FSD: not there yet, but he's going to sell them at $20K. I can see them being used in nursing homes. We can fire the staff, and if the robot drops grandma we can pull up the FSD rule. Be ready for something really bad to go wrong. I don't see a lawyer suing the robot. Maybe the nursing home.
 
The robot is a piece of junk. It was the equivalent of FSD: not there yet, but he's going to sell them at $20K. I can see them being used in nursing homes. We can fire the staff, and if the robot drops grandma we can pull up the FSD rule. Be ready for something really bad to go wrong. I don't see a lawyer suing the robot. Maybe the nursing home.

I thought it was pretty clever, using the same FSD computer and cameras in the robot. I was driving by a driveway the other day where the path was blocked by a single thin chain hanging between two posts. I don't think FSD registered the chain as an obstacle, and it occurred to me that we as drivers achieve our understanding of the world at the human scale. A thin chain doesn't seem like a major obstacle to a heavy vehicle, but it's something you come across while walking that indicates "Do not cross." Regardless of the sensor type (camera, LIDAR, radar, ultrasonic), I think any of them would have trouble picking up the chain, but camera vision over multiple successive moving frames probably has the best odds.

So the robot might be able to start with a rudimentary understanding of the physical world via vehicle training data, but then it may be able to augment and improve that training dataset with observations taken from a human viewpoint at a human scale.
 
"
After all the complaints I've had about FSD, it dawned on me yesterday that all the times I've had to take over were times FSD couldn't get the job done. So I thought: I'm a reasonably good driver, and I can negotiate traffic problems when FSD can't, so why can't the neural network learn from ME how to drive?
Why do I have to send data (if they look at it at all) so they can "fix a routine"? It should learn from me and incorporate it.

Individual cars "learning" from individual drivers would be an absolute nightmare for any sort of troubleshooting or performance consistency.

Interventions from across the fleet are used for fleet-wide training and improvements.
 
Ah, well, back into the snake pit.

So, I'll be the first to admit that on any given day, on any given drive, I'm pretty much hammering away on that video button: five or so times when I'm taking the interstates, ten or fifteen on local drives. So, yesterday, on the way in on local roads: the usual 10-15 hits on the video button, including a couple of interventions. There's one spot where you come over the brow of a hill, the road goes from one lane to three, and the car needs to get into that center lane; nope, intervention time. Oh, well.

On the way back, however, Zero Interventions or video button hits.
  1. Local road into a T-junction with a blinking red light, needing to turn left. When I had driver's ed, back in the day, a blinking red was the equivalent of a stop sign. Every Single Time I'd hit this intersection on a left, the car would stop, jerk the wheel back and forth, and freeze, with or without traffic. Gassing it would get the car to actually go through; it'd be lined up with the correct lane, but it would still stay stuck for 30 feet or so, then come to its senses and start moving. This time: came to a halt. Crept up a bit. No traffic. And... it powered up and took the left. About 20 seconds at the intersection; it wouldn't have managed if there had been somebody behind me, but I've waited longer before and Nothing Happened.
  2. All the rest of the way home: 20-odd miles. Two stop lights with protected left turns onto the interstate on-ramp, a seriously heavy-duty merge from one interstate to another, an off-ramp to a major local road with a bad merge and a light, a couple more lights, then a left turn, then home. I usually have to take over at least once, with a couple of "that's stupid" video icon hits. This time, nothing.
It doesn't happen often; and I thought, given the vagaries of 69.2.3 and the previous releases, that it would take a major release to get through that first blinking red light in one piece, but it did it.

This morning it was back to 10-15 hits on the video button, so it's not like it's suddenly started acting massively better. Improved over the last two releases, yes, but still braking in inadvertent spots.

As to those who think the End Will Never Come: I still have serious hopes that Tesla will get there. We'll see.
 
Well, 10.69.3 put me in a dicey situation yesterday. It was coming to an intersection (two lanes each direction), making a right turn. Instead of committing, it hesitated at the last second, slowing and leaving me exposed in the lane as the light turned red. I decided to hit the gas and pull into the intersection, mostly cutting off an oncoming vehicle (angry dude in an older Chrysler minivan). He ended up speeding up next to me and staring me down. I didn't make eye contact. I was imagining what I would say to a cop if I got pulled over: "The Tesla was on Autopilot, I swear!"

So your car was turning right, stopped before pulling into intersection while waiting for oncoming traffic, and you decided to go anyway and pissed off oncoming car? I don’t think I understand…
 
Map data seems SUPER important for the success of FSD. In my area the map data must suck, because the car is constantly getting into lanes that become turn-only lanes while it's trying to go straight. Sometimes it gets in the far right lane when it needs to go left, and vice versa. 80% of my disengagements are from awful lane selection.
I think they get their maps from TomTom; check their maps to see if they're accurate for your location. Also, lane selection is handled by the planner neural net, which is very much not working right, but Tesla is working on it, so hopefully we'll see improvements as the updates roll out.
 
So your car was turning right, stopped before pulling into intersection while waiting for oncoming traffic, and you decided to go anyway and pissed off oncoming car? I don’t think I understand…
Yes, this is like FSD driving through a red light and then me claiming that FSD did it rather than me!

I see a lot of videos of very bad human driving now (see my own video above: not good driving, though no one was behind me!).

As to those who think the End Will Never Come: I still have serious hopes that Tesla will get there.
Yes, when you realize that we are perhaps just 1% of the way there, it boggles the mind how good it will be at later stages of development.
 
Lane assignment seems to be the biggest flaw in my FSD experience. My family believes that solving lane choice (so the car doesn't have to re-route because it misses a turn, or end up in a lane inappropriate for where it needs to go, etc.) would increase the utility of the car by 80-90%. A huge jump. It seems to be one of the last major issues.

I've just started cancelling inappropriate lane changes, or using my turn signal to force one where needed. It's much smoother to integrate those small interventions than to let the car pick incorrect lanes and have to disengage later.
 
Lane assignment seems to be the biggest flaw in my FSD experience. My family believes that solving lane choice (so the car doesn't have to re-route because it misses a turn, or end up in a lane inappropriate for where it needs to go, etc.) would increase the utility of the car by 80-90%. A huge jump. It seems to be one of the last major issues.
Yes, the planner NN needs serious tweaking. I know it's something they're working on, but I don't know when we'll see improvements. It's a hard problem, I think, because Elon is leaning toward driving without map data, but map data with lane information is very helpful. I don't know where they'll land once the planner starts getting some love.
 
Well, today it tried to overtake a car stopped at a red light with its brake lights clearly on. It can't get more wrong than that.
Not to excuse the behavior, but there are many times in urban areas where someone will double-park, or simply sit in a driving lane, and you need to go around them. Tesla needs to program FSD to handle these situations without trying to go around cars that are simply waiting for a light or traffic. How many times have you come up behind someone and waited, trying to figure out whether they're going to go or whether you should drive around them?
 
Yes, the planner NN needs serious tweaking. I know it's something they're working on, but I don't know when we'll see improvements. It's a hard problem, I think, because Elon is leaning toward driving without map data, but map data with lane information is very helpful. I don't know where they'll land once the planner starts getting some love.
IMO, unless it can read ALL traffic signs, understand them, and interpret traffic flow patterns the way we humans do, you NEED at least some map data to rely on. It often does a terrible job picking which lane to use, since it doesn't "see" the signs we humans use.

EDIT: sleepydoc hits on a BIG problem, especially in the city. On a four- or five-lane one-way you will often have cars/trucks double-parked, or construction blocking a lane. Beta will drive in the far lane, and even when I can CLEARLY see a blockage up ahead, it just barrels forward until it's out of space, then slows/stops to get over.

Here is a human handling it correctly.

 
IMO unless it can read ALL traffic signs, understand them and interpret traffic flow patterns the way we humans do you NEED at least some Map data to rely on. It often does a terrible job with which lane to use since it doesn't "see" the signs we humans use.
I read somewhere, a long time ago, that sign reading was a problem due to a contract with Mobileye from the days of the AP1 partnership. I wonder if there is some legal issue they are working through to this day, or perhaps they are waiting for some timeframe in the old contract to pass. It may just be a licensing issue, I'm not sure, but it could explain why sign reading has so far been limited to speed limit signs.
 
IMO, unless it can read ALL traffic signs, understand them, and interpret traffic flow patterns the way we humans do, you NEED at least some map data to rely on. It often does a terrible job picking which lane to use, since it doesn't "see" the signs we humans use.
I think it needs even more than that, since as humans we often don't understand what to do from signage alone and only learn through trial and error. So the car needs a memory, and the ability to learn from it, too.
 