Welcome to Tesla Motors Club
Sounds pretty much the same as v12.3.3.


[Attached: Screenshot 2024-04-11 192818.png]
 
Hell, I don't know that it makes too much difference. Even on Max, V12 seems to drive at whatever speed it decides.
For sure!
Even with auto speed off, it's driving along at 52 mph in a 60 zone with the set speed at 65, while other drivers get bored and overtake us. I'm still having to goose it to get it to get a move on; then it suddenly wakes up and accelerates hard.
Many times it drives like a human; the rest of the time, the human it's driving like is a drunk or overtired taxi driver who gets too close to the lane lines, is constantly surprised by other road users, and can't plan ahead.
 
I have seen it do that. It is a problem when there are ambiguous lane markings, such as a highway ramp lane marked with a solid white line while FSD is wondering how to get into that lane.
It sometimes makes these mistakes even when the lane markings are crystal clear. There's a spot near my home where I often need to go straight through an intersection, and there's a well-marked left-turn-only lane. The car always tries to get into that lane and go straight from it, even though the markings show up correctly in the visualization. Obviously the car can see it and should be respecting it. I hope this gets fixed soon.
 
For sure!
Even with auto speed off, it's driving along at 52 mph in a 60 zone with the set speed at 65, while other drivers get bored and overtake us. I'm still having to goose it to get it to get a move on; then it suddenly wakes up and accelerates hard.
Many times it drives like a human; the rest of the time, the human it's driving like is a drunk or overtired taxi driver who gets too close to the lane lines, is constantly surprised by other road users, and can't plan ahead.
In "poor" driving conditions it will often insist on a super low speed. I had a drive on the freeway yesterday afternoon heading into the sun (not the best visibility, but not terrible), and FSD refused to go more than 10mph _below_ the speed limit. I could keep the accelerator pressed, but this resulted in constant "Auto cruise control will not brake" warnings. So I had to just disable it. This is an example where I suspect the hardware may be the ultimate limiting factor. If FSD can't properly drive at certain times of day due to being blinded by glare, that's pretty much a showstopper for the robotaxi idea (until they upgrade the hardware suite with more robust cameras, which may or may not be retrofittable).
 
So V12.3.3 didn't fare very well in Texas for us, especially off the interstates and tollroads.
Many roads were being worked on, with lane lines that were poorly defined or missing entirely because they hadn't been repainted yet. That, plus unrestricted access to 75 mph highways, was quite troublesome.
The NN is going to need some serious training to handle those situations.
 
So V12.3.3 didn't fare very well in Texas for us, especially off the interstates and tollroads.
Many roads were being worked on, with lane lines that were poorly defined or missing entirely because they hadn't been repainted yet. That, plus unrestricted access to 75 mph highways, was quite troublesome.
The NN is going to need some serious training to handle those situations.
I recently spent a lot of time on Texas highways (with grass median) getting to and from the eclipse. It's still running the v11 stack on such highways of course, but it just LOVES to get into short left-turn-only lanes at 80mph and then swerve out of them. I hope that once highways are on the end-to-end stack, this behavior goes away.
 
This, and other examples, are a good reason to think that eventually signs will need to be held to some sort of "machine-recognizable" standards to make self-driving cars safer.
As Elon has long recognized, changing road infrastructure to suit the car's limitations is a losing battle. The reason is that there are literally thousands of different things one may encounter while driving that can be "non-standard", not just road signs, and it is not practical to standardize or even pre-define all of them. If a human is able to understand a non-standard road sign or marking, it should in principle be possible to make an FSD system that can understand it too. As one gets into the long tail of edge cases, solving them starts to require true general intelligence. This is why L5 autonomy is so hard; driving is not nearly so "narrow" a task as it may appear. But once the system starts gaining this sort of general intelligence and reasoning capability, which may still be several years off in Tesla's case, non-standard stop signs will be a piece of cake.

A quick example: I was in mostly-stopped traffic on a four-lane highway, due to road construction ahead. Periodically one lane would move a bit, then the other. FSD was constantly trying to dart into the moving lane, with "re-routing around obstacle" messages. A system with general intelligence would have a wider time horizon, and would understand the broader context of how and why the cars were moving that way, and understand that the polite and correct thing to do was to stay in one's own lane, and that a stopped car in front was not an "obstacle". We will eventually get there, but as I said, I think it is still quite a few years away.
 
OK, so what's the full story about "crowd sourced map data" ? As this has the potential of being such an 800 lb. gorilla in the room, why have I heard essentially nothing about it except an occasional mention here on this board? Someone posted recently that they were able to go in and update the maps to make FSD able to find their driveway. I'd sure love to be able to do that if I could... or maybe fix my funky intersection down the street lol.
I sent a correction to Google Maps about a road that was one-way, where routing would take me to the wrong end. About a week later, it routed correctly. More recently, I sent one where Tesla would take the middle of two lanes because the lane divider paint had worn off. I told Google it was two lanes, and lately it seems to be staying on one side or the other. Hard to be sure, because often there is a car to follow, but it is looking good.

On the other hand, there were a couple of temporary stop signs creating a short one-way lane after a washout. The signs are long gone, except in the Tesla route mapping. I recently sent a note to TomTom, but have yet to see any change. Google seems not to include signs and lights in its mapping, so perhaps TomTom is where that data comes from.

If only the mapping of where the mapping data comes from was as good as the mapping itself, then we could help more to make the maps better.... ;-)
 
Yes, it's excellent. I really like how Tesla is leaning in on FSD now.



How is that going? Happily going 90mph?



Yes, this is an impossible problem and will require AGI.
Lol, there's never a time I should be going 90 using an automated system (or driving manually). That's 20 mph above the speed limit on the interstate, while eating into my range. Also, we had the most Autopilot accidents per car sold from 2016-2019 versus today. Did the radar really help?
 
Lol, there's never a time I should be going 90 using an automated system (or driving manually). That's 20 mph above the speed limit on the interstate, while eating into my range. Also, we had the most Autopilot accidents per car sold from 2016-2019 versus today. Did the radar really help?
I for one would like to have a little more margin before getting locked out of AP until parking, for when unexpected hazardous scenarios arise while passing people on the freeway (for example, someone coming up from behind at 95-110 mph quite suddenly), which can make going above about 87 mph necessary for safety. Obviously if I get locked out when I forget to disengage in an emergency, it is not a big deal. But it does happen in fairly pedestrian situations.

Did the radar really help?
Guess not. Raise the limit to 90, it sounds like you are saying?
 
I for one would like to have a little more margin before getting locked out of AP until parking, for when unexpected hazardous scenarios arise while passing people on the freeway (for example, someone coming up from behind at 95-110 mph quite suddenly), which can make going above about 87 mph necessary for safety. Obviously if I get locked out when I forget to disengage in an emergency, it is not a big deal. But it does happen in fairly pedestrian situations.


Guess not. Raise the limit to 90, it sounds like you are saying?
Maybe it was the 90 mph that killed people? (Hence why the cops will not only ticket you but arrest you if you go any faster.)

I have never been in a situation where a car going over 100 couldn't move around me or others. The best thing to do is to ignore that car. Speeding up for safety is an oxymoron.
 
Good thing. Has anyone reported the wiggle happening with a car in the blind spot in the direction of the wiggle? I mean, FSD should see there's a car there and at least not "do the wiggle" when it would be cutting off another vehicle...
I had it start the wiggle for a lane change with a car in my blind spot. I disengaged and had to fight it a bit to keep it in the left-turn lane.

Previously it wiggled in exactly the same way at this same turn, without traffic, then gave up and handed it back to me with no warning while rolling slowly into the median.

Abject failure. Like shut down FSD level failure. Unethical to let this out in the world kind of failure.

But OK, whatevs, lol. It is both "mind blowing, fire, awesome sauce" AND "you are the driver responsible what part of L2 do you not understand, you CLICKED ACCEPT ON THE WALL OF TEXT BOILER PLATE DISCLAIMER, you should know all the ins and outs of it, dude!"
 
I have never been in a situation where a car going over 100 couldn't move around me
You are fortunate. It’s often pretty frightening. It is not that uncommon in California unfortunately. I have actually had to get out of the way this way in the past. Reducing closing speed is important, as is getting out of the way ASAP. Slowing down can be an option but gotta be quick about it and is not always possible.
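For what it's worth, the safety claim here is just closing-speed arithmetic. A minimal sketch (the 500 ft gap and all speeds below are made-up illustrative numbers, not measurements from the scenario above):

```python
# Illustrative closing-speed arithmetic (all numbers are assumptions):
# how much reaction time a small speed increase buys when a much faster
# car approaches from behind.

MPH_TO_FPS = 5280 / 3600  # 1 mph = ~1.467 ft/s

def time_to_close(gap_ft, rear_mph, own_mph):
    """Seconds until the rear car closes the gap, assuming constant speeds."""
    closing_fps = (rear_mph - own_mph) * MPH_TO_FPS
    return gap_ft / closing_fps

gap = 500    # ft, hypothetical initial separation
rear = 110   # mph, hypothetical car approaching from behind

print(f"at 87 mph: {time_to_close(gap, rear, 87):.1f} s until it's on you")
print(f"at 95 mph: {time_to_close(gap, rear, 95):.1f} s until it's on you")
```

At these assumed numbers, going from 87 to 95 mph cuts the closing speed from 23 to 15 mph and buys roughly eight extra seconds; whether that's actually the safer move still depends on having somewhere to go.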
 
Yes. Now is the time to start designing roads and signs that work best with self-driving cars. Sure, we currently have to work with what we've got, but smarter standards will help.

For example, I think just slapping TRUCK above a speed limit sign should not be allowed; my Tesla can't understand it.

View attachment 1037864
You have no idea the complexities involved in this wishful thinking.

Due to budget cuts, lane striping in Spokane County was being managed by ONE driver of the striping truck. He didn't have any idea how to choose which roads to stripe first, so he just went and did whatever he could find.

The senior engineer intervened and made him a spreadsheet with a prioritized list of streets weighted by complaints from citizens.

He said, "What's a spreadsheet?"

The senior engineer quit soon after.

How the f--- are you going to maintain any street infrastructure at the level needed for robots to drive, given the reality we live in?

Step one: stop cutting budgets. Support paying taxes for our shared infrastructure. We are hosed, full stop.

True story. Engineer is a good friend of mine.

People, autonomous driving is not happening in our lifetimes, not at the scale needed to make any difference to any meaningful safety metrics.

Y'all need to get a grip.