Wiki MASTER THREAD: Actual FSD Beta downloads and experiences

This is almost spooky, since it is as if it was able to "read" the sign (which we know it can't) and make a decision.
Keep in mind that Tesla vision does already read certain signs -- like speed limit signs. Maybe "road closed" as well. If not today, then it certainly will need to in the future. This is something LIDAR -- by itself -- cannot do.

Whether it read the sign or not, Tesla vision did at least recognize the barricade or some part of it because it rendered a cone in the middle of the lane on the display.
 
Keep in mind that Tesla vision does already read certain signs -- like speed limit signs. Maybe "road closed" as well. If not today, then it certainly will need to in the future. This is something LIDAR -- by itself -- cannot do.

Whether it read the sign or not, Tesla vision did at least recognize the barricade or some part of it because it rendered a cone in the middle of the lane on the display.
It "reads" standard type Speed Signs, Stop Signs and Yield Signs. I believe that is the FULL list but Road Closed is not one yet and because there is NO STANDARD, the same for Right On Red and it will require a LOT of training to recognize all the MANY, many variations of these non standard signs. Even non standard Speed Limit, Stop and Yield are NOT recognized or confused.

Also, I have had mine head straight for a Road Closed barricade SEVERAL times (no cones), and I had to disengage rather than wait to see whether it would panic stop. I even ran it under some taped-off closed road (with signs) to see if it would stop, but it didn't. If there are no orange cones/barrels, it generally doesn't recognize construction sites/closed roads, at least so far for me.
 
Keep in mind that Tesla vision does already read certain signs -- like speed limit signs. Maybe "road closed" as well. If not today, then it certainly will need to in the future. This is something LIDAR -- by itself -- cannot do.
I agree, Tesla will need to read those signs. The problem is that it currently does not. This is not a vision-vs-LIDAR issue; all companies use vision to read those signs. If Tesla is not reading the signs, that is a Tesla issue.
 
Keep in mind that Tesla vision does already read certain signs -- like speed limit signs. Maybe "road closed" as well. If not today, then it certainly will need to in the future. This is something LIDAR -- by itself -- cannot do.

Whether it read the sign or not, Tesla vision did at least recognize the barricade or some part of it because it rendered a cone in the middle of the lane on the display.
There are a couple of speed limit signs around me that it doesn't read. They are the normal white rectangle but look just a little smaller. My wife thinks they look a little smaller too. It's nothing noticeable unless you compare a normal one to these. FSD misses them every time.
 
There are a couple of speed limit signs around me that it doesn't read. They are the normal white rectangle but look just a little smaller. My wife thinks they look a little smaller too. It's nothing noticeable unless you compare a normal one to these. FSD misses them every time.
Exactly. We humans view them relatively, and a slight difference easily goes unnoticed. With Beta, however, if a sign is NOT 100% STANDARD it is completely disregarded.
 
No doubt, lots of work remaining for reading signs. For example, signs for No Right Turn On Red!

However, I've seen it read and display temporary speed limit signs put up for constructions zones. (Temp signs low to the ground on barricades, but recognized and displayed as a regular speed limit sign.)

For the barricade in the video, it showed a cone; that seems to have worked OK in this situation. Work in progress!
 
Got a reporting question.

Does disengaging via the brake or steering wheel generate a report? Lately I've seen arguments that it doesn't, but I thought it did. Or do I need to hit the camera icon after one of those disengagements?

My gripe: I've searched the Tesla site and Google, and although I think I got a few instructions last Oct, I can't find any 'user guide' to the Beta.
 
Since getting FSD Beta in early June I had mostly been using the "chill" profile. In the first few days of using it when it was brand new to me I felt most comfortable with "chill". But "chill" has crazy aggressive acceleration after turns. After a comment by DirtyTesla in his recent video about the different behavior between profiles & "chill" making more mistakes, I decided to switch to "assertive". So far I feel like "assertive" is doing better. The car still makes many of the same mistakes, but the driving experience seems improved in "assertive".
 
Since getting FSD Beta in early June I had mostly been using the "chill" profile. In the first few days of using it when it was brand new to me I felt most comfortable with "chill". But "chill" has crazy aggressive acceleration after turns. After a comment by DirtyTesla in his recent video about the different behavior between profiles & "chill" making more mistakes, I decided to switch to "assertive". So far I feel like "assertive" is doing better. The car still makes many of the same mistakes, but the driving experience seems improved in "assertive".
I use the assertive profile and it's usually fine... but the aggressive acceleration is super annoying, especially in the city! Frankly, what shocks me is that when it passes a speed limit sign higher than your current speed, it accelerates aggressively, BUT when you use the scroll wheel to decrease speed, the car slows down very gradually (it takes several hundred feet to reach the targeted speed).

There are certain sections in my city where the aggressive acceleration occurs... one of which is literally in the middle of a rotary. I sent an email to the FSD Beta team about 6-8 months ago, but the same thing still occurs.
 
It "reads" standard type Speed Signs, Stop Signs and Yield Signs. I believe that is the FULL list but Road Closed is not one yet and because there is NO STANDARD, the same for Right On Red…
There is also no standard for this…
[Attached photo: speed limit sign with a separate, lower limit for trucks and vehicles towing trailers]


Driving on I-15 into Southern California, there are dozens of these, and of course NoA only reads the "55".
 
Tesla should be able to upgrade FSD Beta to read and follow such signs just as it already does with the general speed limit signs. Might need a little natural language processing, but doable. It could also use the rear camera to detect an attached trailer to know when the sign applies.
That should be what AI is all about. Disappointing if it can't.
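Just to make that concrete, here's a toy sketch in Python of the "apply the right limit" part, assuming the sign parsing and a trailer-detection signal already exist (all names and numbers here are made up for illustration, not anything Tesla actually does):

```python
from dataclasses import dataclass

@dataclass
class ConditionalLimit:
    mph: int
    applies_to_trailers: bool  # True if this limit is for vehicles towing trailers

def applicable_limit(limits: list, towing_trailer: bool) -> int:
    """Pick the limit that applies to this vehicle.

    `towing_trailer` would come from something like a rear-camera trailer
    detector; here it's just a boolean passed in.
    """
    matching = [l.mph for l in limits if l.applies_to_trailers == towing_trailer]
    # If nothing matches exactly, fall back to the most restrictive limit.
    return min(matching) if matching else min(l.mph for l in limits)

# An I-15 style sign: general limit plus a lower trailer limit (numbers illustrative).
sign = [ConditionalLimit(70, False), ConditionalLimit(55, True)]
print(applicable_limit(sign, towing_trailer=True))   # 55
print(applicable_limit(sign, towing_trailer=False))  # 70
```

The hard part is the perception and parsing upstream of this, not the rule itself.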
 
I was thinking yesterday that if my Google Home, Siri and Alexa can translate what I speak into text, the technology is there for Tesla to do the same with their vision. With all the cameras, software and engineers behind the scenes, I would expect our cars to handle the basics. I still love my car, and for me the pros outweigh the cons.

The technology has been around for decades. It's commonly known as optical character recognition (OCR), and it's the same sort of thing the post office uses to read addresses off of letters.

But reading the text is the easy part. The hard part is understanding how the text affects how the vehicle should drive given the context of the situation. They can hard-code behavior for the most common types of signs, like "No turn on red." But eventually an autonomous vehicle would need to be able to read and parse any given sign. Even those dot-matrix programmable road signs.
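To put a concrete (and very much hypothetical) shape on the "hard-code behavior for the most common types of signs" part, here's a minimal rule-based sketch in Python: OCR hands you a string, and a lookup turns the phrases you've bothered to hard-code into a constraint the planner could use, while anything else falls through as unknown -- which is exactly the hard part described above.

```python
import re

# Hard-coded phrases -> simple driving constraints. Anything not listed here
# falls through to "unknown": arbitrary sign text needs real language understanding.
RULES = {
    "NO TURN ON RED":             {"no_right_on_red": True},
    "ROAD CLOSED":                {"road_closed": True},
    "RIGHT LANE MUST TURN RIGHT": {"forced_turn": "right"},
}
SPEED_LIMIT = re.compile(r"SPEED\s+LIMIT\s+(\d{1,3})")

def sign_to_constraint(ocr_text: str) -> dict:
    # Normalize whitespace and case before matching.
    text = " ".join(ocr_text.upper().split())
    m = SPEED_LIMIT.search(text)
    if m:
        return {"speed_limit_mph": int(m.group(1))}
    return RULES.get(text, {"unknown_sign": text})

print(sign_to_constraint("Speed Limit 45"))          # {'speed_limit_mph': 45}
print(sign_to_constraint("No Turn On Red"))          # {'no_right_on_red': True}
print(sign_to_constraint("Detour ahead use Elm St")) # falls through to unknown
```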
 
The technology has been around for decades. It's commonly known as optical character recognition (OCR), and it's the same sort of thing the post office uses to read addresses off of letters.

But reading the text is the easy part. The hard part is understanding how the text affects how the vehicle should drive given the context of the situation. They can hard-code behavior for the most common types of signs, like "No turn on red." But eventually an autonomous vehicle would need to be able to read and parse any given sign. Even those dot-matrix programmable road signs.
That sounds like it would be programming the street signs into commands. How hard could that be? I currently do that with my Google Home when I’m setting up a device or creating a scene.
 
I'm a bit concerned about the prospects for FSD handling school zones properly. Around here, it is common for there to be an activating sign that says something like "School zone, speed limit nn mph when children are present" or "School zone, speed limit nn mph when blinking". There is also a rule that when a school zone is active your car must not pass another car. Depending on the type of school, nn can be 15, 20, or 25 mph.

Yesterday I drove to jury duty on a street at an hour when the 15 mph school zone was active (light blinking), while running FSD Beta, and the car seemed to take no notice of it whatsoever. I braked and complied.

This one may be tough for FSD to ever get right consistently, but I wonder whether Tesla has even started trying.

As with most speed limits, one sticky matter is that the actual compliance of human drivers varies quite widely.
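For what it's worth, once the sign and the beacon state are detected, the school-zone rule itself is easy to write down. Here's a hypothetical sketch (nothing to do with Tesla's actual implementation); the genuinely hard part is reliably detecting "children are present" and the flashing beacon in the first place.

```python
def school_zone_limit(base_limit_mph: int,
                      zone_limit_mph: int,
                      beacon_flashing: bool,
                      children_present: bool) -> int:
    """Return the limit in effect for a conditional school-zone sign.

    The zone limit applies if either trigger condition holds; otherwise the
    normal posted limit stands. A "no passing while active" flag could be
    handled the same way.
    """
    if beacon_flashing or children_present:
        return zone_limit_mph
    return base_limit_mph

# Example: a 35 mph street with a "15 MPH WHEN FLASHING" school zone sign.
print(school_zone_limit(35, 15, beacon_flashing=True,  children_present=False))  # 15
print(school_zone_limit(35, 15, beacon_flashing=False, children_present=False))  # 35
```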
 
This one may be tough for FSD to ever get right consistently, but I wonder whether Tesla has even started trying.

Oddly enough, they already do this for basic AP in Europe. I don't know why it hasn't made it to North America yet. For example, here's what a time-based conditional speed limit looks like in Germany:

[Attached photo: German speed limit sign with a time-of-day plaque]


And even a conditional speed limit based on winter conditions (note the snowflake in place of the clock) in Finland:

[Attached photo: Finnish speed limit sign with a snowflake plaque]
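Those plaque cases reduce to the same kind of condition check. A hypothetical sketch, assuming the clock plaque has been read into a time window and the snowflake plaque into a "winter conditions" flag (hours below are illustrative, not taken from the photos):

```python
from datetime import time

def plaque_limit_applies(now, active_from=None, active_until=None,
                         winter_plaque=False, winter_conditions=False):
    """Does a conditional limit with a time or snowflake plaque apply right now?

    Assumes a same-day time window (no ranges spanning midnight).
    """
    if winter_plaque:
        return winter_conditions
    if active_from is not None and active_until is not None:
        return active_from <= now <= active_until
    return True  # no plaque: the limit always applies

# Time plaque like the German example, e.g. limit in force 06:00-19:00.
print(plaque_limit_applies(time(8, 30), time(6, 0), time(19, 0)))   # True
print(plaque_limit_applies(time(22, 0), time(6, 0), time(19, 0)))   # False

# Snowflake plaque like the Finnish example: applies only in winter conditions.
print(plaque_limit_applies(time(12, 0), winter_plaque=True, winter_conditions=True))  # True
```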