FSD Beta Videos (and questions for FSD Beta drivers)

I'm one of the many still on 10.8.1; it's been 40 days since an update for me. Whenever this next update comes, it had better be good!
Call the NHTSA. They're mostly to blame. Tesla is obviously being targeted with petty recalls.
Absolutely.

I think we should make our displeasure known. After all, NHTSA is recalling with zero evidence, just based on one reporter or hearsay.
 
Timecode 0:22 to 0:30, where the car runs a red light, is an entirely different intersection. I couldn't place it, but it's a different time and place. We have no other information on whether he manually ran the red light or FSD Beta did, since the FSD display is not shown, except for his statement that FSD ran the light.
The video actually shows the car INSIDE an intersection with the light at red. Was the light red when the car entered the intersection? There is another car in the intersection going the other way, which tends to suggest not (unless it also ran a red light at the same time). In addition, since (as others have noted) this was clearly an anti-FSD video, wouldn't they have shown the car ENTERING the intersection on red if it had done that, since that would be more damning?
 
Redlightgate. We need CNBC to release all the footage so we can get to the bottom of it!
The fact is FSD Beta will occasionally run red lights, but only if the safety driver lets it; that's why it's BETA. Running red lights is the easiest FSD Beta failure to correct and report.
It does seem like Tesla should be kicking people who are incompetent off the beta. Of course this guy would argue that he's being kicked off the beta because he's speaking negatively about Tesla.
 
I'm most skeptical of red-light running, mainly because we'd hear more about it if it happened often. I mean, we see a lot of problems with unnecessary lane changes, taking unprotected left and right turns at the wrong time, wide turns, hesitation, etc., so I'd expect all of those on such a "demo" run on CNBC. But running a red light...?

P.S.: FSD definitely runs yellow lights. I almost always stop when I see a yellow light, but FSD continues if it can make it through. I don't know whether it's dumb luck or whether they calculate to make sure they can cross the intersection on yellow (or are assuming it's OK to still be crossing when it turns red, as long as the light was yellow when they crossed the line).
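The "can it make it through the yellow" speculation above amounts to a simple kinematics check. This is purely illustrative — nothing here reflects Tesla's actual planner, and every input (speed, distances, yellow duration) is a made-up number:

```python
def can_clear_on_yellow(speed_mps: float, dist_to_stopline_m: float,
                        intersection_width_m: float, yellow_s: float) -> bool:
    """True if, holding its current speed, the car would fully clear
    the intersection before the yellow phase ends."""
    if speed_mps <= 0:
        return False
    time_to_clear = (dist_to_stopline_m + intersection_width_m) / speed_mps
    return time_to_clear <= yellow_s

# ~34 mph, 10 m to the stop line, 20 m wide intersection, 4 s yellow:
# (10 + 20) m / 15 m/s = 2 s, so the car would clear.
print(can_clear_on_yellow(15.0, 10.0, 20.0, 4.0))  # True
```

Whether the planner does anything like this calculation, or simply reacts frame to frame to the light state it sees, is exactly the open question in the post above.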
 
FSD Beta will occasionally (xxxxxxx) but only if the safety driver lets it.

Isn't that true of just about everything, though? Can you think of anything FSD Beta does that is entirely uncontrollable by the driver? In fact, if there were, that would probably be grounds to immediately stop using it. I don't think there is anything that can't be overridden by an alert driver. The fact is, it does run red lights; we've seen this in multiple videos, along with driving into oncoming traffic, passing stopped cars over a double yellow, etc. It has also been seen to drive well.

Aren't we holding FSD Beta to some kind of hands-off testing expectation? Otherwise, what's the point? Sure, don't let it DO the dangerous things, but isn't it reportable that it's ATTEMPTING to do dangerous things?
 
It's reportable, but if the video is mainly a cut of the "dangerous" things, then that's hardly a fair look, especially if it suggests it's uncut and a balanced view.
 

Mine has tried to run red lights several times before, typically at slightly convoluted intersections. Two were memorable: one it attempted to treat as a stop sign; at the other I was behind a Cruise car which caught a very late yellow (red right after it crossed) and FSD Beta did not react to it in any way. I disengaged both times, so I can't prove that it would have run them.

I think what was common between the two was FSD Beta's concept of where the stop line was. At the one it tried to treat as a stop sign, it stopped way short of the actual stop line, and once I nudged it to the place where a human driver would actually stop, I bet it assumed it was no longer bound by stoplight rules and started creeping into the intersection to make the turn once things were clear. (I also tried disengaging and re-engaging with the same result, so it wasn't the act of nudging itself that caused it to go.) In the other case, the first stoplight is well before the intersection, and there is no actual stop line at the intersection itself, only a crosswalk, so my guess is it thought it was already crossing the intersection on a yellow and thus wasn't planning to stop where humans normally would.
 
Yeah, I'm skeptical that it happened while they were filming. I bet what happened is that the car got stuck in the intersection and the light turned red. FSD Beta will definitely try to run red lights, though. Just do a search of this thread. For example:
Well, some of us believe the goal is robotaxis, so yes. However, it's silly to try to correlate the number of failures with the safety of testing. I would argue that the biggest safety issue with FSD Beta right now is Tesla fans trying to make zero-intervention videos and Tesla haters trying to make fail videos (though the latter is a much smaller number of people). There is the video of the driver purportedly on FSD Beta (I believe he was) overcorrecting after the car crossed a double yellow, so that's a potential safety issue. Of course, any safety issues may be more than offset by the hypervigilance induced by driving a system that's this erratic. I believe the real safety problem will come when the system gets much better and people trust it too much.
Say a Waymo vehicle was hitting a pedestrian every 1000 miles. Should the media have to show a thousand miles of problem-free driving before reporting it in order to be balanced?
 
That's not what I'm saying. If it were just a short blurb in the news, that would be a different thing. However, this is a long video that presents itself as a comprehensive, largely uncut, representative review (designed for a general audience), not something trying to put FSD in a bad light. That's completely different.

Same thing with compilation videos. If the video were, for example, a compilation of all FSD failures (or, on the flip side, successes), and was titled as such, then the viewer would come in with the proper expectation.
 
Here is an interesting video analyzing the data collected about FSD Beta trips. It's unfortunately named "robotaxi report".

For 10.8.1 he says a full 50% of trips had zero engagements, across 328 trips in all, so not a trivial sample.

[Attached chart of the trip data]

Here is the video ...
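The headline number is just the share of trips with zero engagements, which anyone can recompute from per-trip counts. A toy sketch (the trip list below is invented, not the video's data):

```python
# Hypothetical engagement counts per trip (0 = zero-engagement trip).
trips = [0, 0, 2, 1, 0, 3, 0, 1]

zero_share = sum(1 for t in trips if t == 0) / len(trips)
print(f"{zero_share:.0%} of {len(trips)} trips had zero engagements")
# prints "50% of 8 trips had zero engagements"
```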

 
Re: red-light running — we've seen Beta do this as recently as 10.9, and I have little doubt it still will. Dirty Tesla has a recent video of his vehicle coming to a stop at a big intersection, likely misinterpreting an adjacent green light as a green for his left-turn lane, and trying to proceed through a red before he disengages.

The system obviously knows not to plow through a correctly identified red light, but it does not correctly identify lights in all cases.
 
I had a very interesting experience today with FSD shifting into reverse. Yes, I know the ability to go into reverse has been greatly debated, and from the videos posted so far I have been skeptical. But today I had it go into reverse, and it was quite undeniable and aggressive. Furthermore, it did not happen when trying to exit a tight parking spot, or at an intersection where it had pulled too far out. Rather, this happened in the middle of a road: slowing, then reversing, from 41 mph.

There was a truck in the opposite lane of a two-lane road. While the truck was not over the centerline, it was very close. If I had been driving myself, I wouldn't have thought twice about continuing. But for whatever reason, FSD predicted a conflict with the truck. So my car came to a stop, then reversed several feet until the truck passed. Then it continued forward.

Unfortunately, since this happened rather quickly and was totally unexpected, I didn't get the chance to look at the display and see what the PRND indicator said, nor did I attempt to hit the accelerator while the car was backing up. So those behaviors are still unknowns.

And before someone chimes in and says "the car just coasted backwards," note that I was going down a rather significant hill; in fact, after the car initially stops, it coasts forward a few inches before reversing.

I am attaching the front, left-repeater, and rear camera views. How quickly and aggressively the car backed up is most apparent from the repeater and rear views.


 


Much like the "rolling stop," this "reversing on FSD Beta" is a function that needs to be explained. When exactly does it reverse, how does it decide, and does it do this safely?

Since Tesla only seems to respond when forced, I would like to see someone with FSD Beta report this to the NHTSA and get an official response from Tesla, ideally before anything dangerous happens. It only took one person to get Passenger Play investigated.

Tesla apparently has rules for these behaviors, so what are the rules for reversing?

e.g.
NHTSA Recall No. : 22V-037
...
The "rolling stop" functionality is designed to allow the vehicle to travel through an all-way-stop intersection without coming to a complete stop when several operating conditions are first met. The required conditions include:
1. The functionality must be enabled within the FSD Beta Profile settings; and
2. The vehicle must be approaching an all-way stop intersection; and
3. The vehicle must be traveling below 5.6 mph; and
....

etc
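The recall text reads as a simple conjunction of gating conditions. As an illustrative sketch only — the function and parameter names are invented, and the recall lists more conditions than the excerpt quotes:

```python
def rolling_stop_permitted(setting_enabled: bool,
                           approaching_all_way_stop: bool,
                           speed_mph: float) -> bool:
    """All listed conditions must hold at once; the further
    conditions elided in the quoted excerpt are omitted here too."""
    return (setting_enabled
            and approaching_all_way_stop
            and speed_mph < 5.6)

print(rolling_stop_permitted(True, True, 5.0))  # True
print(rolling_stop_permitted(True, True, 6.0))  # False
```

The question in the post above is whether a comparable set of published conditions exists for reversing.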
 


Interesting... it didn't reverse much (what distance would you say?). I'm just glad you didn't have anyone tailing behind you. (Did you check the rear view, and did the backup camera come on when it reversed?) I wonder what the car would have done with someone close behind. This raises more questions than it answers, if any.

Ski
 
Yes, it definitely does raise a lot of questions.

To your point, the only reason I let it slow and come to a complete stop is that there was nobody behind me, which you can plainly see in the rear camera video.

As for distance, as I mentioned, it is much easier to judge from the rear and repeater views. It was actually much farther than it appears in the front video. I'd say it was a good 5-6 feet.
 

Don't you think you should do something other than asking random people on the forum (or perhaps you are, IDK)? Your car came to a full stop from driving speed and reversed on a street, seemingly for no valid reason. Even if it thought the truck was going to enter your lane, is stopping and reversing a reasonable thing to do? Most of us would avoid the truck by, you know, avoiding it.

If this is new and curiously bad behavior, it should be reported and investigated: at least to Tesla service (like that will do any good, but at least you've reported it and will get an actual response), and also to the FSD Beta team and/or the NHTSA (you probably won't get a response from either, but at least you've reported it to them too).

Stopping and reversing without warning is very bad. Stopping and reversing with warning is still pretty bad. You're driving on a street, FFS.
 
Er, why report it to the NHTSA? What, specifically, do you think the car did that was dangerous or illegal? Or do you think anything the car does should be reported just because of reasons?
 
I am certainly not asking anyone for their opinion. I am documenting behavior that has been sporadically reported in the past. When did I ask anyone for assistance?

This also might be an example of FSD attempting (erroneously) to evade an accident. If FSD truly believed the truck was in my lane, and that the only way to avoid a collision was to back away from it, this is perhaps the first documented case. That's the only reason I speculate it stopped and backed up. And, as I said in my original post, there was no actual danger of collision; but if FSD believed there was (again, erroneously), this was an evasive maneuver.