Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
FSD Beta Videos (and questions for FSD Beta drivers)

#FSDBeta 8.2 - 2021.4.11.1 - Obstructed and Unprotected Left Turn Test

A nice simple test showing how FSD Beta handles obstructed views.

It appears to ignore the risk that an obstructed view could be concealing a hazard.

Only a collision or successful avoidance of a moving car will convince me either way. I have yet to see enough testing of this scenario with actual risk.

I suppose there could be "unmanageable situations" in a full release of Level 2 and we just have to live with it.
 


I'm not sure I would call this a simple test. How would a human handle it? I think it did a decent job, and none of us have any idea what it based its decision on. Obstructed views are dangerous, period.
 
By simple I meant that it's just a T-intersection, a bush, and two cars. It's certainly an important issue. If the obstructed car had been moving into a collision position at high speed it would have been a more useful test, but Chuck doesn't need to risk his own car just to show us.

Yes, without seeing the Tesla's decision process all we have to go on is that the obstructed car didn't show on the display until the Tesla had committed to the turn and was in the road. If the obstructed car was moving at high speed it is likely the Tesla would have been in front of it.

As to how would a human handle it? A human was handling it. Chuck was in control, though he did know the test parameters. If this was a true blind test (excuse the pun), then Chuck would not have known whether there was a car hiding there but we've seen him in these situations before. The Tesla has often advanced in front of an oncoming car and Chuck has to disengage and hit the brakes.

In essence Chuck is trying to prove a proven point. The Tesla has no camera to see around an obstruction in this scenario. However we know that already. What is not always clear is the result. Here the FSD process ignores the possibility of a risk and proceeds anyway. He's just trying to show us that FSD is NOT going to prevent an accident here (well, it will actually cause an accident). The proper human driver response is to take control as you would do yourself if you were actually driving. Creep until you can see. No safe driver would just drive out blindly into the road.

What would a better FSD response be? Well that's the question. Certainly it should not be putting the car in a collision position. However it appears that it does just that.
 
Not sure if anyone shared this.


More or less abysmal failure in city driving.

Getting pretty obvious Tesla is at least 5 years or more behind Waymo and others.
I know the official statement is no accidents on FSD Beta, but I really have to wonder if they are just not reporting them. There were several near collisions in this video; surely some of the 1,000-2,000 drivers are not quick enough to disengage. Perhaps the definition of "accident" needs to be clarified: are we talking an NTSB-investigation-level accident, or a collided-with-a-street-pylon-and-wrecked-the-bumper accident?

I would define an accident as: hit something, caused damage or injury to someone/something, or caused another to have an accident. Does hitting a curb and damaging the wheels count? Because we've seen that happen. I have the feeling that "accident" means something different to them.

Plus, how do we know they even report FSD Beta as being engaged when there is a crash? I bet 90% of cops would not ask the question, assuming they even knew FSD Beta was on the road. Do they ask every Nissan driver if ProPilot was engaged, or GM drivers if Super Cruise was engaged? Doubtful, and even if they did ask, the driver could just say no. They would probably only investigate and demand logs for a serious collision, or if they saw the driver with no hands on the wheel. Again, there is zero proof that there are no accidents. I'm sure Tesla has the data, though.

What would I like to see? Well I suppose an independent oversight system. Presumably if there are no regulations demanding it, that's not likely to happen. Most news stories are reporting verbatim everything they are told about the wonder that is FSD Beta, and that becomes the official version.
 
Ummmm, not even close.
Let's talk about the lidar crowd, the "these current sensors are not enough" crowd, the monitoring crowd, or the general "it must be done this way" gestapo crowd.

These forums are filled with examples.

But see, all those stem from the fact that it's sold with the intent of L5 self-driving. Elon has been pretty consistent about what it's intended as.
 

I think it simply hasn't been good enough to induce the kind of inattention that would result in an accident significant enough to set off an airbag. They're probably going off airbag activations to determine whether there has been an accident or not. Or perhaps an accelerometer reading that exceeds some preset threshold triggers a video upload to the mothership.
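For what it's worth, the kind of accelerometer trigger I'm imagining is trivial to sketch. This is pure speculation on my part: the threshold value, the sample format, and the function names are all made up and have nothing to do with Tesla's actual implementation.

```python
# Illustrative only: flag samples where acceleration magnitude
# exceeds a preset threshold (all values are hypothetical).
THRESHOLD_G = 4.0  # made-up trigger level, in g

def exceeds_threshold(ax, ay, az, threshold_g=THRESHOLD_G):
    """Return True if the acceleration magnitude exceeds the threshold."""
    magnitude = (ax**2 + ay**2 + az**2) ** 0.5
    return magnitude > threshold_g

def incidents(samples, threshold_g=THRESHOLD_G):
    """Return indices of samples that would trigger a video upload."""
    return [i for i, (ax, ay, az) in enumerate(samples)
            if exceeds_threshold(ax, ay, az, threshold_g)]

# Normal driving (~1 g vertical) vs. a hard impact spike
samples = [(0.1, 0.0, 1.0), (0.3, 0.2, 1.0), (6.5, 2.0, 1.0)]
print(incidents(samples))  # [2]
```

A real system would obviously filter sensor noise and debounce over a window of samples rather than react to a single reading, but the basic idea is that simple.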

It was also only recently increased to 2,000, and the newest 1,000 people probably haven't had much time with it.

Some of the recent threats to kick people out of the early access program for bad behavior are probably based on Tesla reviewing YouTube video, not video from the car itself. A way of preventing bad publicity. Like, if I were a YouTube FSD person I absolutely would not post footage of FSD hitting a curb. Especially not now that I know they're cracking down.

I'd be really surprised if Tesla hits 10,000 FSD Beta testers without footage of a fairly significant FSD crash getting out, like running a red light, for example.

All it takes is the kind of person who doesn't take personal responsibility for his/her failure, or someone who strongly believes it either induced them into inattention or happened too quickly for them to intervene.

The better FSD Beta is, the more likely this kind of incident will happen.
The more people who get it, the more likely this kind of incident will happen.

How the initial incidents happen will likely shape people's perception of how dangerous this L2 City Driving experiment is.

All this assumes FSD Beta is released widely without restrictions. I don't think that will actually happen, though. I'm eagerly awaiting 8.3/Friday to see if Tesla adds restrictions. I expect them to, because releasing it as-is is pretty risky.
 
After all this Sturm und Drang, I'm predicting that after the FSD Beta is widely released, there will be people here saying, "I stopped using it. It's too boring."

Is this true of any FSD feature?

Most of the features people don't widely use because they don't work at a satisfactory level:

It's too slow
There is too big of a delay between the moment a human driver would do something, and the car does it.
There are too many mistakes
It's too inconsistent

So my prediction is a certain percentage of people will stop using FSD because of some of the above.
Another percentage of people will stop using it simply because they feel uncomfortable with it, and with the oddity of taking full responsibility for something that does nearly 100% of the driving.

Some percentage of people will be hardcore FSD people driven by the idea that they're playing a role in improving something. This might turn out to be a lot of people due to the report button. If the report button survives, and if Tesla is proactive in fixing reported problems it could lead to a significant number of people using it quite often.

I'm not sure what my usage will be.

Interestingly enough, it comes down to whether it's boring.

My commute to work and back is purely city streets. If it does fairly well I could see using it all the time, because for something like a commute, boring works just fine. I don't need excitement on my commute back and forth to work.

Regardless of how good or bad it is, it won't replace the moments where driver aids are unwanted, like a curvy mountain road.

I like that a Tesla can be a contradiction.

A car geared to saving the environment, but you don't have to drive like you're saving the environment
A car that has FSD for the work week, and track mode for the weekend
 
I've read a lot of comments from those who feel you have to break some eggs to make an omelette, aka you have to take risks to make progress. Also from the people who state "in the long run FSD is gonna save lives." This strategy could run headlong into the general public and the MSM, who decidedly don't take this approach, especially when some poor unfortunate has been flattened. Risky times, but interesting ones.
 
Really useful drone perspective there. I actually don't think the car poked out much into the lane.
Not sure if there was some distortion from the perspective, or actual curves that would make these "parallel" line estimates completely wrong, but yeah, it seems like it was nearly a whole lane width out to the crosswalk.

lines.jpg
 
#FSDBeta 8.2 - 2021.4.11.1 - Unprotected left turns with Drone 3rd Person View in 4K. Check it out.

Great video. Thanks. This is one of the more difficult maneuvers a self-driving car needs to make. I've read so much about the 1.2 MP front cameras that I wonder if higher resolution would help it see the median better. It appears as if the car doesn't see the median.

On another note, if the front radar can help the car sense objects in heavy rain, fog, and snow, wouldn't a rear radar also help for the same reasons? I would guess changing lanes in those conditions also warrants a rear radar, but we don't have one.
 
Yeah agree, great video with the drone angle.

The unprotected left turns obviously still need a lot of work.

On a side note, I can see how the visualization on the touchscreen is more helpful than the IC visualization in the MS/MX for these situations. I wonder if Tesla considered adding it to the center display.

Also just watching that video was stressful...
 
Wow, three messages in a row bashing FSD. One wouldn't do.
If you look at the purpose for each message you'll realize none of them had the purpose of bashing FSD at all.

FSD is as much about a strategy toward autonomous driving as it is the implementation of technology. A strategy will have pros and cons. Tesla has a very clear strategy of leveraging their fleet of FSD vehicles to evolve an L2 driver-assist system into an autonomous system.
 
#FSDBeta 8.2 - 2021.4.11.1 - Unprotected left turns with Drone 3rd Person View in 4K. Check it out.

You are definitely putting it into some difficult situations, but they are real-life ones that Autopilot will need to master. The drone angle was great, and although you thought you were sticking out and cars were avoiding you, it didn't appear that noticeable from above. Nice job, keep them coming. At least at 50' it's not saying "retard, retard" 😁