FSD Beta Videos (and questions for FSD Beta drivers)

Yes, I'm a professional pilot, but trust me, there are a lot of private pilots with AP who receive zero training on the system.
As someone who designs APs for small aircraft for a living, I can tell you the FAA has never once allowed me to release one (or even an update) without a manual that describes how to use it and what the restrictions are. It's technically illegal to operate the plane without reading and understanding that manual. Meanwhile, Tesla doesn't even try to provide a manual.

Since we are on the subject of planes: the autopilot is a system that needs to be watched constantly, and you want to be ahead of it at all times.
So how often have you had an AP disconnect and try to crash the aircraft in a way that required you to react within half a second?
The FAA literally requires APs to be designed such that their failure does not require immediate attention from the pilot. You're saying that you never take your eyes off the gauges when in IMC, or off the view out the window when VFR, while the AP is on? It provides zero workload reduction? Then why do you use it? Why do airlines, and even regulations for shooting some approaches, often require its use if it's so unreliable?

We all seem to agree that FSD beta will try to kill you if you don't pay attention. Meanwhile, aviation has proven autopilots to be a great safety tool. I hate people calling them the same thing (at this point).
 
As someone who designs APs for small aircraft for a living, I can tell you the FAA has never once allowed me to release one (or even an update) without a manual that describes how to use it and what the restrictions are. It's technically illegal to operate the plane without reading and understanding that manual. Meanwhile, Tesla doesn't even try to provide a manual.


So how often have you had an AP disconnect and try to crash the aircraft in a way that required you to react within half a second?
The FAA literally requires APs to be designed such that their failure does not require immediate attention from the pilot. You're saying that you never take your eyes off the gauges when in IMC, or off the view out the window when VFR, while the AP is on? It provides zero workload reduction? Then why do you use it? Why do airlines, and even regulations for shooting some approaches, often require its use if it's so unreliable?
An autopilot will only do what it's told to do, and often you have to intervene or help the plane make crossing and/or speed restrictions. You are also leaving out the fact that this is a beta. Where did I ever say APs in aircraft are so unreliable? You tend to make a lot of stuff up to make a point. I'll answer the question I asked you: my take is you don't have FSD and like talking as much crap about it as you can. If you feel the need to argue for argument's sake, I'm out... enjoy the day.
 
I don't see anything wrong with the way Rob tests his car. He's disengaging way too late for some people's tastes, but he is still alert and responding in time to prevent collisions. Sometimes you just want to see if the car will respond, even if late.
I understand what you are getting at, but doing it that way serves no one; it just provides titillating visual feedback for viewers of the video. Tesla does not need testers to let the car almost hit a barrier before disengaging. If you think the car is about to make a bad maneuver, disengage. From that disengagement data, they can recreate the situation, play it forward in time, and see if the car would have hit the barrier.
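
To illustrate what that kind of forward replay could look like, here is a purely hypothetical sketch. DisengagementSnapshot, would_have_collided, and the data layout are all made-up names for illustration; none of this reflects Tesla's actual internal tooling.

```python
# Hypothetical sketch: replay a disengagement snapshot forward in time and
# check whether the planner's intended path would have hit an obstacle.
from dataclasses import dataclass

@dataclass
class DisengagementSnapshot:
    planned_trajectory: list  # (x, y) waypoints the planner intended to follow
    obstacles: list           # (x, y, radius) static obstacles from perception

def would_have_collided(snap: DisengagementSnapshot, clearance: float = 0.5) -> bool:
    """Step through the planned trajectory waypoint by waypoint and test each
    one for overlap with a perceived obstacle, padded by a safety clearance
    in meters."""
    for x, y in snap.planned_trajectory:
        for ox, oy, r in snap.obstacles:
            if (x - ox) ** 2 + (y - oy) ** 2 < (r + clearance) ** 2:
                return True  # the car would have struck this obstacle
    return False

# Example: a barrier half a meter off the planned path registers as a
# would-be collision, even though the tester disengaged before impact.
snap = DisengagementSnapshot(
    planned_trajectory=[(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)],
    obstacles=[(10.0, 0.5, 0.3)],
)
assert would_have_collided(snap)
```

The point of the sketch is just that an early disengagement plus the planner's intended path is enough to answer "would it have hit the barrier?" offline, without the tester riding it out in the real world.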
 
Tesla Daily / Rob does seem to drive a bit dangerously (i.e., he intervenes rather late).


As they say, if you need a manual - you can't be a tester ;)

The correct way to handle this would be to figure out which use cases are clearly not yet designed to be handled (apparently things like parking lots and closed roads) and not try them. The edge cases they are looking for come from the kinds of normal driving the CNN and the planner are currently designed for.

Of course, since they don't actually list the cases not yet designed to be handled, we just have to figure these out ourselves.

But you made fun of GM for being transparent about handling 95% of driving scenarios. Tesla, on the other hand, will tell you it's L5 when it can't even handle 25% of driving scenarios well.
 
I understand what you are getting at, but doing it that way serves no one; it just provides titillating visual feedback for viewers of the video. Tesla does not need testers to let the car almost hit a barrier before disengaging. If you think the car is about to make a bad maneuver, disengage. From that disengagement data, they can recreate the situation, play it forward in time, and see if the car would have hit the barrier.

It also gives the tester a behavioral baseline against which to see whether subsequent iterations perform better. I don't think he disengages late to make his videos more engaging. But agreed that it's not necessary to disengage late to help Tesla improve.
 
I understand what you are getting at, but doing it that way serves no one; it just provides titillating visual feedback for viewers of the video. Tesla does not need testers to let the car almost hit a barrier before disengaging. If you think the car is about to make a bad maneuver, disengage. From that disengagement data, they can recreate the situation, play it forward in time, and see if the car would have hit the barrier.
Elon has literally said the car will avoid obstacles that it can't recognize; he has repeatedly talked about the car being able to stop even if a UFO crash-lands on the highway.

We also know that the car has such a capability, given the voxel outputs that have been demoed before and the monorail test that it recently (finally, and barely) passed.

The expectation of the car not driving into a "road closed" sign has been set by Elon and demonstrated to be possible (albeit poorly) by the underlying technology. So it's not like he put the car into an unreasonable situation.
 
Since we are on the subject of planes: the autopilot is a system that needs to be watched constantly, and you want to be ahead of it at all times. Kinda like this Beta.

Where did I ever say APs in aircraft are so unreliable?
Right above. You said you need to keep your eyes on an aircraft AP just as much as you do this beta. Why would you need to do that if an aircraft AP was highly reliable?

You are also leaving out the fact that this is a beta.
Something the FAA would never allow to be released to a bunch of aircraft flown by the public without a well-defined test plan that didn't put the public at risk.
 
Elon has literally said the car will avoid obstacles that it can't recognize; he has repeatedly talked about the car being able to stop even if a UFO crash-lands on the highway.

We also know that the car has such a capability, given the voxel outputs that have been demoed before and the monorail test that it recently (finally, and barely) passed.

The expectation of the car not driving into a "road closed" sign has been set by Elon and demonstrated to be possible (albeit poorly) by the underlying technology. So it's not like he put the car into an unreasonable situation.
I'm not saying he put the car in an unreasonable situation. Using the car in a construction zone is OK, and using the car in unusual scenarios is OK as well; that is how they are able to source varied data to use for training. The overarching point is that Tesla just needs you, as a tester, to use the car as you normally would. Letting the car complete or partially complete a maneuver that you can see is dangerous before intervening does not give them any more useful data than disengaging early. They are capable of simulating forward in time to see what the car would have done. That is literally my point: they have the tools and technology at their disposal to play the event through in a virtual environment.

In my opinion, the video shows that he was alert and ready to take over at any point during the entirety of the event; however, the fact that he had to back up in order to proceed to the correct lane shows that he intervened too late. The car made no attempt to slow down when approaching the barricade.

It also gives the tester a behavioral baseline against which to see whether subsequent iterations perform better. I don't think he disengages late to make his videos more engaging. But agreed that it's not necessary to disengage late to help Tesla improve.
He does. I have watched his other videos wherein he lets the car make a dangerous maneuver before intervening, and he just laughs and says, "so we let it go, people can't fault us for that". Some viewers like to complain about testers disengaging, purely because they are interested in seeing what the car will do without the disengagement. Kim was attacked by fanatical viewers for the same thing. Chuck has done it before and also addressed it in his intro-to-FSD-Beta videos for new testers. They don't learn anything from letting the car complete or struggle to complete a maneuver; disengaging and intervening safely is what gets them the data they need to improve the system.
 
They don't learn anything from letting the car complete or struggle to complete a maneuver; disengaging and intervening safely is what gets them the data they need to improve the system.
Hmm, maybe Tesla should put this in a manual or an email to the testers, instead of just leaving us all in the dark to guess?
Why should it be on a completely uninformed tester to know when to disengage, and what helps Tesla learn and what doesn't?
I mean, all Tesla tells you to do is click a button and send them an email when you see something you don't like. They don't even make it clear that normal disengagements help them at all.
 
... using the car in a construction zone

...They are capable of simulating forward in time to see what the car would have done. That is literally my point: they have the tools and technology at their disposal to play the event through in a virtual environment.

... The car made no attempt to slow down when approaching the barricade.
This happens time and time again. If they can simulate this situation, it would be nice to see them address and solve it. Driving at full speed toward objects is not an edge case.
 
This new beta tester, NukemNak, also has a problem with a roadblock at the end of the video. What is wrong with the 10.2 NNs that they don't recognize those?

Training a NN to recognize and react to occlusions would be great. I would be happy if it would at least stop for things that are in front of it. It doesn't have to recognize them and route around them - although that would be nice - it could, first and foremost, simply not try to hit them.
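
As a toy illustration of that "stop for anything occupying the space ahead" idea, here is a made-up sketch on top of a generic bird's-eye occupancy grid, in the spirit of the voxel outputs that have been demoed. should_brake, the grid layout, and the thresholds are all my assumptions, not Tesla's implementation.

```python
import numpy as np

def should_brake(occupancy: np.ndarray, ego_col: int, lookahead_cells: int,
                 lane_half_width: int, threshold: float = 0.7) -> bool:
    """occupancy is a 2D bird's-eye grid of occupancy probabilities:
    rows = distance ahead of the car, columns = lateral position.
    Brake if any cell in the corridor the car is about to drive through
    is likely occupied -- no object classification required."""
    left = max(ego_col - lane_half_width, 0)
    corridor = occupancy[:lookahead_cells, left:ego_col + lane_half_width + 1]
    return bool((corridor > threshold).any())

# Example: a barricade 10 cells ahead in the ego lane should trigger braking,
# even though the system has no idea what kind of object it is.
grid = np.zeros((50, 21))
grid[10, 10] = 0.95  # something solid, unclassified, dead ahead
assert should_brake(grid, ego_col=10, lookahead_cells=20, lane_half_width=2)
```

The appeal of something like this is exactly what the post above asks for: "don't hit it" requires only knowing that space is occupied, which is a much weaker requirement than recognizing a roadblock sign and routing around it.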
 
Some notes from Dirty Tesla on 10.2:

  • FSD will now use the washer fluid as well as the wipers if it thinks the windshield is dirty.
  • You can no longer block the cabin/selfie cam. (It won't let you use FSD with it covered.)
  • If you are using your phone at a red light, it will beep at you and tell you to pay attention when the light turns green.
 