
FSD Beta Videos (and questions for FSD Beta drivers)

But you'd never use the current AP software to make unprotected left turns, so that means the current AP software is likely safer when talking about that particular use scenario.

They're not going to release this to the wild any time soon. 3 months definitely (not), 6 months maybe (probably not).
Right. But I do use it for normal lane keeping and it struggles mightily every single day. FSD Beta improves upon that.

Let's not pretend that I'm advocating putting on a blindfold while FSD does its magic. You can't do that with AP and you can't do that with FSD so what's the ultimate difference?
 
When Musk recently talked about releasing FSD before the end of 2021, I think he was talking about FSD Beta going GA - not robotaxi (which he always says is a year away).

I listened to this interview just to listen to this part carefully. It's always hard to know what the heck Musk is referring to, but in this case I definitely do not think Musk was referring to the FSD beta.

I don't think we need to reset expectations to the end of 2021. I don't think it's likely to come by the end of the year, though (you never know). But I think @turnem's timeline seems reasonable, or maybe a little longer. Sometime in Q1, maybe pushed into Q2...

If the Tesla FSD team thought the FSD beta release might not come until the end of 2021... in that case I think they would definitely prepare an intermediate release for the end of this year / early next year. This intermediate release might be the rewrite that enhances perception, tracking, the neural net architecture, etc., but without the FSD features. Or it could be a very gimped/nerfed FSD beta release, i.e. extremely frequent nags (if you enable FSD beta) and confirmation requirements for many tasks.

Another thing I'd like everyone's opinion on... is it out of the question for Tesla to require everyone who bought FSD and wants to use the FSD beta to sign some additional waiver?

Also, what is the current fleet size of Tesla owners with FSD and HW3? Is it up to 1 million?
 
The number of outside testers is directly proportional to the maturity of the software and the confidence of the engineering team.

Very true. But typically Tesla also releases features to various early access circles under NDA, and those testers cannot post content. For this FSD beta they are explicitly allowing them to talk about it and share content, and I think that suggests it has been tested internally or under employee NDA for a long time, and is a sign of maturity in the software and confidence in the engineering team.
 
I just feel the stakes are going to be so much higher, especially if you are not ready to take over in a second. That is a difference from today's Autopilot.

how/why is that different from today's autopilot? Are you suggesting that users using FSD beta are going to be less likely to be ready to take over than people using today's autopilot? Why? I'd argue the opposite.
 
how/why is that different from today's autopilot? Are you suggesting that users using FSD beta are going to be less likely to be ready to take over than people using today's autopilot? Why? I'd argue the opposite.

I'm saying that if the FSD beta were rolled out to a lot of people in its current form, there would be a big difference from today's Autopilot, judging from what I have seen in the videos.

I drive on AP all the time, on both freeway and city/rural streets. The level of attention needed is lower since there are no turns, and there are limitations when approaching stop lights without a lead car to follow. When a potential issue happens, there is more time to react.

In the videos I have seen, the beta requires more attention during the newer maneuvers, which could quickly turn into an accident, something that Tesla will need to prevent at all costs.

The other option is that the software progresses to a point where it is bulletproof, but I think we would be deluding ourselves. I'm just saying that there needs to be a way to force greater driver attention.
 
how/why is that different from today's autopilot? Are you suggesting that users using FSD beta are going to be less likely to be ready to take over than people using today's autopilot? Why? I'd argue the opposite.
Exactly!!

For some reason I feel like folks are setting the bar higher for FSD (in terms of human engagement) than for existing AP. I can't think of a single good reason to substantiate this line of thinking.
 
how/why is that different from today's autopilot? Are you suggesting that users using FSD beta are going to be less likely to be ready to take over than people using today's autopilot? Why? I'd argue the opposite.
It's just inherently more difficult to monitor. Take a right turn on red as an example. You look left to make sure no cars are coming and then look right as you go. Beta FSD might go while you're looking left to see it's clear and you could hit something or someone on the right.
The margin for error is also smaller. The car has to inch out to get a view in order to make an unprotected left. How quickly can you hit the brake if it goes when it's not safe? Many people I'm sure can do it but I suspect that a large enough fraction of customers cannot. You have to be able to monitor and react to unpredictable behavior from the car and also monitor your surroundings at the same time.
It's a whole other can of worms when it actually gets deceptively good at driving and people are tempted to relax their vigilance.

It does seem like they should release the Autosteer and TACC improvements though (i.e. "autosteer on city streets" that they currently promise to release this year).
 
It's just inherently more difficult to monitor. Take a right turn on red as an example. You look left to make sure no cars are coming and then look right as you go. Beta FSD might go while you're looking left to see it's clear and you could hit something or someone on the right.
The margin for error is also smaller. The car has to inch out to get a view in order to make an unprotected left. How quickly can you hit the brake if it goes when it's not safe? Many people I'm sure can do it but I suspect that a large enough fraction of customers cannot. You have to be able to monitor and react to unpredictable behavior from the car and also monitor your surroundings at the same time.
It's a whole other can of worms when it actually gets deceptively good at driving and people are tempted to relax their vigilance.

It does seem like they should release the Autosteer and TACC improvements though (i.e. "autosteer on city streets" that they currently promise to release this year).
The scenarios that you describe can happen to people in full control of their cars. I would even argue that if the car is doing the work AND the human is overseeing it, you have an even BETTER chance of preventing an accident.
 
The scenarios that you describe can happen to people in full control of their cars. I would even argue that if the car is doing the work AND the human is overseeing it, you have an even BETTER chance of preventing an accident.
Unintended acceleration is very rare and if caused by the car is a safety defect. It's almost always caused by pedal misapplication which beta FSD does not prevent and might even make more likely. If you look at the unprotected left turn example you have to have your foot hovering ready to press either the accelerator or brake (accelerator if it goes too slowly to avoid oncoming traffic, brake if it goes when it's unsafe).
 
I drive on AP all the time, on both freeway and city/rural streets. The level of attention needed is lower since there are no turns, and there are limitations when approaching stop lights without a lead car to follow. When a potential issue happens, there is more time to react.

In the videos I have seen, the beta requires more attention during the newer maneuvers, which could quickly turn into an accident, something that Tesla will need to prevent at all costs.

Yeah, I am not sure I can agree. You are saying that with normal Autopilot there are points where less attention is needed, and I think I disagree there, because full attention is always needed and anyone using it should always give it full attention. With normal Autopilot driving there is GREATER potential to become overconfident and stop paying attention, because the cases where it suddenly does something wrong are very rare... whereas when, say, going through an intersection, I think the potential for misuse is far smaller.

It's just inherently more difficult to monitor. Take a right turn on red as an example. You look left to make sure no cars are coming and then look right as you go. Beta FSD might go while you're looking left to see it's clear and you could hit something or someone on the right.
The margin for error is also smaller. The car has to inch out to get a view in order to make an unprotected left. How quickly can you hit the brake if it goes when it's not safe? Many people I'm sure can do it but I suspect that a large enough fraction of customers cannot. You have to be able to monitor and react to unpredictable behavior from the car and also monitor your surroundings at the same time.
It's a whole other can of worms when it actually gets deceptively good at driving and people are tempted to relax their vigilance.

It does require more focus to monitor, yes, and it is far less relaxing... but this also means that people who cannot or will not focus as much as they need to will simply use the feature less in these situations, just like Autopilot today: most users probably don't use Autopilot in challenging situations and just go back to manual. But enthusiasts put it to the test and use lots of caution and focus.

I do have a lot of experience safety-driving fully autonomous cars on public roads... it does take a lot of focus, and the system needs to be well understood. I also have experience hiring and training people for the job of safety-driving fully autonomous systems on public roads. I think Tesla FSD enthusiasts are more than capable of the job and will use it safely.

It does seem like they should release the Autosteer and TACC improvements though (i.e. "autosteer on city streets" that they currently promise to release this year).

It's really hard to say, but someone could argue that the Autosteer improvements are the MORE potentially dangerous part of this update, and the turns at intersections the less dangerous part.

I do have one main point of concern, however, which makes me less confident that a wide FSD release is coming soon: the difference in media/public perception when there is an accident or death. I feel most people today understand when there is an Autopilot accident: oh, it was a driver assistance system and the driver messed up (however, that wasn't always the case). But now that the product name is "FSD beta", I think the media and public will have a much harder time understanding that it is just a driver assistance system, which is different from self-driving cars... and it will create a very bad public image for the real self-driving industry.
 
Unintended acceleration is very rare and if caused by the car is a safety defect. It's almost always caused by pedal misapplication which beta FSD does not prevent and might even make more likely. If you look at the unprotected left turn example you have to have your foot hovering ready to press either the accelerator or brake (accelerator if it goes too slowly to avoid oncoming traffic, brake if it goes when it's unsafe).
What? Who's talking about unintended acceleration? But if you want to go there then what about phantom braking? In either scenario the driver has to have their feet ready to intervene. Both are equally dangerous.
 
What? Who's talking about unintended acceleration? But if you want to go there then what about phantom braking? In either scenario the driver has to have their feet ready to intervene. Both are equally dangerous.
The scenarios that I listed were unintended acceleration (i.e. car accelerating when it shouldn't). I think phantom braking is way less dangerous than many people seem to believe. Phantom braking can't cause you to be t-boned or run into a pedestrian.
 
The scenarios that I listed were unintended acceleration (i.e. car accelerating when it shouldn't). I think phantom braking is way less dangerous than many people seem to believe. Phantom braking can't cause you to be t-boned or run into a pedestrian.
Unintended acceleration and unintended braking BOTH require the driver to be ready to apply their foot to the appropriate pedal. BOTH could result in a collision.

Determining the severity of the outcomes of either scenario is a moot point.
 
It basically failed every scenario, which is understandable given the difficulty. The problem is how it's failing: it shouldn't be stopping in the highway. Its failure mode should be to just sit at the stop sign and wait for a big enough gap both ways, or ask for user intervention, instead of creeping forward into the highway slowly.

Tesla might be better off just trying to avoid unprotected left turns like this for now. Adjust their FSD nav to just turn right and make a U-turn (once they enable U-turns), or just reroute around it. Or, if that's too difficult, throw up warnings and ask for user intervention.

Edit - 4:36 - nice improvement at the end, where it went into the correct lane; in a previous video I saw it start to turn into the opposing lane instead before Mr Cook took over.

Edit 2 - I would be curious to know what mspisars disagreed with.

I don't understand why you would want Tesla to avoid difficult situations. A small beta group is exactly where you want Tesla to make changes to address the edge cases. Now, if the same problems are still occurring after several more updates, perhaps your suggestion makes sense, but certainly not now. Do you actually believe Elon would tell his team now that it's too difficult and let's punt for now? Not a chance.