OFFICIAL BUTTON WATCH

It's the permanently clueless who are best ignored. But no, I'm not saying anything about known bugs or how best to accelerate progress. I'm just saying that safety doesn't necessarily mean safety today. And I agree that unless Tesla stays within the cultural limits of acceptable carnage (which are pretty extreme in the US), it may cause problems for their engineering process.
Obey today for tomorrow’s safety! Seems like a nice authoritarian slogan...
 
Hopefully nobody is under the impression that hand-gesture technology will be used for anything productive any time soon. There are far more structured and safety-intensive applications where it could be used but isn't, and I'm sure for very good reasons.

Crane operation in construction, for example, depends on sets of gestures used to communicate between signalers on the ground and operators of the equipment. People are specifically trained in which hand gestures to use to communicate clearly, the cranes are worth many millions of dollars, and the immediate consequences of improper communication can be huge risks to human life or massive property damage. That's the first place the technology would be used if it were feasible and added much value.
 
FSD actively moves the controls of the car and can suddenly change path, requiring immediate, decisive control. Look at the videos where it drives the car right into a post.

No properly done Functional Hazard Analysis would rate an FSD error on the same level as changing the radio controls.

Controls moving in an unexpected way is basically what happened with the 737 MAX, and pilots did not react the way the designers intended. Are you saying that was a failure of personal responsibility, and no bigger a deal than changing the frequency on the radio? The FAA generally disallows any failure that can cause harm if it isn't dealt with within 3 seconds, because human-factors studies show that expecting such a response is unrealistic.
If the pilots were informed, "please keep your hands on the controls and pay attention constantly, as the autopilot may abruptly do the wrong thing at the worst time" - then yes, I would say my analogy fits.
 
If the pilots were informed, "please keep your hands on the controls and pay attention constantly, as the autopilot may abruptly do the wrong thing at the worst time" - then yes, I would say my analogy fits.
We don't give someone a driver's manual and turn them loose on the roads; there is a test. How do we know those simple instructions are enough training? I would argue that using beta FSD isn't really analogous to anything except autonomous vehicle testing (and all the companies that do that now have safety-driver monitoring, many have two safety drivers in the vehicle, and of course all of them have training).
 
We don't give someone a driver's manual and turn them loose on the roads; there is a test. How do we know those simple instructions are enough training? I would argue that using beta FSD isn't really analogous to anything except autonomous vehicle testing (and all the companies that do that now have safety-driver monitoring, many have two safety drivers in the vehicle, and of course all of them have training).
Yeah, but EVERY TIME you engage any assistive/automation/autonomous feature, it says you're responsible and tells you exactly that: keep your hands on the wheel and be ready to take over at all times. So can we agree that, at least for this matter, this is enough information? Otherwise we might as well put up signs on cliffs saying "don't jump".
 
Yeah, but EVERY TIME you engage any assistive/automation/autonomous feature, it says you're responsible and tells you exactly that: keep your hands on the wheel and be ready to take over at all times. So can we agree that, at least for this matter, this is enough information? Otherwise we might as well put up signs on cliffs saying "don't jump".
No, I don't think it's enough information.
It's more analogous to having a sign at the cliff saying "be alert, dangerous cliff", standing there, getting pushed from behind, and getting blamed for not being fully aware of your surroundings.
 
No, I don't think it's enough information.
It's more analogous to having a sign at the cliff saying "be alert, dangerous cliff", standing there, getting pushed from behind, and getting blamed for not being fully aware of your surroundings.
Well, I guess we then have to agree to disagree. I think it's enough information, almost to the point of being annoying. I'd rather acknowledge it once and never see it again.
 
Well, I guess we then have to agree to disagree. I think it's enough information, almost to the point of being annoying. I'd rather acknowledge it once and never see it again.
It's basically zero information! haha.
So, how does one make a right turn on red while using beta FSD? I learned how to do that in driving school, but how do I do it when I'm not controlling the timing of the accelerator?
Now I could be wrong, and beta FSD could turn out to be safe enough with zero training when used by the Tesla-owning population. I just don't think it's safe to assume that.
 
Yeah, but EVERY TIME you engage any assistive/automation/autonomous feature, it says you're responsible and tells you exactly that.
It was just argued that adjusting the radio is just as dangerous as using AP and could kill you.
Why would I take my eyes off the road when engaging AP?
Temporary disclaimers shown while the car is in motion are an awful way to enforce behavior.
Why not a disclaimer every time you get in the car that must be acknowledged to use AP on that drive?
 
It's basically zero information! haha.
So, how does one make a right turn on red while using beta FSD? I learned how to do that in driving school, but how do I do it when I'm not controlling the timing of the accelerator?
Now I could be wrong, and beta FSD could turn out to be safe enough with zero training when used by the Tesla-owning population. I just don't think it's safe to assume that.
I think right turns on red are part of the FSD beta, but I fail to see how it's different from a right turn at a yield. Either way, we can ask someone with FSD 9 to test.
 
It was just argued that adjusting the radio is just as dangerous as using AP and could kill you.
Why would I take my eyes off the road when engaging AP?
Temporary disclaimers shown while the car is in motion are an awful way to enforce behavior.
Why not a disclaimer every time you get in the car that must be acknowledged to use AP on that drive?
Autopilot / FSD are not enabled by default on a new driver profile. You have to acknowledge that first, so I'd say it's done. The activation message is then a reminder of your acknowledgement. I don't see a problem here. I see a problem in that people don't own up to their actions and always try to find someone else to blame.

Why is everyone complaining about Tesla but nobody complains to the DMV? Other manufacturers have similar functions. It's probably time for the DMVs to acknowledge that and update their exams.
 
It's basically zero information! haha.
So, how does one make a right turn on red while using beta FSD? I learned how to do that in driving school, but how do I do it when I'm not controlling the timing of the accelerator?
Now I could be wrong, and beta FSD could turn out to be safe enough with zero training when used by the Tesla-owning population. I just don't think it's safe to assume that.
I successfully taught my kids how to drive and was a vigilant safety instructor from the front passenger seat. We've all experienced bad moves by various AP/cruise controls, including Tesla's, and have safely taken over to correct the errors. These takeovers have been much easier and safer from the driver's seat than most of my interventions and all of my disengagements while training my kids to drive. Given your fear, you should probably consider opting out by not opting into "FSD beta" - and as a 'Driver's Ed Instructor,' too.
 
I successfully taught my kids how to drive and was a vigilant safety instructor from the front passenger seat. We've all experienced bad moves by various AP/cruise controls, including Tesla's, and have safely taken over to correct the errors. These takeovers have been much easier and safer from the driver's seat than most of my interventions and all of my disengagements while training my kids to drive. Given your fear, you should probably consider opting out by not opting into "FSD beta" - and as a 'Driver's Ed Instructor,' too.
This is exactly my point. I don't think anyone would claim that sitting in the passenger seat and supervising your kids is even remotely as safe as driving yourself. It's a necessary risk we take because we want people to learn to drive. We mitigate the risk somewhat by requiring that the first 6 hours of training be done by a professional (at least in California).
I think right turns on red are part of the FSD beta, but I fail to see how it's different from a right turn at a yield. Either way, we can ask someone with FSD 9 to test.
It is part of the FSD beta, but how do you monitor it?
If I were teaching someone to drive, I would tell them to look left to make sure no one is coming, then look right to make sure the path is clear, and then go. I suppose with beta FSD you should have your foot over the brake and swivel your head repeatedly left to right? Are there situations where you should disable the system because it can't be safely monitored? Are there things near the crosswalk that the system can't see? It seems like training might be needed because it's not the same thing as driving, and it might not be intuitive to everyone what those differences are.
 
It is part of the FSD beta, but how do you monitor it?
If I were teaching someone to drive, I would tell them to look left to make sure no one is coming, then look right to make sure the path is clear, and then go. I suppose with beta FSD you should have your foot over the brake and swivel your head repeatedly left to right? Are there situations where you should disable the system because it can't be safely monitored? Are there things near the crosswalk that the system can't see? It seems like training might be needed because it's not the same thing as driving, and it might not be intuitive to everyone what those differences are.
I think you are taking a very specific approach to a general concept. Everything in FSD or any driver ASSIST should be monitored/overseen. So yes, look around. Yes, take over. Yes, disable it if you don't trust it. It will get better over time, so eventually, as the driver, you should reevaluate your limits.
 
I think you are taking a very specific approach to a general concept. Everything in FSD or any driver ASSIST should be monitored/overseen. So yes, look around. Yes, take over. Yes, disable it if you don't trust it. It will get better over time, so eventually, as the driver, you should reevaluate your limits.
Maybe you'll be right and people using beta FSD will be just as safe as (or safer than) people not using it, but you sure haven't made much of an argument.
And of course trusting the system is the biggest mistake you can make.
 
Which appears to be a reason you don't have FSD in your Tesla?
I have Enhanced Autopilot, but I don't think the currently available FSD feature (stoplight response) is worth the money. But yeah, I don't think I would pay for what I see in the beta videos. I'll have to see if people I know like it and how much they use it.

The man himself says trusting the system is the biggest cause of collisions while using Autopilot.
"One of the common misimpressions is that when there is, say, a serious accident on Autopilot, people – or some of the articles – for some reason think that it’s because the driver thought the car was fully autonomous and it wasn’t, and we somehow misled them into thinking it was fully autonomous. It is the opposite.

When there is a serious accident, it’s in fact almost always, maybe always, the case that it is an experienced user and the issue is more one of complacency. They get too used to it. That tends to be more of an issue. It is not a lack of understanding of what Autopilot can do. It’s actually thinking they know more about Autopilot than they do, like quite a significant understanding of it."

Elon Musk
 
Maybe you'll be right and people using beta FSD will be just as safe as (or safer than) people not using it, but you sure haven't made much of an argument.
And of course trusting the system is the biggest mistake you can make.
We can argue over words, but I'd rather have a discussion where we don't pick on each other for not using 100% technically correct terms.
 