Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Wiki MASTER THREAD: Actual FSD Beta downloads and experiences

Personally, as someone who doesn't use Twitter, I recall no such thing, but it certainly didn't appear to happen in this video.

Should we be concerned that the driver didn't read the release notes, or concerned that the driver didn't understand them?

"a wide angle lens"? A lens judging distance based on motion probably wouldn't be good enough at all. AFAIK, there are multiple lenses and stereoscopic vision involved with the field of view that this pedestrian would have occupied. Keep in mind that FSD is, in fact, NOT human, and should, accordingly, be able to accurately and consistently account for any distortion that MIGHT exist in the stereoscopic view using math, which computers happen to be pretty good at.

"Buzzed the turn?" In my experience, you'd be hard-pressed to get FSD to do anything that might even have a remote possibility of triggering a skid outside of driving on ice, it's not even remotely aggressive when it comes to cornering, and using the accelerator in an attempt to override that will lead to a wider turn, even if it crosses double-yellows or goes off he shoulder.

Who is complaining? Unless it's people on Twitter, you are being overly dramatic. Prior to this post to which I'm replying, all discussion here since you posted the link to the video has involved why this happened or distinguished between fault and feature. I certainly haven't seen any complaint about the video post.

Frankly, your post with the tweet was reasonable, but in my opinion, your arguments about why other people should be as concerned as you claim to be are a bit out there.
It's pointless to argue here. FSD Beta cannot please everyone, and someone will always find fault. The car made the right turn, and some people are upset because they think it was too close to the pedestrians and is a safety hazard. If the same video was released showing the Tesla braking and waiting for the pedestrians to finish their move across the crosswalk, there would be people complaining that the car had plenty of room and braking was just pissing people off behind the Tesla and potentially causing a rear-end collision. Just like the debate on rolling stops at stop signs and red lights. The car just can't win no matter what behavior it displays.
 
Goes to show how we humans have so many different ways of driving, and of course everyone thinks "the way I drive is the BEST way" ;)
 
The O'Dowd crowd sent Elon a message with their comical music video rendition of AC/DC's Highway to Hell.


You're easily fooled.

That would drive for all of a few minutes. Tesla implemented wheel-weight detection that watches for a lack of variance of pressure on the wheel. That, and the cabin-camera would pretty quickly detect the lack of a driver as well.
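The "lack of variance" idea can be sketched as a rolling statistical check on steering-torque samples: a hand on the wheel produces small random fluctuations, while a hanging weight produces a near-constant reading. This is a hypothetical illustration of the concept only, not Tesla's implementation; the threshold and the sample values are invented.

```python
import statistics

# Hypothetical sketch of defeat-device detection via torque variance.
# A fixed wheel weight applies steady torque; a human hand jitters.
# Threshold and sample values are invented for illustration.

def looks_like_defeat_device(torque_samples: list[float],
                             min_stdev: float = 0.01) -> bool:
    """Flag a steering signal whose variation is implausibly low."""
    if len(torque_samples) < 2:
        return False
    return statistics.stdev(torque_samples) < min_stdev

human = [0.10, 0.14, 0.07, 0.12, 0.05, 0.16]   # jittery grip
weight = [0.120, 0.121, 0.120, 0.119, 0.120]   # near-constant weight

print(looks_like_defeat_device(human))   # False
print(looks_like_defeat_device(weight))  # True
```

A real system would presumably fuse this with the cabin camera rather than rely on torque alone.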
 

Right! You'd have a point if Tesla's software was effective at detecting steering weights or humans in the seat versus helium balloons, stuffed animals, etc. Heck even a human slumped over in a sleeping position behind the wheel wasn't detected as a concern. All their recent test drives used steering weights and light weights in the seat. If only the software worked.
 

If you really believe it, try it yourself and see. Within a minute or two, the system will detect the defeats and disengage. Just because O'Dowd shot an ad showing it driving for a few seconds doesn't mean it's not working as intended.
 
So my 2017 X100D was recently upgraded from FSD Beta 11.4.9 to FSD "Supervised" 12.3.4. Interesting that they dropped the word "beta," but I'm guessing it's still a beta, considering the wipers are presumably still tagged as such (based on how poorly they work). I missed a lot of 11.x and 12.x versions between these upgrades, but the good news is that in a lot of ways it's behaving as well as or better than the version I liked most, some 18 months ago. Unfortunately, it also still has some bad behaviors that were added after that in the long downhill regression-and-crapification fest that was v11.

The new auto speed setting is interesting, but the note about it basically says "you are responsible for keeping it from going too fast [and by the way, we removed all ways for you to do that other than shutting it off]." If I could use my stalk with single/double mode the way I always could before, I could just switch to TACC in those odd cases where it's going faster than I want. Unfortunately, it seems the "I shouldn't have to understand how my car works" whiners got what they wanted, because not having TACC fallback probably reduces Tesla's liability, even if having to switch between FSD and TACC (or manual and auto speed) on the MCU actually increases real-world accident risk by being a distraction. There may be a setting that changes how fast it considers "natural" on given roads, but adjusting that setting would likely lead to other undesirable behaviors. Maybe Tesla is all-in on the "you shouldn't have to touch it" bit and thinks capturing cancels it can assume are due to speed is a sane way to train further, even though everybody has different preferences?
 
I’ve noticed the FSD trial likes to speed in school zones (28mph vs. 20mph limit) so I disengage for those!
That is the correct procedure. FYI, there has not been any indication in any release notes for any recent versions that FSD now handles school zones. Keep an eye out in the future for that line item, otherwise always disengage when in school zones while school is in session.
 
I love all these discussions, and similar ones elsewhere. Everyone from the NHTSA to Consumer Reports is complaining that the nags/detects are not good enough. OK, consider this scenario: a driver becomes incapacitated while FSD is driving. Should (a) FSD continue driving safely, or (b) disengage and let the car crash?

Sure, the car should check for driver alertness... but disengaging as a response seems to me to be idiotic.
 
Or (c) put the hazards on and slow the car to a stop in the lane. :)
 
I would have loved to hear the discussions when they changed the wheel nag conditions (v11, I believe) to include a nag at every traffic light. "Let's nag and distract the driver at the time we most want them to pay attention to the road." What could go wrong? ;)

As far as not responding to nags goes, it seems the car could learn to pull over, turn on the 4-ways, honk the horn every 60 seconds, and call 911 assuming incapacitation. Maybe not perfect, but it seems better than stopping in traffic.
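That escalation could be sketched as a simple timed ladder: each stage fires only after the driver has been unresponsive for long enough. This is purely illustrative pseudologic, not anything Tesla ships; the stages and timings are invented.

```python
# Illustrative escalation ladder for an unresponsive driver.
# Stages and timings are invented; NOT Tesla's actual behavior.

ESCALATION = [
    (10, "visual nag on screen"),
    (20, "audible chime"),
    (30, "pull over and stop"),
    (31, "hazard lights on"),
    (60, "honk horn (repeat every 60 s)"),
    (120, "call emergency services"),
]

def actions_due(seconds_unresponsive: int) -> list[str]:
    """All escalation actions triggered by this point in time."""
    return [action for t, action in ESCALATION if seconds_unresponsive >= t]

print(actions_due(25))   # ['visual nag on screen', 'audible chime']
print(actions_due(130))  # all six stages
```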
 
Well, exactly... ideally the car should do all that and maybe even dial 911. But it clearly should not just give up driving and let the car swerve and crash.
I haven't paid attention to this issue in some time - last time I tested it was back in early v11. Does v12, after failing to satisfy the nag, not slow down and stop in the lane?

The one condition I'm worried about is a medical emergency. As we know, if you apply enough torque to the wheel, it will disengage. If someone has a heart attack or otherwise passes out and their body slumps onto the wheel in a way that applies torque, it will disengage and crash. This could be a tough problem to solve. You could use the cabin camera to see whether the person is slumped over the wheel and then ignore the torque takeover, but that could cause problems with federal regulators, where rules in place require that the driver be able to take over from the system for any reason - e.g., pressing the brake should always disengage any ADAS function.
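One way to frame that dilemma: the takeover decision could fuse the wheel-torque signal with the cabin camera's driver-state estimate, honoring strong torque as an intentional takeover only when the driver appears conscious. This is a hypothetical sketch of the tradeoff described above, not a real ADAS policy; the states and threshold are made up, and as noted, regulators may require that brake input always disengage regardless.

```python
from enum import Enum, auto

class DriverState(Enum):
    ALERT = auto()
    DROWSY = auto()
    SLUMPED = auto()  # e.g. cabin camera sees driver collapsed on wheel

# Hypothetical fusion of wheel torque and camera state.
# Threshold is invented; real rules may require brake input
# to ALWAYS disengage, which this sketch preserves.
def should_disengage(torque_nm: float, brake_pressed: bool,
                     driver: DriverState,
                     torque_threshold: float = 2.0) -> bool:
    if brake_pressed:
        return True  # brake is an unconditional override
    if torque_nm < torque_threshold:
        return False
    # Strong wheel torque: honor it only if the driver looks conscious.
    return driver is not DriverState.SLUMPED

print(should_disengage(3.0, False, DriverState.ALERT))    # True
print(should_disengage(3.0, False, DriverState.SLUMPED))  # False
print(should_disengage(0.0, True, DriverState.SLUMPED))   # True
```

The awkward part, as the post says, is that ignoring any takeover input may conflict with rules requiring the driver to always be able to override the system.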