Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Not an Elon tweet, but a Tesla one:

Disclaimer at the start says it's V11. But it's definitely running in "Elon mode" with no steering wheel torque required.
Yeah, the elephant in the room there is that it was false advertising because that's not the product that they sell/subscribe. We have to keep our hands on the wheel.

1. The car made no effort to clear the through lane when it pulled up behind traffic for the left turn for the highway entrance ramp.
2. Once entering the highway, it started to pass the white Tesla on the right before deciding that it better not.
3. When crossing multiple lanes on the exit ramp, it hung up on the first lane that was backed up before moving over.
4. When turning right towards the end, it again hung up on the turn.
5. The driver got out of the car while it was sitting in the middle of the road.

I've gotten all of these myself, though I've yet to get out of the car wherever it happens to stop. I wonder how many people who watch the video will notice this stuff - or would tolerate them if they were the responsible driver.
 
Yeah, the elephant in the room there is that it was false advertising because that's not the product that they sell/subscribe. We have to keep our hands on the wheel.

Here's the full fine-print at the beginning:

"This video shows current driving capabilities of FSD Beta, which began deploying around 8-23-2023 as software update version 11.4.7 or later. For demonstrative purposes, the cabin camera driver monitoring remained active but the hands-on steering wheel requirement was disabled (customers cannot disable this feature). All drivers must remain attentive and be ready to take over at any time."

I'm not a lawyer, but I'm guessing that text above was written by one to cover themselves.
 
Yeah, the elephant in the room there is that it was false advertising because that's not the product that they sell/subscribe. We have to keep our hands on the wheel.

1. The car made no effort to clear the through lane when it pulled up behind traffic for the left turn for the highway entrance ramp.
2. Once entering the highway, it started to pass the white Tesla on the right before deciding that it better not.
3. When crossing multiple lanes on the exit ramp, it hung up on the first lane that was backed up before moving over.
4. When turning right towards the end, it again hung up on the turn.
5. The driver got out of the car while it was sitting in the middle of the road.

I've gotten all of these myself, though I've yet to get out of the car wherever it happens to stop. I wonder how many people who watch the video will notice this stuff - or would tolerate them if they were the responsible driver.
At least it was disclaimed about the hands on the wheel - just like all car commercials that say "closed course with professional driver - do not attempt".

1. I see what you're referring to, but it didn't bother me. Many humans do this where I am, so it's normal for me.
2. Don't see the problem here; it may have thought there was enough of a gap in front of the white Tesla, but as it got closer it saw there wasn't, so it adjusted.
3. It could do a better job of crossing multiple lanes in one motion, but at this point it still makes a brief pause in each new lane before shifting to the next one.
4. I see what you mean, and it could have been smoother
5. Yeah, it's not Waymo yet :) Hopefully they'll add some auto-parking option so that when you reach your destination it tries to find a spot to pull over and park.
 
Repeat marketing message from 2016.
Um. The 2016 message was kind of semi-faked and all that. THAT video we just watched wasn't, and it does replicate pretty much what I've seen on 11.4.7.

This particular run didn't have any interventions, and I've had runs on 11.4.7 that went for miles and miles without any at all. So, as a capability demo, it works and isn't faking.
 
At least it was disclaimed about the hands on the wheel - just like all car commercials that say "closed course with professional driver - do not attempt".
I didn't even notice it, partly because the viewer controls come up over top of it, and partly because I can't be bothered to read fine print that small with my eyes the way they are these days.
2. Don't see the problem here; it may have thought there was enough of a gap in front of the white Tesla, but as it got closer it saw there wasn't, so it adjusted.
I picked up on that as a result of going through the same experience. Mine was worse because there was just enough time for FSD to dig a nice deep hole for itself by getting alongside the other car as the merge lane was ending. It's one of those times where you can either play chicken with FSD to see if it'll sort itself out and possibly leave you with making an insurance claim - or just take over.

I no longer allow FSD to handle a merge if I have a lead car. To me, the idea of crowding a car like that in a merge lane comes straight out of crazy town.
 

We know if Elon is saying anything it's not true. As sure as the sun rising tomorrow.
Hm. As things go at Tesla and such, Elon, being in the middle of the whirlwind, probably has better knowledge of what's going this way and that than any of us. Doesn't mean that he guesses right all the time.

However, the thought crosses my mind: It might not be Tesla wanting to ditch the torque requirement that's ruling this decision, it could likely be a regulator like the NHTSA. Remember, it was those guys who forced Teslas running FSD to stop at the white line, then creep forward, then analyze the turn, unlike any human who drives, and Tesla had evidence that humans didn't drive that way.

My understanding is that other manufacturers of nominal self-driving cars (Bluedrive, etc.) use IR cameras in the cabin that actually can track eyeballs, even when people are wearing sunglasses. Teslas don't have IR cameras in there; as people in the FSD 10.x and 11.x threads have noted, putting on a pair of sunglasses often defeats that part of the driver monitoring. Get rid of the torque-y stuff and suddenly there's no driver monitoring.

Elon saying, "People who have driven 10k miles shouldn't have to do the torque dance" is fine - but all it would take is another driver in there using the profile of someone who's done the 10k. So the NHTSA may have objected. Or the people who actually have to talk to the NHTSA saw the train wreck coming and put the change in slow-mo.
 
Hm. As things go at Tesla and such, Elon, being in the middle of the whirlwind, probably has better knowledge of what's going this way and that than any of us. Doesn't mean that he guesses right all the time.
Elon has a long history of completely false FSD tweets going back to 2016. He has been wrong time and time again about FSD. In fact he has never been correct about a full self-driving (as opposed to partial self-driving) timeline. If he is guessing he should state that.
 
However, the thought crosses my mind: It might not be Tesla wanting to ditch the torque requirement that's ruling this decision, it could likely be a regulator like the NHTSA. Remember, it was those guys who forced Teslas running FSD to stop at the white line, then creep forward, then analyze the turn, unlike any human who drives, and Tesla had evidence that humans didn't drive that way.
Given the NHTSA approach, if the car isn't going to stop at a stop sign, the driver knows as soon as it fails to slow for the white line. Without that, a failure by the car would mean that the driver would have much less time to react. I'm fine with the current cautious solution so long as it isn't viewed as ideal. When the systems are shown to be reliable, they should be permitted to drive according to their abilities - including stuff like rolling a stop sign.
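For what it's worth, the mandated sequence being described (stop at the line, creep, evaluate, then go) amounts to a small state machine with no shortcut for a rolling stop. A purely illustrative sketch - state names and structure are my own invention, not anything from Tesla's software:

```python
# Illustrative state machine for the NHTSA-style stop-sign sequence
# described in the thread; names are hypothetical, not Tesla's.
def next_state(state: str, intersection_clear: bool = False) -> str:
    transitions = {
        "APPROACH": "STOP_AT_LINE",   # always come to a complete stop first
        "STOP_AT_LINE": "CREEP",      # then inch forward past the white line
        "CREEP": "EVALUATE",          # until cross traffic is visible
        "EVALUATE": "PROCEED" if intersection_clear else "EVALUATE",
        "PROCEED": "PROCEED",         # committed to the turn
    }
    return transitions[state]

# Note there is no APPROACH -> PROCEED transition: the rolling stop a
# human might make (and which the post argues should eventually be
# allowed for proven systems) simply isn't reachable here.
```

The upside the post points out falls out of the structure: if the car blows past STOP_AT_LINE, the driver learns something is wrong while there's still time to react.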
 
Elon has a long history of completely false FSD tweets going back to 2016. He has been wrong time and time again about FSD. In fact he has never been correct about a full self-driving (as opposed to partial self-driving) timeline. If he is guessing he should state that.
He has said in the past that he is extrapolating. So, for example, if the disengagement rate comes down by 10% in a month, he thinks it will be down 70% in a year. In fact he thinks it will get better "exponentially", i.e. the rate of advancement will increase. Obviously wrong.
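To be fair to the arithmetic, a steady 10% monthly drop really does compound to roughly a 70% drop over a year - it's simple geometric decay. A quick check (numbers hypothetical, just illustrating the extrapolation):

```python
# Compounding a hypothetical steady monthly improvement in the
# disengagement rate, as described in the post above.
def compounded_reduction(monthly_drop: float, months: int) -> float:
    """Fraction by which the rate has fallen after `months` of steady drops."""
    return 1 - (1 - monthly_drop) ** months

# A 10% drop every month compounds to about a 72% drop over 12 months.
print(round(compounded_reduction(0.10, 12), 3))  # → 0.718
```

The questionable part isn't the math, it's the premise that the monthly improvement rate holds steady (let alone accelerates) for a year.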
 
My understanding is that other manufacturers of nominal self-driving cars (Bluedrive, etc.) use IR cameras in the cabin that actually can track eyeballs, even when people are wearing sunglasses. Teslas don't have IR cameras in there; as people in the FSD 10.x and 11.x threads have noted, putting on a pair of sunglasses often defeats that part of the driver monitoring. Get rid of the torque-y stuff and suddenly there's no driver monitoring.
My understanding is the new Model S does indeed have this (with an IR light) so that it can monitor eye position even at night. Probably also true of the newer model X, not sure of Y or Highland model 3.
 