There may be brilliance in exquisitely trained NNs that we cannot conceive of.

There may be a level of possible behavior that is beyond most human conception.

These are likely to be judged harshly by humans in our typical shortsightedness, and this is likely unavoidable.

I offer a real-world example: I know someone who, over a period of 6 years of driving, never made a left turn.

This is possible. It was done quite safely, consciously, and successfully, although it required a lot of maps (pre-GPS navigation).

If FSD stopped making left turns it would appear maddening to most/all of us, but it is possible to achieve. The consequences of such a choice would likely be profound on many levels, with many costs. And there are likely other hidden possibilities to be discovered.

In an RT world, would humans accept a fee structure based on calculations they could not decipher, but that offered tiers of choices and fees with planning optimized around safety, timeliness, or cost?

FSD in a human centric environment is going to be surprising and hopefully a bit humbling.
I don't think there's anything profound, technically impressive, or otherwise appealing about substituting a left turn with 3 right turns. I want to see them solve the problem that was posed, not solve an easier problem instead.
 
The amount of inconsistency from one drive to the next is stunning. Yesterday the car just kept driving with wheels on the double yellow line rumble strip on a straightaway until I disengaged.

On a related note, has anyone been pulled over due to FSD behavior? If so, what did you tell the officer?
A commenter on one of Chuck's YouTube videos was pulled over for speeding while he was using FSD. He explained to the cop that he was using the self-driving feature of his car, and the cop ended up letting him off with a warning. I think he was doing 40 in a 30.

I disengage if I happen to notice police around. Not worth the risk of FSD doing something weird at the worst time. The plus side of using FSD, though, is that I'm less likely to get pulled over for speeding than if I were driving myself. Since I'm occupied with observing and analyzing FSD, I find I'm not so bothered by FSD driving a bit slower than I would, as long as it's not hindering traffic.
 
Yep, FSD definitely lets cars in - very polite. Most of the time it's fine, but I'm careful, and I have closed the gap a couple of times with a little pressure on the accelerator when I didn't think it was safe. I've also experienced the not-blocking-an-intersection behavior.
It's awkward sometimes when the other driver looks to you for confirmation and you're not sure if you should wave them on, since you're not sure if FSD will actually stay in place. 😅
 
I'm saying the L4+ accident rate that Cruise was using as the benchmark is a good number to go with. Nothing to do with Cruise per se.
Why not use Tesla’s 1 per million mile metric?
Severe collisions are what really matter. Talking about collisions that are equivalent to curb rash just confuses things. The evidence shows you can’t extrapolate from L4 collision rate to severe collision rate.
 
I got to drive our Model Y again today. It seems to confirm my earlier impression: our Y, using the same software and settings, feels significantly more "assertive" than our Model 3. Assertive in this case means stopping and starting faster (more aggressive). Autospeed seems to get to higher relative speeds. It's possible the perception comes from the differences in the vehicles (e.g. the ride height). Anyone else with a 3 and a Y notice any differences?
 
Severe collisions are what really matter. Talking about collisions that are equivalent to curb rash just confuses things. The evidence shows you can’t extrapolate from L4 collision rate to severe collision rate.
Which evidence? The VTTI study seems to show that accident rates decrease roughly in proportion for all severity levels (for autonomous car vs human), at least for levels 1 - 3 in one of their charts. This seems to imply that e.g. halving the minor collision rate (through software/hardware advancement) would be likely to roughly halve the severe collision rates as well. Not in a direct causal sense of course, but because improving the overall software/hardware would tend to reduce mistakes at all levels of severity, and because such improvements tend to not be specifically targeted at narrow accident types (e.g. reducing curb strikes). (Some may be, but it's the exception.)
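
To make the proportionality reading concrete, here's a toy Python sketch with made-up rates (not the VTTI numbers): if an improvement scales collision rates by roughly the same factor at every severity level, halving the minor-collision rate goes hand in hand with halving the severe rate.

```python
# Toy illustration of the proportional-scaling assumption above.
# The rates below are hypothetical, NOT figures from the VTTI study.
baseline_per_million_miles = {
    "level_1_minor": 4.0,
    "level_2_moderate": 1.0,
    "level_3_severe": 0.1,
}

improvement_factor = 0.5  # e.g. a software/hardware change that halves mistakes overall

# If the improvement isn't targeted at one narrow accident type,
# it plausibly scales every severity bin by roughly the same factor.
improved = {sev: rate * improvement_factor for sev, rate in baseline_per_million_miles.items()}

for sev, rate in baseline_per_million_miles.items():
    print(f"{sev}: {rate:.2f} -> {improved[sev]:.2f} per million miles")
```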
 
Are you one of those people who stops with their cart right in the middle of the aisle at the grocery store, completely blocking the lane while you decide whether you want Cheerios or Golden Grahams for breakfast?
LOL… You seem to have some bad eggs in your neighborhood grocery store!

FSD is simply following the rules, cautiously, as it should. Not sure why people are mad about it.
 
I want to see them solve the problem that was posed, not solve an easier problem instead.
A large part of what we may be asking relates to human nature. Human nature requires speed limits. Human nature requires stop signs and on and on.

We build a world of laws because it is better than a world of brutes and emotional manifestations of frustration, possession, or egotism…

If we very carefully define the real-world requirements of replacing vehicle operation for transportation, we may find ourselves with unexpected solutions.

We are asking NNs to solve simple problems like getting from A to B, layered upon complex human emotional shortcomings we are hardly aware of or willing to admit.

It might be fun and surprising. 🤔
 
I've gotten so many "Thank you" waves from pedestrians and other drivers on FSD. I always chuckle to myself that they're thanking a machine for driving courteously.
They're thanking you for being courteous. It doesn't matter if you chose to do it manually or use an automatic system, it was still courtesy. A discourteous person would push the car through, which remains an option to you. A discourteous person would also not even engage the automatic system, perhaps because it was so slow and bleeping "courteous".
 
Why not use Tesla’s 1 per million mile metric?
Severe collisions are what really matter.
We have to pick a metric, but I'm not sure Tesla's definition of a severe collision is it. (Although I'll give them credit for including being rear-ended, since the majority of those are likely the fault of the other driver.) I've seen posts with pics of some pretty impressive damage, but airbags didn't deploy and no crash footage appears to have been saved. As well, damage to the other cars should be part of the criteria; the bent fender for a Humvee could be catastrophic for the Nissan Micra it hit.

It is the minor stuff (like curbing a wheel or hitting a pillar while parking) that I care about almost as much, since it adds to the cost of ownership (new tires, body work, a rental car while awaiting the bodywork). I think of paying for any such repairs as an idiot tax levied because I was an idiot. If I had to pay for that damage because FSD is an idiot, I'd be steaming (and so I hardly use FSD at all, because I know Elon is an idiot and I don't trust him not to push out faulty software).

We collect the amount of miles traveled by each vehicle with Autopilot active or in manual driving, based on available data we receive from the fleet, and do so without identifying specific vehicles to protect privacy. We also receive a crash alert anytime a crash is reported to us from the fleet, which may include data about whether Autopilot was active at the time of impact. To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed. (Our crash statistics are not based on sample data sets or estimates.) In practice, this correlates to nearly any crash at about 12 mph (20 kph) or above, depending on the crash forces generated. We do not differentiate based on the type of crash or fault (For example, more than 35% of all Autopilot crashes occur when the Tesla vehicle is rear-ended by another vehicle). In this way, we are confident that the statistics we share unquestionably show the benefits of Autopilot.
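
For what it's worth, the methodology in that quote boils down to two checks: a severity proxy (did an airbag or other active restraint deploy?) and an attribution window (was Autopilot active at impact, or within the 5 seconds before?). Here's a rough Python sketch of that logic as stated; the field and function names are mine, not Tesla's, and this is obviously not their actual pipeline.

```python
from dataclasses import dataclass
from typing import List, Optional

# Sketch of the counting rules described in the quote above.
# Field names are hypothetical; this is not Tesla's actual pipeline.

AP_WINDOW_S = 5.0  # crashes within 5 s of Autopilot deactivation still count as AP crashes


@dataclass
class CrashAlert:
    restraint_deployed: bool                      # airbag or other active restraint deployed
    autopilot_active_at_impact: bool
    seconds_since_autopilot_off: Optional[float]  # None if Autopilot was never engaged


def counts_toward_statistics(crash: CrashAlert) -> bool:
    """Only crashes severe enough to deploy an airbag or other active restraint
    are counted (roughly 12 mph / 20 kph and above, per the quote)."""
    return crash.restraint_deployed


def attributed_to_autopilot(crash: CrashAlert) -> bool:
    """Attributed to Autopilot if it was active at impact, or was deactivated
    within the 5-second window before impact."""
    if crash.autopilot_active_at_impact:
        return True
    return (crash.seconds_since_autopilot_off is not None
            and crash.seconds_since_autopilot_off <= AP_WINDOW_S)


def crashes_per_million_miles(crashes: List[CrashAlert], fleet_miles: float) -> float:
    counted = sum(1 for c in crashes if counts_toward_statistics(c))
    return counted / (fleet_miles / 1_000_000)
```

As the quote itself says, neither fault nor crash type is differentiated; only restraint deployment and the 5-second window factor into the number.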
 
Yeah I’m not claiming the poster is lying…but something isn’t right. The described behavior is not characteristic of v12 running properly. Maybe the car needs a reboot, calibration, or something else.
Just an update - I tried the reboot and camera recalibration and sure enough the difference is HUGE. Although it still accelerates a bit hard for my taste it's a lot smoother, and the lane following and braking behind stopped traffic is dramatically improved. Before, it was like it just couldn't judge distances. I went from disengaging many times per drive to basically zero disengagements for whole drives now. While I still think there's room for improvement, this is much more in line with the positive experiences I've read.

Thank you again for the suggestion and the understanding reflected in your response.
 
I try to reboot after each update, just for the heck of it and because it's really easy to do. Recalibrating cameras is a PITA and takes too long, so I only do that when it's really needed. If your car is that messed up, maybe it will help. 🤷‍♂️
I've only recalibrated the car cameras twice, but it was doing pretty silly stuff before that, and it seemed to resolve the overall weirdness I was experiencing with FSD (then Beta). YMMV.
Yep, sure enough, recalibrating the cameras made a world of difference. It feels like a completely different system and is now very usable. It did take a VERY long time to recalibrate for FSD, probably over 100 km of highway driving.
 
I don't think there's anything profound, technically impressive, or otherwise appealing about substituting a left turn with 3 right turns. I want to see them solve the problem that was posed, not solve an easier problem instead.
One of the delivery companies, either FedEx or UPS, trains its drivers and plans routes to avoid left turns. Apparently it reduces accidents and fuel costs!!
 
No I think he had it right.
Affect as a noun (pronounced AF-fect). A purposeful and somewhat over-expressed response or demeanor, put on for effect.

Like an over-the-top actor delivering a line or a comedian playing for the laugh.

Or on the internet, an emoji to communicate the tone - the affect.

It can be a good or a bad thing, but sometimes annoying. Sometimes when a person is prone to a put-on accent or demeanor, people will say that's an affectation.

I thought @APotatoGod used it purposefully and cleverly there. Good but uncommon word, and I know that the clever joke is diminished by having to explain it. :)
OK - it is good to learn something new every day. @APotatoGod - intentional or accidental?
 