It's happening....FSD v9 to be released (to current beta testers)

You: Humans are bad drivers, the bar is low.
Us: Humans get into a collision (>12 mph) roughly every 2 million miles. It doesn't look like FSD beta is very close to that.
You: Humans are bad in these ways, machines are good in these ways. Autonomous vehicles will replace human drivers when they achieve greater-than-human safety.
Me: Yes. Are we arguing about something?

Yes, indeed.
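To put rough numbers on that gap, here is a minimal back-of-envelope sketch in Python. The 2,000,000-mile human figure comes from the post above; the FSD-beta miles-per-incident value is a purely hypothetical placeholder for illustration, not a measured statistic.

```python
# Back-of-envelope comparison of collision intervals (miles between collisions).
# HUMAN_MILES_PER_COLLISION is the figure cited above; the FSD value below is
# a made-up placeholder purely to illustrate the calculation.
HUMAN_MILES_PER_COLLISION = 2_000_000
ASSUMED_FSD_MILES_PER_INCIDENT = 20_000  # hypothetical, for illustration only

gap = HUMAN_MILES_PER_COLLISION / ASSUMED_FSD_MILES_PER_INCIDENT
print(f"Under these assumptions, FSD beta trails the human rate by ~{gap:.0f}x.")
```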
 
How do you define "done" for FSD? A definition of done (DoD) is challenging for something that needs a bunch of nines in order not to kill people on the road.
That's not up to me. I mean, I was in software development for 30 years, and I know the standard definition of "beta" in that industry: feature complete, but not completely validated. But it's obvious that, as with so many other things (like "two weeks"), Elon has his own special definition of the term.
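To make the quoted "bunch of nines" concrete: one collision per N miles corresponds to -log10(1/N) nines of collision-free per-mile reliability. A minimal sketch, using only the 2,000,000-mile figure cited earlier in the thread:

```python
import math

def nines_per_mile(miles_between_collisions: float) -> float:
    """Express a collision interval as 'nines' of per-mile reliability."""
    p_collision_per_mile = 1.0 / miles_between_collisions
    return -math.log10(p_collision_per_mile)

# One collision per 2,000,000 miles is about 6.3 nines of collision-free miles,
# i.e. a per-mile collision probability of 5e-7.
print(f"{nines_per_mile(2_000_000):.1f}")
```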
 
That's not up to me. I mean, I was in software development for 30 years, and I know the standard definition of "beta" in that industry: feature complete, but not completely validated. But it's obvious that, as with so many other things (like "two weeks"), Elon has his own special definition of the term.
Sadly, "beta" has been degraded over the years to mean more or less anything other than "released". In the original terminology, what Tesla has been releasing is alpha software (aka "work in progress").
 
I don't think it matters what the dude was doing.
What matters is that he was not paying attention and tragically died because of it.
He could have been snorting cocaine -- still does not change the end result.
And even more importantly, the AP1-equipped Model S he was driving was never designed to handle, or advertised as handling, such a situation.
 
And even more importantly, the AP1-equipped Model S he was driving was never designed to handle, or advertised as handling, such a situation.
And yet he believed and other people have died because they believed they could drive without paying attention.

If you read and watch social media, people continue to believe Teslas are competently self-driving to this day.

Of course they are all wrong and foolish, but why does it continue to happen?
 
And yet he believed and other people have died because they believed they could drive without paying attention.

If you read and watch social media, people continue to believe Teslas are competently self-driving to this day.

Of course they are all wrong and foolish, but why does it continue to happen?
I think the environment is much more grey now than it was then. Now, with terms like FSD floating around, the general population has the impression that Teslas are self-driving (which I'm sure most members here can attest to). But back then? AP simply kept its lane on the highway and performed lane changes on demand. TACC kept spacing behind vehicles in your lane. Nowhere was it claimed that the car would avoid or stop for stationary objects, including crossing traffic. And this driver was a knowledgeable Tesla owner, active on this forum. He knew what it could do and what it couldn't.

He was not expecting crossing traffic. For whatever reason, he was not paying attention. He got careless. He got complacent.

The real question here, from a human factors standpoint, is what led him into complacency and what about AP could be improved to reduce complacency, not "why didn't AP stop for this truck?"
 
The real question here, from a human factors standpoint, is what led him into complacency and what about AP could be improved to reduce complacency, not "why didn't AP stop for this truck?"
It seems the answer is "Autopilot enabled him to be complacent." However, as we know, the driver is responsible for not becoming complacent under any scenario less than SAE L3 autonomy.
 
The real question here, from a human factors standpoint, is what led him into complacency and what about AP could be improved to reduce complacency, not "why didn't AP stop for this truck?"
OK, let's start with "what about AP could be improved", or in general with any of the autonomous features.

We see new postings here of people surprised the car has crashed on Smart Summon. Tesla says you didn't read the manual.

People say it doesn't read speed signs, runs over curbs, doesn't turn sharply in Asia, fails to change lanes correctly, and takes exit ramps at unsafe speeds. Most of this is explained in "very clear language" in the manual, which IMO is not very clear at all. Basically, if you have managed to read the manual and are one of the few who are technical enough to understand it, you would know not to expect the car to be perfect.

Everyone else, who didn't read it or can't quite follow the intricacies of the tech speak in the manual (and there are many people like that), has to take it on faith that the car is probably pretty safe and well designed.

We all can scoff at these foolish people who didn't read the manual, but that's the first failure to tackle in your "human factors" proposal. These are not systems for trained professionals. Are they too dangerous to be used safely by untrained novices?
 
We all can scoff at these foolish people who didn't read the manual
If you had bothered to read the post to which you are responding, you would have realized that I stated he DID understand his car very well and he did understand the limitations in the manual. The point I was making is that this particular accident should not be used as a data point to support the general conclusion that FSD capabilities are not clearly communicated by Tesla and not well understood by the general public. This conclusion may very well be true, but you shouldn't use this one particular accident to support your argument.
 
If you had bothered to read the post to which you are responding, you would have realized that I stated he DID understand his car very well and he did understand the limitations in the manual. The point I was making is that this particular accident should not be used as a data point to support the general conclusion that FSD capabilities are not clearly communicated by Tesla and not well understood by the general public. This conclusion may very well be true, but you shouldn't use this one particular accident to support your argument.
Sorry, I didn't mean to offend. I was generalizing to the types of people in my post. If the contention is that J. Brown was a knowledgeable user, then yes, his accident would tend toward complacency.
 
Sorry, I didn't mean to offend. I was generalizing to the types of people in my post. If the contention is that J. Brown was a knowledgeable user, then yes, his accident would tend toward complacency.
When GPS navigation units were starting to become popular, there were several examples of people "following the route" and trusting the navigation instructions with comical (at best) or tragic results.

I know that when I read a map and then drive, I learn the route pretty well. When Google Maps guides me, I retain less. When my passenger guides me turn by turn, I retain almost nothing. There's some kind of leader/follower dichotomy in the brain that I suspect is quite an ancient & primitive instinct. This is at the core of the famous old Lemmings meme and the amusing videos of baby ducks who can get attached to the wrong mother figure - but it's there in us also.

We can roll our eyes at those who zone out on L2 assistance, but I think it's a real human trait to be lulled into the follower role, and a more powerful draw than we'd like to admit.

8 drivers who blindly followed their GPS into disaster
 
And yet he believed and other people have died because they believed they could drive without paying attention.

If you read and watch social media, people continue to believe Teslas are competently self-driving to this day.

Of course they are all wrong and foolish, but why does it continue to happen?
Because people "believe" what they choose to believe, even if it is unsubstantiated nonsense. The earth is flat, covid vaccines are fake and designed to add a chip to everyone so Bill Gates can spy on them (why?) and covid is caused by cell phone towers.

People can choose to believe whatever they like, but if those beliefs fly in the face of factual, evidence-based information, and they get hurt as a result, then they have no one to blame but themselves. Sadly, other innocent bystanders often get hurt as well.

The root causes are subtle and psychological, but can be summarized as a combination of Dunning-Kruger and egotism.