
Another tragic fatality with a semi in Florida. This time a Model 3

Let's not forget that he was in a 2-ton car. It's an unguided missile on a busy public road. If you can pick the wrong moment to do some texting, you can pick the moment a minibus of kids without seatbelts gets in your way instead of a semi or a lane divider. Had the driver lived, he should have been tried for a profoundly serious road crime. This goes beyond Darwin awards. And Tesla allowing it to happen, hiding behind a notice on the screen, from the safest car maker on the planet, is a slap in the face of anyone in the triple-digit IQ club.
All automobile manufacturers offer cruise control. It's not just Tesla that allows you to crash your car if you don't pay attention. Do you think Tesla is being unethical for offering an auto-steer driver assist feature? Are you afraid to use AP while driving your Tesla because it might let you crash?
 
I have to disagree with this; my collision detection has screeched at me when a car 2 or 3 ahead of me has suddenly slowed. I never tailgate and usually hang back at least 6-8 car lengths, but I can clearly see on the screen that the system is picking up 3 or 4 cars ahead comfortably, and it definitely knows if one of them slows suddenly.

Ok. I guess I just have not seen it do that yet without also seeing brake lights from the tracked vehicle. Guess there is sometimes some scatter.

But I would say it certainly is not a “capability” of the system in the sense it could reliably do so, as physically that would be very difficult or impossible for any system (even LIDAR ;))

It would be bad for people to start thinking their car had this capability (reliably, anyway), so I was just trying to make sure people know it can't really do this.
 
I have to disagree with this; my collision detection has screeched at me when a car 2 or 3 ahead of me has suddenly slowed. I never tailgate and usually hang back at least 6-8 car lengths, but I can clearly see on the screen that the system is picking up 3 or 4 cars ahead comfortably, and it definitely knows if one of them slows suddenly.
It's an amazing feat, isn't it? With deficiencies in other aspects of the software impacting safety, it makes you wonder whether they even tried or changed anything after the casualties. The road to FSD won't be there until they fix the way they approach these challenges, let alone any timeline being credible. There are grave issues. But the best we get from the FSD effort is "LIDAR sucks".
Well, maybe it would help trigger an emergency braking operation. A RoboTaxi will not have a driver presumed to be operating the vehicle. Death by RoboTaxi: is that a cool thing to have typed up about you?
 
  • Like
Reactions: OPRCE and GeoX750
Ok. I guess I just have not seen it do that yet without also seeing brake lights from the tracked vehicle. Guess there is sometimes some scatter.

But I would say it certainly is not a “capability” of the system in the sense it could reliably do so, as physically that would be very difficult or impossible for any system (even LIDAR ;))

LIDAR would have less ability to spot this than a camera, unless the LIDAR was mounted on the roof, because what allows the camera to "see" this is the relative motion between the car in front and the object in front of that, via the wide-angle lens on the top of the windshield.

Regardless, the point really is that the further back you are from the car in front, the more likely you (or your AP) are to see issues further ahead. That's one of the reasons tailgating is so dangerous: you literally can only react when the car in front brakes, and then often it's too late.
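To put rough numbers on that (a back-of-envelope sketch with assumed figures, not anything from Tesla's documentation: 70 mph, a 1.5 s human reaction time, and 0.8 g of braking are all illustrative):

```python
# Rough stopping-distance arithmetic; every figure here is an assumption.
G = 9.81                       # m/s^2
speed = 70 * 0.44704           # 70 mph is about 31.3 m/s
reaction_time = 1.5            # s, a common rule of thumb for human drivers
decel = 0.8 * G                # ~7.8 m/s^2, hard braking on dry pavement

reaction_dist = speed * reaction_time    # ground covered before braking begins
braking_dist = speed**2 / (2 * decel)    # v^2 / (2a)

print(f"reaction: {reaction_dist:.0f} m")                 # ~47 m, ~10 car lengths
print(f"braking:  {braking_dist:.0f} m")                  # ~62 m
print(f"total:    {reaction_dist + braking_dist:.0f} m")  # ~109 m
```

At those numbers the reaction distance alone is around 10 car lengths, so a 6-8 car-length gap is used up before the brakes even bite; every extra second of warning you get from watching traffic beyond the lead car buys back about 31 m.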
 
  • Like
Reactions: jerry33
Regardless, the point really is that the further back you are from the car in front, the more likely you (or your AP) are to see issues further ahead. That's one of the reasons tailgating is so dangerous: you literally can only react when the car in front brakes, and then often it's too late.

Totally agree with this. It is the same reason you increase following distance prior to passing someone.

My main point is that the radar and cameras have very limited capability. This is according to Tesla; I learned this (well, I guess I kind of suspected already) from reading the Owner’s Manual.
 
Totally agree with this. It is the same reason you increase following distance prior to passing someone.

My main point is that the radar and cameras have very limited capability. This is according to Tesla; I learned this (well, I guess I kind of suspected already) from reading the Owner’s Manual.
So does the human eye/brain. At least the cameras/software are improving.
 
  • Funny
Reactions: AlanSubie4Life
LIDAR would have less ability to spot this than a camera, unless the LIDAR was mounted on the roof, because what allows the camera to "see" this is the relative motion between the car in front and the object in front of that, via the wide-angle lens on the top of the windshield.

Regardless, the point really is that the further back you are from the car in front, the more likely you (or your AP) are to see issues further ahead. That's one of the reasons tailgating is so dangerous: you literally can only react when the car in front brakes, and then often it's too late.
If the car in front of you is stopping using its brakes, then you should be able to stop in time (assuming you have instant reaction time, like a computer). This seems like the easiest type of accident to avoid no matter what sensor suite you're using.
 
  • Disagree
Reactions: Zhelko Dimic
If anything is to blame here, it's not so much the engineers but rather the hubris of upper management ...

You ignore the most important fact to reach an irrational conclusion. Tesla's cars do not have autonomous driving and management doesn't claim they do.

In other words, when using a driver aid like Autopilot, the driver is responsible for monitoring the current conditions. That obviously didn't happen in this instance and is the primary cause of this accident, not the "hubris of upper management".

Yes, your conclusion is as irrational as they come.
 
If the car in front of you is stopping using its brakes, then you should be able to stop in time (assuming you have instant reaction time, like a computer). This seems like the easiest type of accident to avoid no matter what sensor suite you're using.

I was talking about humans, not machines; otherwise, you're probably right.

Even so, I don't think it's good practice to drive one car length behind another just because you have the benefit of a nanosecond reaction time, because 1) you're likely to anger the person in front, 2) if they brake hard enough they may out-brake the Tesla and you'll still crash, and 3) they may hit the car in front of them and stop dead, and then you'll do the same.
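Point 3 is the brutal one, and it's easy to put numbers on (again a sketch with assumed figures, not a claim about any particular car): if the lead car stops dead against an obstacle, even a follower with zero reaction time has to shed all of its speed within the existing gap.

```python
import math

# Zero-reaction-time follower at highway speed; figures are illustrative.
speed = 31.3    # m/s, ~70 mph
decel = 7.8     # m/s^2, ~0.8 g, assumed for both cars

# Scenario A: the lead car brakes at the same rate. Both cars travel
# v^2/(2a) while stopping, so the initial gap is preserved -- no impact.
braking_dist = speed**2 / (2 * decel)   # ~63 m for each car

# Scenario B: the lead car hits something and stops dead one car length ahead.
gap = 5.0       # m, roughly one car length
impact_speed = math.sqrt(max(speed**2 - 2 * decel * gap, 0.0))

print(f"braking distance needed: {braking_dist:.0f} m")
print(f"impact speed across a {gap:.0f} m gap: {impact_speed:.0f} m/s "
      f"(~{impact_speed / 0.44704:.0f} mph)")   # ~30 m/s, about 67 mph
```

Instant reaction time covers scenario A at almost any following distance, but in scenario B only the gap itself saves you, which is why no reaction time, and no sensor suite, makes tailgating safe.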
 
You ignore the most important fact to reach an irrational conclusion. Tesla's cars do not have autonomous driving and management doesn't claim they do.

In other words, when using a driver aid like Autopilot, the driver is responsible for monitoring the current conditions. That obviously didn't happen in this instance and is the primary cause of this accident, not the "hubris of upper management".

Yes, your conclusion is as irrational as they come.
I see people on this forum say all the time that EAP is better than most humans. I feel like Tesla is partially responsible for creating that impression. You've got the CEO saying things like this:
“Well, we already have Full Self-Driving capability on highways. So from highway on-ramp to highway exiting, including passing cars and going from one highway interchange to another, Full Self-Driving capability is there."

I worry that if Tesla is not careful they will make EAP less safe (if it's not already) in real world usage and the system will be banned. None of us want that.
 
I see people on this forum say all the time that EAP is better than most humans. I feel like Tesla is partially responsible for creating that impression. You've got the CEO saying things like this:
“Well, we already have Full Self-Driving capability on highways. So from highway on-ramp to highway exiting, including passing cars and going from one highway interchange to another, Full Self-Driving capability is there."

I worry that if Tesla is not careful they will make EAP less safe (if it's not already) in real world usage and the system will be banned. None of us want that.

The problem was that the driver took his eyes off the road and crashed. All the Tesla literature (on screen, on their website, and in their sales materials) makes this crystal clear. But this does not mean people won't still win Darwin Awards.

You cannot fix stupid. Or irrational analysis.
 
I see people on this forum say all the time that EAP is better than most humans. I feel like Tesla is partially responsible for creating that impression. You've got the CEO saying things like this:
“Well, we already have Full Self-Driving capability on highways. So from highway on-ramp to highway exiting, including passing cars and going from one highway interchange to another, Full Self-Driving capability is there."

I worry that if Tesla is not careful they will make EAP less safe (if it's not already) in real world usage and the system will be banned. None of us want that.

Agreed - this is an unpopular truth, but it's a "have your cake and eat it" situation where he can claim "FSD on the highway" for marketing and bragging rights, but then say "the driver needs to pay full attention" when there is an accident.

If the non-highway "FSD" is branded under the same criteria, then it will need full supervision for the foreseeable future. Most observers, and even most Tesla aficionados, understand this.

I've personally never conflated FSD with 100% autonomy in my own mind, but some people do.
 
  • Like
Reactions: OPRCE and jsmay311
You ignore the most important fact to reach an irrational conclusion. Tesla's cars do not have autonomous driving and management doesn't claim they do.
Really now. Here's a quote from Elon Musk from the Q4/18 earnings call:

“Well, we already have Full Self-Driving capability on highways. So from highway on-ramp to highway exiting, including passing cars and going from one highway interchange to another, Full Self-Driving capability is there."

Not to mention the recent re-classification of "Navigate on Autopilot" as part of a "Full Self Driving" package, or countless other remarks over the years that were clearly designed to create the impression that what Tesla does is self-driving.

I agree with you on one thing though: at this point it's absolutely irrational to take anything they say about autonomous driving at face value.
 
  • Like
Reactions: OPRCE
  • Love
Reactions: Daniel in SD
The problem was that the driver took his eyes off the road and crashed. All the Tesla literature (on screen, on their website, and in their sales materials) makes this crystal clear. But this does not mean people won't still win Darwin Awards.

You cannot fix stupid. Or irrational analysis.
I think you're missing my point. When it comes to automotive safety, real-world results are what matter most. Somehow people are getting the impression that EAP is "better" than most human drivers. If Tesla can't fix this problem, then they may be forced to stop people from using it on public roads.
 
The Nissan LEAF used to make you agree to a waiver every time the car started, and it had no driver assistance features. I can't begin to understand why Tesla does not make everyone sign a separate waiver at purchase that explains AP and how to use it, plus one on the screen that reminds people once a month or with every software update. Far too many people claim ignorance, but this seems like a no-brainer. Tesla seems to love flirting with liability; "Not a Flamethrower" is a perfect example.
 
  • Like
Reactions: OPRCE and Cloxxki
It's a horrible tragedy.

It sounds like the convergence of bad factors and bad timing. The driver, seeing a nice, well-marked state road with a clear path ahead, light traffic, and good weather, all ideal conditions for AP, engages AP and takes his eyes off the road for a few seconds. Unfortunately, the timing was horrible, because a semi just happens to cross in front of him at that exact moment. And that scenario of a semi truck crossing in front of you is one of the rare cases that AP cannot handle.

This will obviously be something that FSD will be able to handle. Once the front side cameras become active, FSD will be able to better track cross traffic and slow down preemptively before the vehicles cross in front of you.

All recent versions of AP use all of the cameras, if you look at some of the analyses. But again, since it happened a while ago, I'm not sure what version his car was on.

Nevertheless, what bothers me is that AEB, at least, should have engaged and stopped the car or reduced speed considerably. Neither of these happened, for whatever reason.

Unfortunately, the poor soul lost his life for not paying attention for 10 seconds. :(
 
  • Like
Reactions: OPRCE
I've personally never conflated FSD with 100% autonomy in my own mind, but some people do.
Honestly, I don't see how anyone who doesn't spend every day on Tesla forums would not expect something called "full self-driving" to mean autonomous driving. Tesla's own descriptions of "full self-driving" clearly meant autonomous driving before it was re-defined earlier this year.
 
Honestly, I don't see how anyone who doesn't spend every day on Tesla forums would not expect something called "full self-driving" to mean autonomous driving. Tesla's own descriptions of "full self-driving" clearly meant autonomous driving before it was re-defined earlier this year.

I agree, but Tesla has been misleading people for 3 years on this, and a lot of folks are all too willing to drink it up. And while some people have been pointing this out for a while, they get voted down into oblivion or insulted.

This often comes up when referring to FSD allowing people to lease out their cars and make money from them as robo-taxis. Some people thought it was a 2-year time-frame; I know I said more like 10-15 years and got laughed at. But a robo-taxi needs full autonomy in the vast majority of situations, with no intervention, whereas NoA, and any non-highway equivalent, clearly relies on 100% attention from a driver.

What's frustrating (as far as any web forum can be truly frustrating) is that being skeptical about FSD's capability in the near term doesn't diminish Tesla's advantage in the marketplace, which is very real. No one else is any closer than they are.
 