Welcome to Tesla Motors Club

Another tragic fatality with a semi in Florida. This time a Model 3

Status
Not open for further replies.
I wish they would stop saying that the car didn't detect that the driver's hands were on the steering wheel in these accidents. The car didn't detect any torque on the steering wheel; it has no sensors to detect whether or not the driver was actually holding it.

Yeah, it is pretty meaningless for the most part. He engaged AP 10 seconds before the crash; that is probably the more interesting piece of information.
 
Also, would it matter if the driver had pretended to be keeping tabs on what the car's doing with a hand or an orange on the wheel, or even a bag of coins? These accidents happen only when:
- The driver decides it's OK to look away from the road for several seconds on end.
- The car doesn't monitor the driver's eyes independently of steering wheel torque.
- The driving aids aren't ready to back up the driver's disinterest in the road.

Since a nearly identical deadly AP accident occurred years ago, and no significant braking seems to have occurred here either, progress on AP/FSD appears to be less than optimal. If regulators need to bring in stationary fire trucks and road-crossing semis to test a self-driving system, are we even in the right century to entertain the idea of autonomous cars?
 
There will probably never be a standardized test for autonomous vehicles. It will all be done with statistical analysis: you'll drive with safety drivers for millions of miles and count disengagements and accidents. There are plenty of autonomous vehicle prototypes that use the “crutch” of LIDAR to avoid ever running into the side of semi trucks.
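The statistical approach described above is essentially rate estimation: count safety-relevant events over exposure miles and attach an uncertainty band. A minimal sketch in Python; all fleet numbers here are hypothetical, not real data from any manufacturer.

```python
import math

def disengagement_rate(disengagements: int, miles: float):
    """Estimate disengagements per 1,000 miles with a rough 95% CI.

    Treats disengagements as a Poisson count; the interval uses the
    normal approximation, which is reasonable for large counts.
    """
    rate = disengagements / miles * 1000
    se = math.sqrt(disengagements) / miles * 1000
    return rate, (rate - 1.96 * se, rate + 1.96 * se)

# Hypothetical fleet numbers (not real Tesla or Waymo data):
rate, (lo, hi) = disengagement_rate(120, 1_500_000)
print(f"{rate:.3f} per 1k miles (95% CI {lo:.3f}-{hi:.3f})")
```

The catch the posters are circling is that rare fatal events need enormous mileage before the interval gets tight enough to say anything, which is why "millions of miles" is the unit of account.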
 
Sadly, every member of the public is already part of the test.
That Tesla approaching: will the driver be there mentally? Some pretty crazy accidents "happen to" Tesla drivers. Who's volunteering to be in a small car rather than a semi when making the same crossing?
EAP being safer in corner cases, on cherry-picked road sections and conditions, covers up the vast effect AP has had on driving standards since it was enabled on public roads.
The social contract between traffic participants has been thoroughly broken. You pay attention. The car that cut you off accidentally was paying better attention than, clearly, a much too large share of Tesla drivers. And every single one of the new cars now gets EAP; it's becoming standard. Even people who didn't really want it will now be trying it.
 
Yeah, even on limited-access freeways you can still have edge cases. I was in the right-hand lane once on AP, going 70 on a limited-access freeway, when an RV that was pulled over on the shoulder decided to pull out in front of me at around 5-10 mph. It happened far enough ahead that I had time to react, so I took over when AP did not seem to be responding. I don't know if AP would have seen the RV at the last second, but I wasn't about to find out the hard way.
 
Chances are, it wouldn't have noticed the RV by itself. Isn't it supposed to be quicker than the human brain... WHEN it works?
Semis and RVs are 100% transparent to Tesla AP cams.
Now that Tesla is pooh-poohing LIDAR, I have to wonder: would LIDAR miss such an event? Most of the horizon taken up by a vehicle straight in your path, a well-documented weakness of AP, but zero sign of Tesla having fixed or even addressed it.
One Tesla kills its driver, who wasn't paying attention. The next day another Tesla drives there, enables EAP, and just does it AGAIN. Where is the machine learning, and where are Tesla's AP engineers themselves in all of this? Just playing the cherry-picked safety data game, not too interested in fixing bugs or geofencing proven deadly road situations?
In no other sector would an equivalent accident warrant just doing nothing and waiting for it to happen again.
Say an Airbus crashes on a landing strip that's a bit different from most, likely some software error combined with pilot error. Nothing to see here, it's likely never going to happen again... Can you imagine that?
 
Sad. Hopefully HW3 can at least detect the semi and phantom brake to alert the driver.
Phantom brake?? That's an annoyance that's already happening too much. Why look up from your phone for that anymore?
AP will brake fully when it predicts accidents ahead that a driver would never be able to get out of. But for a semi square across the highway, you propose some dab of phantom braking to make the driver decide whether they want to do something about it? This company promised FSD software with delivery dates that are now IN THE PAST!
 
It's a horrible tragedy.

It sounds like the convergence of bad factors and bad timing. The driver, seeing a nice, well-marked state road with a clear path ahead, light traffic, and good weather, all ideal conditions for AP, engages AP and takes his eyes off the road for a few seconds. Unfortunately, the timing was horrible, because a semi just happened to cross in front of him at that exact moment. And the scenario of a semi truck crossing in front of you is one of the rare cases that AP cannot handle.

This will obviously be something that FSD will be able to handle. Once the front side cameras become active, FSD will be able to better track cross traffic and slow down preemptively before vehicles cross in front of you.
 
And now Tesla has this “edge case” to program into the neural net so it should never happen again. :rolleyes:
 

Semis and RVs are not categorically "100% transparent to Tesla AP cams". And yes, the computer will be quicker to react than a human when it gets to the point of actually reacting. The problem with saying that the computer is slower is that a human is generally anticipating all kinds of things and pre-reacting to situations, whether they need to or not.
 
I had the "Full Self Driving" trial for the last few weeks, and I don't trust it above 25 mph or outside stop-and-go traffic. On a straight 2-lane undivided highway I had it engaged at ~55 mph when we came up on a tractor that was half on the road and half on the shoulder. It didn't see the tractor, and I had to take over at the last moment, putting two wheels slightly over the yellow line. That's when it finally freaked out about the oncoming car (which was accommodating me by moving over) and auto-braked. What it should have done is slow down behind the tractor. I was never in any danger, because I was ready to take over and wanted to see how it handled the situation, but the answer is: it failed. Stop calling it FSD; it's adaptive cruise control with lane keeping and some gimmicks that work less well than just doing it yourself.
 

Congratulations on using the system in a way that is expressly warned about in the manual!

It is FSD as in the FSD option...it is NOT Level 3/4/5 autonomous driving yet.
 
Now that Tesla is pooh-poohing LIDAR, I have to wonder: would LIDAR miss such an event?
No it wouldn't.
Most of the horizon taken up by a vehicle straight in your path, a well-documented weakness of AP, but zero sign of Tesla having fixed or even addressed it.
The problem is that computer vision isn't "there yet". The systems we have today are basically limited to spotting objects and structures in the image that they have been trained to recognize by their features. If it encounters something it hasn't been properly trained to recognize, or its features are obscured to the cameras (e.g. by lack of visual contrast between a white semi trailer against a white sky) it will not recognize it as an object and will not react to it. There is some early work on doing full 3D mapping of the environment based on recognized edges and surfaces, but it is far from mature at this point.
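The failure mode described above (an untrained or low-contrast object simply not registering) comes down to confidence thresholding: a detector emits candidate objects with scores, and anything below the cutoff is treated as empty road. A minimal Python sketch; the `Detection` class, labels, and scores are invented for illustration and are not Tesla's actual stack.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float  # 0.0-1.0 score from the vision network

# Labels this toy planner treats as obstacles worth braking for.
OBSTACLE_LABELS = {"car", "truck", "trailer", "pedestrian"}

def should_brake(detections: list[Detection], threshold: float = 0.5) -> bool:
    """Brake only if some obstacle clears the confidence threshold.

    A low-contrast object (white trailer against a bright sky) can score
    below the threshold and is then treated exactly like empty road.
    """
    return any(d.label in OBSTACLE_LABELS and d.confidence >= threshold
               for d in detections)

# Hypothetical frame: the trailer is barely "seen", so no braking occurs.
frame = [Detection("trailer", 0.31), Detection("lane_line", 0.97)]
print(should_brake(frame))  # False: 0.31 is below the 0.5 cutoff
```

Lowering the threshold catches more trailers but also triggers more of the phantom braking complained about earlier in the thread; that trade-off is the whole tuning problem.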
 
And now Tesla has this “edge case” to program into the neural net so it should never happen again. :rolleyes:

Is this the NN that's learning "exponentially" and will be "waaaaay better" than the current one? /s

It is a tragedy, and extremely likely that at the point he engaged it, he took his eyes off the road for some reason. Heaven knows I've done that briefly to get water or change the music/podcast, although 8 seconds is a very long time to not be paying attention at that speed.

Technology will improve and these tragic cases will become rarer, but the ongoing lesson is, pay attention all of the time.
 
If one Tesla failing to recognize a semi blocking the highway as an object killed its driver once...
How do we get to a reality of a carbon-copy repeat years later?
Perhaps Tesla is losing engineers also because they feel that the incompetence of management, and of themselves, leads to more avoidable deaths. I'm not sure I could live with my job on AP when this happened again: a semi crossing the highway was still too complicated to trigger a braking event. There seems to be a form of arrogance, or denial of accountability, going on that deeply disregards common sense and human life itself.
 
Technology will improve and these tragic cases will become rarer, but the ongoing lesson is, pay attention all of the time.
This cannot happen with proper vision, proper action, and the management to facilitate them. None have been proven thus far.
On the contrary. Case in point: refusing to monitor the driver's attention to the road. Happy to turn the cars into Big Brother mobiles, but unwilling to make sure the driver is paying attention, when it's been well documented that AP turns sane people into lunatics who drive a heavy car blindfolded on busy roads.

What we have here is a driving aid that deals with some instances but still lets you kill yourself if you look away at the wrong moment. And the makers don't seem too concerned about it recurring. The skewed statistics "prove" that overall it's slightly safer, right? Right? There, then.
Accountability denied.
 
In my view there are two fundamental limitations given the current state of the art:

- Limitations of the sensors (cameras and radar)

- The fact that training large neural networks is a bit of an art form rather than an exact science. If a NN misbehaves, you can't just go in there and fix it, since no human understands exactly how millions of neurons and connections transform the input to get the result. You're basically limited to trying to re-train the NN using additional training data. But if you're not careful, you might make it worse in other ways. Computer vision in complex environments is just a really hard problem.

If anything is to blame here, it's not so much the engineers but rather the hubris of upper management ...
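The retraining risk described above, fixing one failure while quietly regressing elsewhere, is why per-scenario evaluation matters: score the model on buckets of labeled cases before and after retraining and flag any bucket that got worse. A toy sketch with made-up accuracy numbers; the scenario names and tolerance are assumptions, not any real test suite.

```python
def regression_report(before: dict[str, float], after: dict[str, float],
                      tolerance: float = 0.01) -> list[str]:
    """Return scenario buckets whose accuracy dropped by more than tolerance."""
    return [name for name in before
            if after.get(name, 0.0) < before[name] - tolerance]

# Hypothetical per-scenario accuracies before/after retraining:
before = {"crossing_semi": 0.62, "stopped_firetruck": 0.81, "cut_in": 0.95}
after  = {"crossing_semi": 0.90, "stopped_firetruck": 0.78, "cut_in": 0.95}
print(regression_report(before, after))  # the firetruck bucket regressed
```

The point is that "retrain on more crossing-semi clips" is not a local fix: without a check like this, the improvement on one bucket can mask a new regression on another.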
 