
Another tragic fatality with a semi in Florida. This time a Model 3

And now Tesla has this “edge case” to program into the neural net so it should never happen again. :rolleyes:

In the strictest sense, yes, it is an edge case that FSD will need to be able to handle. But I would never be so callous. A person's death is never just another "edge case" to solve.

I do suspect that one big reason Tesla is so committed to reaching L5 autonomy as quickly as possible is that FSD is the only way to truly solve these edge cases on the current hardware. The fact is that the current AP, with its limited use of cameras and limited NN, is never going to be able to handle this sort of case. The only way to guarantee that you have solved a scenario like this is with all cameras active, including the front side cameras, and the full NN that can detect and predict all vehicles and objects. Doing so requires that Tesla finish their work on FSD. The sooner Tesla finishes FSD, the sooner they can make the cars much safer and prevent these accidents from ever happening again.

I had the "Full Self Driving" trial the last few weeks and I don't trust it over 25mph or stop and go traffic. On a straight 2 lane undivided highway I had it engaged going ~55mph when we came up to a tractor that was half on the road and half on the shoulder. It didn't see the tractor and I had to take over at the last moment putting 2 wheels over the yellow line slightly. That's when it finally freaked out about the oncoming car (that was accommodating me by moving over) and auto-braked. What it should have done is slow down behind the tractor. I was never in any danger because I was ready to take over and wanted to see how it handled the situation, but the answer is it failed. Stop calling it FSD, it's adaptive cruise control with lane keep and some gimmicks that work less well than just doing it yourself.

Autopilot is not designed to handle that situation yet. Once Tesla releases the full FSD and says that Autopilot is L5, then yes, it will be able to handle that situation.
 
In the strictest sense, yes, it is an edge case that FSD will need to be able to handle. But I would never be so callous. A person's death is never just another "edge case" to solve.
I just hope the publicity these incidents generate will make people realize that EAP is NOT "Full Self Driving" on the highway as Elon Musk claims.
 
The only surprise to me here is that it had been engaged for just 10 seconds. You would think he would still be paying attention. I think most people here assumed it was Autopilot, as it was hard to explain any other way.

Ditto on the hands-on-the-wheel data - it seems silly to quote it as such, since it is a torque sensor, not a touch sensor. He may have had a hand on the wheel and simply chosen to divert his attention away from the road. (A sketch after this post illustrates the distinction.)

Very sad to see, but definitely no surprises. Autopilot is not capable of dealing with this situation. Maybe FSD. We’ll see I guess. Seems hard.
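
To make the torque-sensor point concrete, here's a toy sketch. The threshold, function name, and scenarios are all my own invention, nothing from Tesla's actual firmware; it just shows why a torque reading can't confirm attention:

[CODE]
# Toy illustration only -- the threshold and logic are assumptions,
# not Tesla's actual implementation.

HANDS_ON_TORQUE_NM = 0.3  # hypothetical minimum steering torque

def hands_on_wheel(measured_torque_nm: float) -> bool:
    """A torque sensor only measures force applied to the wheel rim.

    It cannot tell an attentive driver from a resting hand or a
    defeat-device weight, and it says nothing about where the
    driver's eyes are pointed.
    """
    return abs(measured_torque_nm) >= HANDS_ON_TORQUE_NM

# All three scenarios produce the exact same "hands on" signal:
for scenario, torque_nm in [
    ("attentive driver steering", 0.6),
    ("hand resting while texting", 0.4),
    ("weight strapped to the wheel", 0.5),
]:
    print(f"{scenario}: hands_on={hands_on_wheel(torque_nm)}")
[/CODE]

So the "hands on the wheel" figure quoted in these reports can be genuinely present while the driver's eyes are anywhere but the road.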
 
This cannot happen with proper vision and action and the management to facilitate it. None have been proven thus far.
On the contrary. Case in point: refusing to monitor the driver's attention to the road. Happy to turn the cars into big-brother mobiles, but unwilling to make sure the driver is paying attention, when it's been well documented that AP turns sane people into lunatics who drive a heavy car blindfolded on busy roads.

What we have here is a driving aid that deals with some situations but still lets you kill yourself if you look away at the wrong moment. And the makers don't seem too concerned about it recurring. The skewed statistics "prove" that overall it's slightly safer, right? Right? There, then.
Accountability denied.

Really? And skewed statistics, too.

Tesla says you are to drive with your hand(s) on the wheel and be paying attention out the front window. Not all people turn into "lunatics" "driving blindfolded," despite your "well documented" documentation. I have driven many miles on AP and never had a problem. If Tesla did put eyeball or retina cameras in the car, people would find a way to foil them, just like they put heavy weights on the steering wheel to foil the steering sensors.

This cannot happen if the driver is alert, hands on the wheel, watching out the front window, and NO sensors or software can stop someone determined to kill themselves. I am convinced that the Darwin principle is real.

Accountability sufficient.
 
We haven't heard about radar for a while. It does seem to see cars ahead of cars pretty well. But as an emergency braking system for big things, it doesn't seem to have improved as much as hoped.

I have never seen any evidence of it seeing ahead of cars. That famous video from Germany or wherever probably picked up on the sudden slowing of the vehicle directly in front (which then swerved and crashed, as I recall?). Hard to know, but it's super easy to trigger that alarm if it is tracking a leading vehicle - the lead car doesn't even need to slow down much - it seems to be the rate of change of velocity of the tracked vehicle that triggers it, not just closing speed.
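
If my reading is right, the trigger would behave something like this toy sketch. The threshold and structure are pure speculation on my part, not Tesla's actual logic:

[CODE]
# Speculative sketch of a deceleration-triggered forward collision
# warning; the threshold is invented for illustration.

DECEL_THRESHOLD_MPS2 = 4.0  # hypothetical "hard braking" cutoff

def fcw_should_fire(lead_speeds_mps: list[float], dt_s: float) -> bool:
    """Fire when the tracked lead vehicle decelerates sharply.

    Note what this ignores: an obstacle that was never tracked as a
    moving vehicle (e.g. a stationary trailer) produces no
    deceleration signal at all, so logic like this never fires for it.
    """
    if len(lead_speeds_mps) < 2:
        return False
    decel_mps2 = (lead_speeds_mps[-2] - lead_speeds_mps[-1]) / dt_s
    return decel_mps2 >= DECEL_THRESHOLD_MPS2

# Lead car drops from 25 m/s to 22 m/s in half a second (6 m/s^2):
print(fcw_should_fire([25.0, 22.0], dt_s=0.5))  # True
# Lead car holding constant speed: no trigger, whatever the gap.
print(fcw_should_fire([20.0, 20.0], dt_s=0.5))  # False
[/CODE]

Which would also explain why it stays silent for anything it never tracked as a vehicle in the first place.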
 
We haven't heard about radar for a while. It does seem to see cars ahead of cars pretty well. But as an emergency braking system for big things, it doesn't seem to have improved as much as hoped.
The problem with conventional radar is that it isn't accurate enough to differentiate obstacles from background objects. It just sees a large "blob" that could be a semi trailer or just an overpass. Automotive radar applications avoid this problem by ignoring anything that doesn't seem to move along with the car (the relative speed can be detected by Doppler shifts of the reflected radar signal). So essentially radar is good at detecting other cars, but not obstacles that are stationary or moving perpendicularly in front of the car.
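
A rough sketch of that filtering step, under the assumptions above. The values, tolerance, and structure are purely illustrative, not from any real automotive radar stack:

[CODE]
# Illustrative Doppler clutter filter -- all values and tolerances
# are assumptions, not from an actual radar system.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float
    radial_speed_mps: float  # from Doppler shift; negative = approaching

STATIONARY_TOL_MPS = 1.0  # hypothetical tolerance

def moving_targets(returns: list[RadarReturn],
                   ego_speed_mps: float) -> list[RadarReturn]:
    """Keep only returns that move relative to the ground.

    A stationary object closes at exactly -ego_speed, so it is
    discarded as background clutter -- and a trailer crossing
    perpendicularly has near-zero radial motion of its own, so it
    looks stationary too.
    """
    return [r for r in returns
            if abs(r.radial_speed_mps + ego_speed_mps) > STATIONARY_TOL_MPS]

ego = 29.0  # ~65 mph
scene = [
    RadarReturn(120.0, -29.0),  # overpass: pure ego motion -> dropped
    RadarReturn(60.0, -29.2),   # trailer across the lane -> also dropped!
    RadarReturn(40.0, -5.0),    # slower car ahead -> kept
]
print(moving_targets(scene, ego))  # only the slower car survives
[/CODE]

The crossing trailer gets discarded right along with the overpass, which is exactly the scenario in this thread.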
 
I just hope the publicity these incidents generate will make people realize that EAP is NOT "Full Self Driving" on the highway as Elon Musk claims.

Agreed. The other likely consequence is that these accidents may make the public a lot more skeptical when Tesla does declare that FSD is L5. Although, presumably, Tesla will roll FSD out at first with a disclaimer that it requires driver attention. This will let Tesla continue to fine-tune and improve the system, while also giving Tesla owners time to experience it, so that when it does reach L5 they will know they can indeed trust it.
 
That has not been rolled out yet; only the hardware is shipping, and right now it's running the older software.

It was an inside joke referring to another thread - I don't think any of us are qualified to speculate whether AP3, FSD or future versions of the software would deal with this scenario any differently, given it's not clear why this happened. We can only hope it would.
 
It was an inside joke referring to another thread - I don't think any of us are qualified to speculate whether AP3, FSD or future versions of the software would deal with this scenario any differently, given it's not clear why this happened. We can only hope it would.
From the debug video in the investor presentation, it does appear that trucks can be identified.
 
The only surprise to me here is that it had been engaged for just 10 seconds. You would think he would still be paying attention. I think most people here assumed it was Autopilot, as it was hard to explain any other way.

Ditto on the hands-on-the-wheel data - it seems silly to quote it as such, since it is a torque sensor, not a touch sensor. He may have had a hand on the wheel and simply chosen to divert his attention away from the road.

Very sad to see, but definitely no surprises. Autopilot is not capable of dealing with this situation. Maybe FSD. We’ll see I guess. Seems hard.
I'm guessing he engaged Autopilot just so he could take his eyes off the road to text or something else he shouldn't be doing on AP. Bad timing, but not Autopilot's fault. It will kill you if you let it.
 
I have never seen any evidence of it seeing ahead of cars. That famous video from Germany or wherever probably picked up on the sudden slowing of the vehicle directly in front (which then swerved and crashed, as I recall?). Hard to know, but it's super easy to trigger that alarm if it is tracking a leading vehicle.
To me those videos seem real enough. The radar is aimed really low, close against the road surface.
These cars have supposedly been FSD-capable for years already, yet they cannot see a 4-meter-tall semi with 70 cm of daylight under it. And LIDAR is the dead end here?
Are people sitting in RoboTaxis who die because semis cross their path just part of the overwhelmingly great statistics for self-driving safety? Tesla makes progress with the things they have a handle on, but seems to make none with the issues reality has identified for them. Casualties. It's like they care about safety as a statistic, not as a way of life. They found their algorithm to paint it as safer now. Why change? FSD is about to somehow be accomplished!

I've seen more proof of AP seeing unavoidable accidents before they happen, via a radar bounce under the car ahead, than of ANY machine learning - or of engineers fixing deadly corner cases, or taking them out of the equation until they're confident of a fix. Drive the same road again, and the same lane divider will be the last thing a Tesla front-facing camera records but doesn't "see".

I'm a bit appalled that Tesla drivers seem to be OK with any share of fellow owners being lunatics who take risks with other people's lives as well as their own. And Tesla saying you should pay attention and hold the wheel (*wink, wink*) is somehow all the due diligence the world could ever expect from the company claiming to be in the business of making the safest cars. How did we arrive at a world like this one? I'm 42, but I feel old seeing how people throw common sense out the window when they see a brand they've forged an emotional connection to.
 
I'm guessing he engaged Autopilot just so he could take his eyes off the road to text or something else he shouldn't be doing on AP. Bad timing, but not Autopilot's fault. It will kill you if you let it.
Let's not forget that he was in a 2-ton car. It's an unguided missile on a busy public road. If you can pick the wrong moment to do some texting, you can pick the moment a minibus of kids without seatbelts gets in your way instead of a semi or lane divider. Had the driver lived, he should have been tried for a profoundly serious road crime. This goes beyond Darwin Awards. And Tesla allowing it to happen, hiding behind a notice on the screen, from the safest carmaker on the planet, is a slap in the face of anyone in the triple-digit-IQ club.
 
I have never seen any evidence of it seeing ahead of cars. That famous video from Germany or wherever probably picked up on the sudden slowing of the vehicle directly in front (which then swerved and crashed, as I recall?). Hard to know, but it's super easy to trigger that alarm if it is tracking a leading vehicle - the lead car doesn't even need to slow down much - it seems to be the rate of change of velocity of the tracked vehicle that triggers it, not just closing speed.

I have to disagree with this; my collision detection has screeched at me when a car two or three ahead of me has suddenly slowed. I never tailgate and usually hang back at least 6-8 car lengths, but I can clearly see on the screen that the system is picking up three or four cars ahead comfortably, and it definitely knows if one of them slows suddenly.
 