Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Another tragic fatality with a semi in Florida. This time a Model 3

Status
Not open for further replies.
For Doppler, it is accurate. There is no difference in the rate of change of range for a perpendicularly moving vehicle compared with the scene as a whole. So the truck is stationary in the same sense the world is stationary; both have the same rate. The radar filters out the returns where speed equals vehicle speed so that the scene (road, signs, guardrails) does not generate objects to track.
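To make that concrete, here is a toy Python sketch (made-up numbers and thresholds, not Tesla's actual code) of why a vehicle crossing at 90° is Doppler-indistinguishable from the static scene and gets filtered out along with it:

```python
import math

def radial_closing_speed(ego_speed, target_speed, target_heading_deg):
    """Closing speed the radar measures for a target dead ahead.

    target_heading_deg is the direction of the target's own motion
    relative to the radar's line of sight: 0 = coming straight at us,
    90 = crossing our path perpendicularly.
    """
    # Only motion along the line of sight shifts the Doppler return;
    # perpendicular motion contributes cos(90 deg) ~= 0.
    return ego_speed + target_speed * math.cos(math.radians(target_heading_deg))

EGO = 31.0  # m/s, roughly 70 mph (hypothetical)

guardrail = radial_closing_speed(EGO, 0.0, 0.0)   # truly static scenery
crossing  = radial_closing_speed(EGO, 5.0, 90.0)  # semi crossing at 90 deg
oncoming  = radial_closing_speed(EGO, 5.0, 0.0)   # vehicle heading at us

def is_moving_target(closing_speed, ego_speed, tol=0.5):
    """The filter described above: drop returns whose closing speed
    matches the ego speed, i.e. anything that looks like background."""
    return abs(closing_speed - ego_speed) > tol

print(is_moving_target(guardrail, EGO))  # False: filtered out as scenery
print(is_moving_target(crossing, EGO))   # False: the crossing semi, too
print(is_moving_target(oncoming, EGO))   # True: survives the filter
```

The crossing truck and the guardrail produce the same closing speed, so the filter cannot tell them apart.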
Hmmmm I believe we are thinking the same thing but I have to read that statement a bunch of times for it to sink in. :)
 
Nothing to defend; it is an assist system, not a driver replacement system. If I take my hands off the wheel of my truck, I will likely crash in under 8 seconds regardless of whether there is a semi across the road.
Tell that to the guy who was recently seen on "60 Minutes" driving with Autopilot like this:

9Lzgumb.jpg
 
It's so strange. He drove a Prius for years, he was so cautious, and he had recently remarried, evidently. One of the last things I remember was him making fun of the large-body 35mm camera I used to take pictures years ago at his home, at a birthday we were invited over for.

This is an important point, and here we have a personal view of the character and nature of the individual behind the wheel of the crashed Tesla. Over a very narrow timeframe (say 10 seconds), human behavior often has very little to do with one's level of intelligence, schooling, number of loved ones, or previously demonstrated level of care and attention.

If we tell a person, "You will die if you continue to do X," in time the message may fade, and they'll risk repeating the negative behavior. These poor choices are just part of the reality of what it means to be human.

So if you provide a new maze for a test subject to run through that has safe routes so long as the subject does X, you can be sure that at some point they'll do Y instead. It is difficult to argue that, because they didn't do X, the responsibility for their harm lies squarely with the test subject. The fact that you allowed them to do Y means that you share in some of that responsibility, and I think this is the inherent problem.

I would prefer to see a shared-responsibility model, because then the manufacturers will be far more careful, since they'll be on the hook for what happens. The implication, of course, is that the speed to deliver product may be impacted, but the current system, while being developed quickly, is being developed at the cost of putting people's lives at risk.

These two modes of developing autonomous driving systems are divergent paths, and indeed we have the entire industry on one side and Tesla on the other. I've mentioned this earlier in the thread, but it's probably worth mentioning again. Tesla, in a way, is in a race against time. They are betting that they'll be able to collect enough data and develop an autonomous program that will, in the end, be safer than human drivers before regulation is imposed on them that might prohibit such a program.

It is a terrible thing when someone dies, and it is the ultimate arbiter of whether or not things need to change. Here is the underlying point: it's never a one-way street, and even if the driver bore most of the responsibility, a share of it lies squarely with Tesla as well.
 

I'm not sure how pitching the current Autopilot would be a wise PR move, especially when it was just implicated in another driver decapitation. Even if there is no legal culpability for Tesla, a Tesla driver lost his life because he put too much trust in AP.

Advertising AP, in this context, would be viewed as something done in extremely poor taste, at best.
And homicidal, at worst.

For Doppler, it is accurate. There is no difference in the rate of change of range for a perpendicularly moving vehicle compared with the scene as a whole. So the truck is stationary in the same sense the world is stationary; both have the same rate. The radar filters out the returns where speed equals vehicle speed so that the scene (road, signs, guardrails) does not generate objects to track.
I think I said that already :)
 
You are putting too much emphasis on that word "traveling". That's not the way I interpret it.

That second statement is not correct. I think you might be forgetting Albert's old saying: "It's all relative." A stationary object in front of the car (a truck in this case) is not perceived by the "sensors" to be stationary, because the Tesla is moving. It is only perceived to be stationary if both the Tesla and the truck are stationary. :) To say it another way, it is somewhat similar to the Tesla not moving while the truck comes toward the car.

1. I'm convinced Tesla's legal department insisted on the placement of the word "traveling" in that precise spot for the excellent reason of evading liability when at highway speed the AP product crams itself + inattentive user all up in a parked Fire-Truck or similar massive obstacle, as has already all too frequently happened.

2. If the suit of Walter Huang's survivors makes it into court, we shall likely hear Tesla seek to rely heavily on this subtle phrasing: as the gore-point he hit was not travelling, no sensors were designed to detect it, hence it was all his own fault and no liability for contributory negligence attaches to Tesla. Which would be a pretty embarrassing tack for them to air in public, so they will undoubtedly attempt to settle beforehand.

3. I'm beginning to suspect Musk's attitude here is (perhaps necessarily) encapsulated by the cynical but memorable Fight Club maxim:

"A new car built by my company leaves somewhere traveling at 75 mph. Under Autopilot it locks onto the wrong lane lines into a gore-point. The car crashes and burns with everyone trapped inside. Now, should we initiate a recall? Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don't do one. [i.e. it works out cheaper to let people die than to fix things]"

In his defence he probably always somewhat naïvely thought vision would be "solved" long before this became a major problem, i.e. X > cost of recall, and now believes HW3 will soon cure all that ails AP.

4. The statement "For the purposes of AP detecting a slow-moving perpendicular truck up ahead, it is effectively a stationary object" is correct: Doppler radar picks up things moving relative to the stationary background, hence a parked Tesla's radar will detect a truck moving towards it from the front, but not vice versa, and other OEMs have the same problem. The Tesla manual specifically warns that at highway speeds it may not brake for stationary objects; see my signature for one personal experience. If it were otherwise, they would not only change the manual but surely shout it from the PR rooftops with a video demo of this amazing safety breakthrough. I expect that to happen within the next year via the development of software for pseudo-LiDAR 3D depth mapping from vision on HW3/FSD.

5. Large objects (e.g. a semi-trailer) crossing at 90° are, from the POV of the approaching car, effectively not moving relative to the background, so they are not detected by the radar; hence the AP-related deaths of Jeremy Banner and Josh Brown, in both of which cases the system made no attempt to brake.

6. It is better to err on the side of caution than to be surprised in the crucial instant and suddenly appended to the list of unfortunate statistics.
 
Having said that, the most important thing to take away from this is to make it clearer to everyone how radar works, or more precisely how it doesn't work with stationary objects.

I fully agree, and the most obvious candidate to deliver this necessary PSA would be Tesla itself, though of course it is conflicted by the need to sell the same system in ever increasing numbers.

Failing that, we owners should do it for the benefit of new customers who place themselves and others at risk through over-optimistic evaluations of AP's abilities, allowing themselves a lapse in attention for those few seconds which unfortunately turn out to be the crucial juncture at which the perception gap gets exposed and immediately resuming full control becomes imperative.

This frailty of human nature can befall anyone, even careful and experienced drivers, which from the testimony above it seems Mr Banner was. Thus it is a cautionary tale of which it bears reminding oneself and others regularly.


I'd imagine a new sensor of some sort must be found for stationary obstacle detection. Surely they can't hope to rely on image recognition only.

As Karpathy indicated at the recent Autonomy Day presentation, Tesla aims to produce a 3D image of the car's surroundings by mapping depth info onto each incoming pixel from the HDR cameras in real-time, by exploiting advanced algorithms on the greatly increased compute resources available on HW3. HW2.x was too tapped-out for this task.

This is known as pseudo-LiDAR from vision and should take care of the existing perception gap, i.e. doppler radar only detecting moving vehicles and AP not braking for stationary objects from >50mph.
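To illustrate what "mapping depth info onto each incoming pixel" means mechanically, here is a minimal back-projection sketch using a standard pinhole camera model. This is the textbook operation behind pseudo-LiDAR, not Tesla's actual pipeline, and the intrinsics (fx, fy, cx, cy) are hypothetical:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a per-pixel depth map (metres) into camera-frame
    3D points: the core step of any pseudo-LiDAR-from-vision approach."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    x = (u - cx) * depth / fx   # lateral offset scales with depth
    y = (v - cy) * depth / fy   # vertical offset scales with depth
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Toy 2x2 "depth map": every pixel estimated to be 10 m away.
cloud = depth_to_point_cloud(np.full((2, 2), 10.0), fx=500, fy=500, cx=0.5, cy=0.5)
print(cloud.shape)  # (4, 3): one 3D point per pixel
```

Once every pixel has a depth estimate, the result can be fed to the same kinds of 3D object detectors that normally consume LiDAR point clouds, which is the point of the papers below.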

Certainly there is no way to achieve a safe L5 Tesla RoboTaxi system without having this feature securely implemented.

Here are some relevant papers on the types of thing they should currently be working on:

1. Depth from Videos in the Wild: Unsupervised Monocular Depth Learning from Unknown Cameras
Depth from Videos in the Wild: Unsupervised Monocular Depth...

2. Pseudo-LiDAR from Visual Depth Estimation: Bridging the Gap in 3D Object Detection for Autonomous Driving
Pseudo-LiDAR from Visual Depth Estimation: Bridging the Gap in 3D...

3. Night Vision with Neural Networks
Learning to See in the Dark
[ see demo video ]

This does not detract from the fact that Waymo and others make a very safe AV system in reliance on LiDAR, but it goes to illustrate that Tesla has IMHO now revealed a convincingly feasible technological path to the same L5 goal at probably much lower unit expense. Only time can tell which path pays off first and/or more handsomely, but I think both will probably do very well at leaving the established auto OEMs in the dust.

It is very exciting what can now be achieved with just HDR camera inputs and maximal data-crunching.

Even though Tesla will likely not manage a feature-complete (and totally safe, like 20x better than an expert human driver) L5 FSD candidate system by end of 2020, as Musk aims for, I expect it should still bring an impressive highway-L3 candidate by mid-2020, which would be an extremely welcome advance in safety, although it will retain the existing L2 hands-on-steering checks until further approvals are granted.
 
Tell that to the guy who was recently seen on "60 Minutes" driving with Autopilot like this:

9Lzgumb.jpg
And somehow we're made to believe that this guy is good for the company. He does, on TV, what got his customers killed. Countless vloggers are copying it. The disclaimer on the screen is for suckers to read. Even if they state you shouldn't do it, they do it themselves and get hundreds of thousands of views and some 10% in thumbs-up.
This guy wants to make the system remove a certain kind of accident to make it safer on paper, but for now all he's accomplished is a cult following of people who try to activate AP all the time. And AP keeps trying, often backing out of a move at the last second.
 
In much of the discussion here, I don't get the accusing tone. Of course fatalities are terrible. But the world isn't friendly. The evidence is overwhelming: we all die, no exceptions.

Now we're pushing 2-ton machines on narrow strips of road at speeds at which it's often impossible to react to danger in time, let alone stop on a dime. Even with the fastest eyes and minds and reflexes, with the best machines and the best brakes. Accidents are inevitable. Period.

I see so much blaming. Blaming Elon, or the company, or the choice of design, or whatever, and it's got a crazy ring to it that implies that if we just do as the poster says, the world will conform and we'll all live happy forever.

Western countries have so-called legislatures that spend a lot of time creating laws and regulations, thousands each year, with the implied objective of attaining perfection. Every publicized tragedy cranks up the legalizing machinery to add to the patchwork of prohibitions, restrictions, mandatory this, forbidden that.

Take it further and we wouldn't develop new medicine, we wouldn't go into space, we wouldn't have mapped the earth or ventured into unknown territories. We'd sit by our cuckoo clocks, and even in the safest, most restricted and most defensive setting, we'd still die. I'd rather vote for taking chances and aiming for the stars.

I'm not a utopian Libertarian, but there's a lot to be said for the line about "The ultimate result of shielding men from the effects of folly is to fill the world with fools."
 

I generally agree with your attitude, but there is no doubt Tesla draws legitimate flak by overselling the dream of FSD while failing to clearly illustrate the current weaknesses of AP, then seeking to pin sole responsibility for the ensuing crashes on their clients, the equivalent of snorting "No, you're holding it wrong!", an ad-hoc strategery likely to come a legal cropper.


Perhaps the most honest thing Tesla can do at this point is recall all their cars to retrofit a sharpened steel spike in the steering wheel centre, pointed directly at the driver's chest, in order to remind you frivolous people to pay attention, or else.
This safety system will best be known as "The Milligan".
 
The statement "For the purposes of AP detecting a slow-moving perpendicular truck up ahead, it is effectively a stationary object" is correct
Large objects (e.g. semi-trailer) crossing at 90° are, from POV of approaching car, effectively not moving relative to background so are not detected by the radar hence the AP-related deaths

The radar used by Tesla gives you a speed reading on the object with a flag. The flag says whether the object is static, moving toward you, moving away from you, or moving sideways. I have plenty of examples to demonstrate that.

Also, I would suspect that the semi was not moving super slowly to get into the car's path like that in 8 seconds. In fact, probably quite quickly.
 
The radar used by Tesla gives you a speed reading on the object with a flag. The flag says whether the object is static, moving toward you, moving away from you, or moving sideways. I have plenty of examples to demonstrate that.

Also, I would suspect that the semi was not moving super slowly to get into the car's path like that in 8 seconds. In fact, probably quite quickly.

Does the sideways flag get set for objects without a towards or away flag?
Even if so, lateral speed doesn't help Doppler detection, and a 40-foot trailer has the same return for a portion of the time it is in front of the vehicle. So the ends might be able to generate sideways data, but the middle cannot (even with lidar). Vision, however, could (track side lights and those colored stripes).
 
The radar used by Tesla gives you a speed reading on the object with a flag. The flag says whether the object is static, moving toward you, moving away from you, or moving sideways. I have plenty of examples to demonstrate that.

Also, I would suspect that the semi was not moving super slowly to get into the car's path like that in 8 seconds. In fact, probably quite quickly.

Do you have a Tesla speed reading on a semi-trailer slowly crossing the path at 90° as you approach at 70mph? That I would like to see, but it seems to be a tricky picture to snap and survive.

PS: I have seen your video where your car is stationary at a crossing and a semi passing at 90° gets assigned a bounding box marked "Truck", which is very interesting but may not be the case at high closing speed.
 
Does the sideways flag get set for objects without a towards or away flag?
The flag is basically one value out of: stopped, stationary, moving, oncoming, leftward, rightward.

It's pretty clear at least some scan-to-scan aggregation is happening to arrive at those.
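A guess at how such a flag could be derived, combining the Doppler radial speed with scan-to-scan lateral displacement. The labels match the list above, but the logic and thresholds are invented for illustration, not taken from the actual radar firmware:

```python
def classify_return(closing_speed, ego_speed, lateral_delta, tol=0.5):
    """Illustrative reconstruction of a flag like the one described.

    Doppler gives the radial closing speed directly; leftward/rightward
    has to come from comparing a return's position across successive
    scans. Thresholds here are made up.
    """
    radial_rel_ground = closing_speed - ego_speed  # 0 => radially static
    if radial_rel_ground > tol:
        return "oncoming"       # closing faster than the scenery
    if radial_rel_ground < -tol:
        return "moving"         # receding / moving with traffic
    if lateral_delta > tol:
        return "rightward"      # position drifting right between scans
    if lateral_delta < -tol:
        return "leftward"
    return "stationary"

print(classify_return(31.0, 31.0, 0.0))   # stationary (sign, guardrail)
print(classify_return(31.0, 31.0, 1.2))   # rightward (crossing trailer's edge)
print(classify_return(36.0, 31.0, 0.0))   # oncoming
```

Note this matches the trailer problem discussed earlier: the uniform middle of a crossing trailer produces no usable lateral delta, so it classifies as stationary even though the object is moving.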

Do you have a Tesla speed reading on a semi-trailer slowly crossing the path at 90° as you are approaching at 70mph? That I would like to see but it seems to be a difficult picture to snap and survive.

Yeah, I am in no real rush to try to snap something like this. But again, I am somewhat sure the trailer was not moving all that slowly. Eight seconds from somewhere off the road to the middle of the road is not slow.

Edit: as can be seen from my video, at times there's no radar return for the trailer at all. Not sure if it's because it's misattributed to other objects or what. Need to interpret the raw radar feed, I guess.
 
And somehow we're made to believe that this guy is good for the company. He does, on TV, what got his customers killed. Countless vloggers are copying it. The disclaimer on the screen is for suckers to read. Even if they state you shouldn't do it, they do it themselves and get hundreds of thousands of views and some 10% in thumbs-up.
This guy wants to make the system remove a certain kind of accident to make it safer on paper, but for now all he's accomplished is a cult following of people who try to activate AP all the time. And AP keeps trying, often backing out of a move at the last second.

Bottom line: If you stop paying attention and/or holding the wheel (for whatever reason), are you safer in a car with AP than in a car without?

If a driver lets themselves stop paying attention because they are confident in AP, that is on them, and them alone. Just like a parent taking their teen out for driving practice. Or a passenger on a bus/taxi/train/plane who died or was injured due to human driver error. (I'm a terrible passenger, if you couldn't tell.)

Regarding disclaimers, bad decisions happen regardless: Why Do Car Commercials Have Disclaimers?

Disney Plans to Omit Film Scene After Teen-Ager Dies Imitating It

Recharging iPhones in the microwave or the Tide Pod challenge, for instance.
 
The flag is basically one value out of: stopped, stationary, moving, oncoming, leftward, rightward.

It's pretty clear at least some scan-to-scan aggregation is happening to arrive at those.



Yeah, I am in no real rush to try to snap something like this. But again, I am somewhat sure the trailer was not moving all that slowly. Eight seconds from somewhere off the road to the middle of the road is not slow.

Edit: as can be seen from my video, at times there's no radar return for the trailer at all. Not sure if it's because it's misattributed to other objects or what. Need to interpret the raw radar feed, I guess.
Right, but can the radar link the sideways detections of the two discontinuities 40 feet apart into a 40-foot band of object?

Regardless of speed, the middle of the trailer does not have a non-stationary signature, assuming it is moving in a straight line.
 
Right, but can the radar link the sideways detections of the two discontinuities 40 feet apart into a 40-foot band of object?
Tesla disabled this "aggregation" mode of operation (the smarts) because they think they can do it better. So they now get a "point cloud" of about 40 or so radar returns and decide themselves how to act on those.

Regardless of speed, the middle of the trailer does not have a non-stationary signature, assuming it is moving in a straight line.

The scan results come out every 100ms; not sure how many actual scans are in between, but it's totally possible to see some movement from scan to scan and aggregate some of that to arrive at the sideways-movement reporting and such.

The radar Tesla uses also reports an object type for every radar return (in both smart and not-smart modes).
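For a sense of what "deciding how to act on a point cloud of ~40 returns" might involve at its very simplest, here is a naive 1-D grouping of returns by lateral position. This is purely illustrative; real track association also uses range, speed, and history across scans:

```python
def cluster_returns(lateral_positions, max_gap=2.0):
    """Naive 1-D grouping of radar returns by lateral position (metres),
    sketching how raw returns from one scan might be merged into object
    candidates. max_gap is an invented threshold for illustration.
    """
    clusters = []
    for p in sorted(lateral_positions):
        if clusters and p - clusters[-1][-1] <= max_gap:
            clusters[-1].append(p)  # close enough: same object candidate
        else:
            clusters.append([p])    # gap too big: start a new candidate
    return clusters

# Hypothetical returns: two trailer edges plus a sign off to the right.
print(cluster_returns([-6.1, -5.8, 0.2, 0.5, 0.9, 6.0, 6.3]))
# [[-6.1, -5.8], [0.2, 0.5, 0.9], [6.0, 6.3]]
```

This also shows why linking two trailer-edge detections into one 40-foot object is hard: with a large gap between them and no returns from the middle, a distance-based grouping sees separate small objects, not one solid wall.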
 
  • Like
Reactions: OPRCE