Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Fatal autopilot crash, NHTSA investigating...

This is a situation wherein a brain-computer interface is needed. The computer recognizes an object (higher off the ground) and rejects it. The brain itself doesn't see the object (whether he was sleeping, watching a movie, texting, whatever). But if the computer sends the signal to the brain, the human brain could see and recognize, via cognition, that it was a truck and not an overhead sign. It could happen in nanoseconds. The computer should "wait" for the human brain interface to act on the outcome, or the brain itself (via the muscles) could clamp on the brake. Future APs relying only on an optical camera and a computer interface are likely to fail again and again and again. The US government and NIH are going to spend upwards of 30 billion dollars on brain research; this could be an interesting project for Tesla and other AP makers.

Of course it would be much easier to just put more sensors and a more sophisticated computer system on board to detect and react to more things in the environment. That's well within our current tech capabilities. The computer-brain interface is being experimented with in the lab, but do you really want a $200,000 operation and all the possible complications that can entail just to drive your $100K car?
 
Very sad, and I agree that AP lures you into a false sense of security. My wife told me a story from one of her psychology lectures: when the airbag first came out, it was touted as a great life-saving technology (which it is), but her professor claimed that instead of installing a life-saving airbag to cushion the blow, a poison dart should be deployed in case of a crash. The professor's theory was that humans will inevitably take more risk if they believe there is something else to offset the added risk they are taking. It sounds like this is the case here.

Also, does anyone remember the "auto retract" shoulder belt from the 80s? It was also removed because a lot of people ended up with no lap belt.

The professor needs to read up on risk homeostasis. Air bags don't change things. Seat belts do as do better tires, brakes and suspensions.
 
Well, you are right. I will wait for more research in the future. My life and brain are certainly worth more than a couple of hundred dollars, and I won't risk them after reading many posts suggesting that the camera "ignores" items above a certain height.
However, there are optical scanners that can see where the eye is focused. If I were watching a movie or distracted, it would alert me!
 
Risk homeostasis. In other words, if you see driving as not risky, you don't have to wear a seatbelt. However, this theory applies only to individuals and their perception of risk. It does not anticipate or take into account the risk imposed by others' carelessness, including their driving skills or the lack of them.
 
Hi Breezy, who are you referring to when you state the "message isn't getting across"? Surely not the driver, an acclaimed user and promoter of AP, TSLA, and advanced technologies in general. I am a fairly new user of AP. I found the process of indoctrination, demo, and usage to be quite thorough. The message was loud and clear, further reiterated by the responsibility-acceptance button on the screen I had to toggle. Respectfully, YMMV of course.

I'm not sure I understand your point. Are you suggesting that the driver in this case, as an acclaimed user and promoter of AP, was fully aware of the limitations of Autopilot? Evidence suggests otherwise. I see he posted a YouTube video of a trip from Boston to Orlando where he had his hands off the wheel 90% of the time.

When I say "the message isn't getting across," I mean it's not getting across to many (most?) Autopilot users, who drive with their hands off the wheel at least some of the time. This is directly in contravention of Tesla's warning: "Always keep your hands on the wheel. Be prepared to take over at any time." There are people who drive with their heads down, checking email and whatnot. It's certainly not getting across to the media or the public at large, who discuss this incident as an accident with a "self-driving car."
 
Are you suggesting that the driver in this case, as an acclaimed user and promoter of AP, was fully aware of the limitations of Autopilot? Evidence suggests otherwise.

I would suggest otherwise. I'm of the view that he was fully aware of the limitations, especially considering his background in technology. I would also suggest that the evidence does not suggest otherwise, but in fact suggests a "perfect storm" of highly unlikely factors taking place that caused his unfortunate death.

He was, by his very nature, a risk taker. You don't become a Navy SEAL by having a lack of guts -- like me. I only had an AP loaner for less than a week, and I found myself having a very difficult time trusting it, despite the fact that it performed very well.
 
Trucks are stealth objects for low mounted radar:

[Image: the future USS Zumwalt's first underway at sea]


Radars rely on a reflection off the sensed object (rather than on ambient light).
That means that if a surface is tilted such that the signals emitted by the radar are reflected away from its receiving sensors, the radar will not see, or be aware of, the object.

The stealth boat above reflects any radar waves coming from water level up into the air so that a radar with all elements at water level will never see it.

The geometry of the side of a truck in relation to a bumper-mounted radar looks a lot like this stealth warship: invisible, because it provides no return reflections. (Rotate the picture counter-clockwise until the side of the warship is vertical; the surface of the water then intersects the location of the bumper-mounted radar at some point on the vehicle's path.) The geometry between the radar elements and the detected object is the same. The truck, above the running gear, is invisible.

For truck structures above the wheels to be clearly visible, in a high signal to noise way, the radar emitters and some of the receiving sensors need to be at the tallest part of the vehicle - the roofline, rearview mirror or the top of the A-pillars.

This way the radar sensors get a return unless the target is a light load flatbed trailer, perhaps hauling a pyramid shaped load.
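The mirror-like (specular) reflection behavior described above can be sketched in a few lines. The tilt angles and directions here are purely illustrative geometry, not taken from any actual radar specification:

```python
import math

def reflect(d, n):
    """Specular reflection of direction d off a surface with unit normal n:
    r = d - 2 (d . n) n."""
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

def surface_normal(tilt_deg):
    """Unit normal of a flat panel tilted tilt_deg degrees back from vertical,
    facing an emitter to its left (normal points straight back when tilt = 0)."""
    t = math.radians(tilt_deg)
    return (-math.cos(t), math.sin(t))

# Horizontal radar beam travelling in +x toward the panel.
beam = (1.0, 0.0)

# Vertical panel (tilt 0): the reflection goes straight back to the receiver.
r0 = reflect(beam, surface_normal(0.0))       # (-1.0, 0.0)

# Panel tilted 10 degrees: the reflection leaves 20 degrees above horizontal,
# missing a receiver co-located with the emitter at any real range.
r10 = reflect(beam, surface_normal(10.0))
angle_up = math.degrees(math.atan2(r10[1], -r10[0]))   # 20 degrees
```

The doubling (a 10 degree surface tilt steering the return 20 degrees off axis) is why even a modest slope, like the stealth hull's, is enough to make a strong flat reflector effectively disappear from a low-mounted sensor.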
 
In the video at 1:30, when the accident diagram was shown, the reporter stated, "The car's airbag never deployed."
That is likely because the Tesla passed under the truck trailer without slowing sufficiently to trigger the airbag deceleration sensors.
"She was passed and she says she was doing 85, and when this car just passed her, she was just like, wow, you know, I wonder how fast that car was going."
If that witness is accurate, she was passed by the Tesla just before the accident while going 85 mph, so the Tesla driver was traveling significantly faster than 85 mph (on a road where the posted limit is 65).

I thought that AP and TACC would not engage at over 85 mph? In which case the Tesla driver may have been in manual control of the car at the time of the crash.
 
Risk homeostasis. In other words, if you see driving as not risky, you don't have to wear a seatbelt. However, this theory applies only to individuals and their perception of risk. It does not anticipate or take into account the risk imposed by others' carelessness, including their driving skills or the lack of them.

More accurately, wearing a seatbelt makes you feel more secure so you drive faster and corner faster until your perceived level of risk is elevated to your risk tolerance level. Airbags don't affect the driving experience so they don't change perceived risk. Hence airbags and crumple zones don't come into play under risk homeostasis.
 
"She was passed and she says she was doing 85, and when this car just passed her, she was just like, wow, you know, I wonder how fast that car was going"

Is there a video or transcript of the actual witness saying this about his speed? The person in the video told us what he said the witness said (as quoted above) but I'm always suspect of hearsay evidence. There's good reason it's inadmissible in Court (subject to certain exemptions).
 
I would suggest otherwise. I'm of the view that he was fully aware of the limitations, especially considering his background in technology. I would also suggest that the evidence does not suggest otherwise, but in fact suggests a "perfect storm" of highly unlikely factors taking place that caused his unfortunate death.

He was, by his very nature, a risk taker. You don't become a Navy SEAL by having a lack of guts -- like me. I only had an AP loaner for less than a week, and I found myself having a very difficult time trusting it, despite the fact that it performed very well.

Autopilot shouldn't be trusted at any time, any more than anyone "trusts" conventional cruise control. Autopilot is a set of driving aids. Autosteer helps you stay in your lane. It can safely change lanes, usually. It can help you avoid obstacles in a narrow set of circumstances. That's all it does.

 
I've never seen an overhead highway sign that is 8-12' above the road. They're all substantially higher, because, you know, semi trucks have to fit beneath them.

If the sensors see an object in the road that is at windshield level height, Tesla should not create software that tells the vehicle to ignore what it sees. Plain and simple.

Just to be clear here: the Mobileye camera is not full color. It is "essentially" monochrome. It is not the typical RGB. It has a red pixel at every 4th position; the other three have no filter.

It does not see what your eye sees, and the appearance of a white truck against a brightly lit sky is very different on this camera than to the human eye.

The blog post makes complete and total sense and was probably the result of a first hand account of what went wrong as interpreted by people at Tesla who likely have the information.
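The contrast problem a mostly clear-pixel sensor faces can be illustrated with a toy model. Every brightness number below is invented for illustration; the point is only that an unfiltered pixel integrating a bright sky and a white trailer can end up with values so close that little contrast remains for a classifier:

```python
# Illustrative only: the pixel values are made up to show how an unfiltered
# (clear) pixel can lose the truck/sky contrast that full color vision keeps.

def clear_pixel_response(luminance, full_scale=255):
    """An unfiltered pixel integrates total brightness and clips
    (saturates) at full scale."""
    return min(int(luminance), full_scale)

bright_sky    = 250   # hypothetical scene luminance, near sensor saturation
white_trailer = 245   # hypothetical: almost as bright as the sky

sky_px   = clear_pixel_response(bright_sky)
truck_px = clear_pixel_response(white_trailer)

# Only about a 2% brightness difference for the vision system to work with.
contrast = abs(sky_px - truck_px) / 255
```

A human eye would still separate the two by hue and context; a sensor that mostly measures total brightness has only that small difference to go on.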
 
More accurately, wearing a seatbelt makes you feel more secure so you drive faster and corner faster until your perceived level of risk is elevated to your risk tolerance level. Airbags don't affect the driving experience so they don't change perceived risk. Hence airbags and crumple zones don't come into play under risk homeostasis.

I remember when airbags were a new thing. I was a teenager at the time so take that into consideration but I specifically recall other teens talking about driving with less care (and no seatbelt too) because they had a car with airbags.
 
Autopilot shouldn't be trusted at any time, any more than anyone "trusts" conventional cruise control

That makes no sense to me since cruise control doesn't drive me around corners or slow down when the car in front slows down. In order to allow a car to drive me around corners, and to slow down and stop me when the car in front does, I need to trust the vehicle a heck of a lot more than cruise control. Of course, I constantly keep my eyes on the road and watch everything the car does. But still, if I allow the car to do this for me, it requires a certain amount of trust in the vehicle and the technology.
 
It's limited to 90 MPH. But if her account is accurate it would seem there would need to be more than a 5 MPH delta between their speeds.

Mike
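The point about the speed delta can be sanity-checked with quick arithmetic. The car length and speeds below are assumptions, not from the investigation; the sketch just shows how long a pass takes at various closing speeds:

```python
# Back-of-the-envelope check (all numbers are assumptions): how long does it
# take to pull one car length (~15 ft) ahead of a car doing 85 mph?

FT_PER_MILE = 5280
CAR_LENGTH_FT = 15.0

def seconds_to_gain(delta_mph, gap_ft=CAR_LENGTH_FT):
    """Time to open a gap of gap_ft at a closing speed of delta_mph."""
    fps = delta_mph * FT_PER_MILE / 3600  # mph -> feet per second
    return gap_ft / fps

slow_pass = seconds_to_gain(5)    # ~2.0 s per car length: a gradual pass
fast_pass = seconds_to_gain(20)   # ~0.5 s per car length: a "wow" pass
```

At a 5 mph delta the Tesla would have crept past over several seconds, which is hard to square with the witness's "wow" reaction; a pass striking enough to prompt it suggests a much larger delta, consistent with the point above.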
Thanks Mike. And I went back and reviewed Tesla's recent blog post about the crash; Tesla says AP was engaged at the time of the crash. So the car was going no faster than 90 mph.

Tesla knows how fast the car was going at impact. It would be interesting to know what that speed was. Based on that witness account, and the fact that the Tesla driver had numerous recent speeding tickets, it seems very possible that the car was traveling at well over the speed limit.

There have been accounts describing the car "cresting a hill" not far from the intersection where the crash occurred. How far is the hill from the intersection?