
Another tragic fatality with a semi in Florida. This time a Model 3

I'm drawing attention to a video someone else made which I consider relevant to this thread.

Including the video title or description might be helpful, since people may miss this:

Traffic-Aware Cruise Control (TACC) at night gets confused when lane lines are missing or not parallel. However, in all tests the trailer was never recognized straight on. When the trailer is alongside the car, a ghost image of it is shown. I don't think radar plays any part in object detection; it appears to be all optical.

A few observations:

- The video was recorded with a Model 3 and posted to YouTube recently.

- The Tesla regularly failed to detect the trailer on a direct approach and did not automatically slow down or depict it on the display. The operator had to stop the car manually, or it would have hit the trailer.

- When approaching or leaving the trailer at an angle, the Tesla would sometimes detect it, often only briefly, and sometimes it would show an additional phantom trailer in the wrong location.

- Testing was done at night, with both low beams and high beams (no difference).

- The trailer was white, and connected to a truck.

- The trailer was equipped with a side-guard.

It is a little concerning that, after three or more years of documented fatalities, this issue still exists. I understand there are complexities involved, and it would be interesting to hear some theories as to why Tesla has not yet released a solution for this case.
 

Thanks for the help clarifying the video's source and ownership. Still, that video is crazy. Now we will have everyone testing this out; that's my point.

That's not a side guard or under-ride guard; it's for aerodynamics only. It lets wind pass around the rear wheels to save fuel. It's a thin sheet of aluminum.
 

I found a resource page that clarifies these devices (both skirts and guards) here: Truck Side Guards Resource Page

From the perspective of Tesla's sensors, however, I'd be curious whether there is a practical difference between them.
 
Good job. I totally agree it's a wall of something that might someday be detected.

After going to the site you highlighted: it's really a bicycle and pedestrian excluder, meant to push those two possible broadsiders out of the way so they aren't run over if either strikes the truck's side. A good thing. They also save fuel.
 
In a Reddit thread posted last month (Autopilot doesn't detect a truck partially in lane :( : teslamotors), a Tesla driver had been following a truck that was shifting out of the lane. The Tesla hit the truck before it had a chance to exit the lane completely. Of note is the fact that the Tesla was approaching an overpass.

If I understood correctly how the ADAS maps work, when a location is marked "radar: do not brake" (as is common near overpasses), the car should still be relying on visual input from the cameras and the ultrasonic sensors.

This raises the question:

Does anyone know whether the Tesla can brake without input from its radar (that is, using the cameras and ultrasonics alone, or in combination)?
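
To make the question concrete, here is a toy sketch of how such a map flag could gate braking decisions. Everything in it (names, structure, logic) is invented for illustration; it is not Tesla's actual code or architecture.

Code:
# Hypothetical sketch of a map-based "radar: do not brake" flag.
# All names and logic here are invented for illustration only.

def should_brake(radar_sees_object, vision_sees_object, location_flags):
    """Decide whether to brake for a detected obstacle."""
    radar_suppressed = "radar_do_not_brake" in location_flags  # e.g. near overpasses

    if radar_sees_object and not radar_suppressed:
        return True  # radar return trusted at this location
    # With radar suppressed, braking depends entirely on the cameras
    # (and, at very short range, the ultrasonics) confirming an object.
    return vision_sees_object

# Near an overpass a radar return alone would be ignored, so a trailer
# the cameras fail to classify produces no braking at all:
print(should_brake(True, False, {"radar_do_not_brake"}))  # False

If something like this is how it works, the answer would have to be yes: the car must be able to brake on vision alone, or marked locations would get no braking at all.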
 
Yes and no.

Yes, lidar would have the same issue. No, being further away does not help. The radar or lidar signal needs to bounce off the surface and return to the sensor. If the flat surface starts above the sensor, it will not reflect back to the radar regardless of the distance from it.

Back to the mirror analogy: find a vertical mirror and note where your eyes are when standing close, then back up. If the floor is flat and the mirror vertical, your eyes will stay in the same spot, and you will never see your eyes in the part of the mirror above the original spot.
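
A quick geometry check of that claim, with invented heights in meters and assuming a perfectly vertical, perfectly mirror-like panel:

Code:
# Toy geometry check of the mirror argument. For a vertical, purely
# specular panel, the only ray that retraces its path back to the sensor
# hits the panel at normal incidence, i.e. at the sensor's own height.
# All heights are invented, in meters.

def specular_return_exists(sensor_height, panel_bottom, panel_top):
    """True if a vertical mirror-like panel can bounce a ray straight back."""
    return panel_bottom <= sensor_height <= panel_top

# Bumper-level radar (~0.5 m) vs. a flat trailer side starting ~1.2 m up:
print(specular_return_exists(0.5, 1.2, 4.0))  # False, at any distance
# Same radar vs. a side skirt reaching down to ~0.3 m:
print(specular_return_exists(0.5, 0.3, 4.0))  # True

Under that assumption, no amount of approach distance changes the answer; only lowering the panel (a side skirt) or raising the sensor does.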

Radio signals scatter a lot more than light hitting a reflective surface, so that analogy isn't a very good one. A better analogy would be shining a flashlight up at a projection screen above your head that is tilted away from you slightly. You definitely see the light, and the angle of incidence matters a great deal as to how much of that light you see. If the light source is almost lined up with your head, you see a much brighter reflection than if you offset it by a few feet.
 

Radio signals can bounce a lot, but they don't just scatter and spread. A projection screen is designed to scatter light; a flattish sheet of metal behaves more like a mirror. Depending on the required return-signal strength, the brighter reflection may still not be bright enough. If the radar sits below the bottom edge of the trailer side, the angle of incidence will send the main energy of the beam over the top of the car.
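
To put rough numbers on the difference between the two analogies, here is a toy model. The cosine-power lobe and its exponent are invented for illustration; real radar cross-sections are far more complicated.

Code:
import math

# Toy reflection model: a projection screen scatters diffusely (n = 1),
# while a flat metal sheet is closer to a mirror (large n). The exponent
# values are illustrative, not real radar physics.

def return_fraction(offset_deg, n):
    """Fraction of reflected energy sent back toward the source when the
    specular bounce is offset_deg away from the return path."""
    c = math.cos(math.radians(offset_deg))
    return max(c, 0.0) ** n

for offset in (0, 10, 30):
    print(f"{offset:>2} deg off-axis: screen={return_fraction(offset, 1):.3f}"
          f"  metal={return_fraction(offset, 200):.6f}")

# The mirror-like surface returns a usable signal only when nearly lined
# up, which is why a low-mounted radar facing a tall flat trailer side can
# get almost nothing back even though *some* energy scatters its way.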
 
It is a little concerning that, after three or more years of documented fatalities, this issue still exists. I understand there are complexities involved, and it would be interesting to hear some theories as to why Tesla has not yet released a solution for this case.


The issue is people who don't read the manual using AP in places it's explicitly not intended to be used.

AP does not handle cross-traffic well because it's not intended to handle that situation.

You can't fix stupid. Even after 3 years.
 

Can someone provide a cookbook of how AP should be used? Maybe something like:

1. low-speed stop-and-go, creeping forward at, say, 5-10 mph
2. high-speed cruising on freeways (assuming traffic permits), just like normal cruise control in non-AP/non-Tesla cars
3. no lane changes anywhere
4. no turns (left or right) anywhere

So if one follows the above 4 rules, there will be no possibility of fatality, correct?
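
Taking the four rules literally, here is what they encode, as a toy checklist. The function and its parameters are mine, purely to make the point; nothing here is an endorsed usage model.

Code:
# The four proposed rules, written out literally as a checklist. The rules
# and numbers come from the post above; this is rhetorical, not a real
# safety model, since rule-following does not remove cross-traffic risk.

def ap_use_sanctioned(speed_mph, on_freeway, changing_lanes, turning):
    stop_and_go = 5 <= speed_mph <= 10              # rule 1
    freeway_cruise = on_freeway and speed_mph > 10  # rule 2
    return (stop_and_go or freeway_cruise) and \
        not changing_lanes and not turning          # rules 3 and 4

# The scenario in this thread: a divided highway at 60+ mph, no lane
# change, no turn. It looks freeway-like, so the checklist passes, yet
# cross-traffic is exactly what AP does not handle:
print(ap_use_sanctioned(65, on_freeway=True,
                        changing_lanes=False, turning=False))  # True

Which suggests the answer is no: the rules as written would have passed the very scenario this thread is about.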
 
The issue is people who don't read the manual using AP in places it's explicitly not intended to be used.

AP does not handle cross-traffic well because it's not intended to handle that situation.

You can't fix stupid. Even after 3 years.

My intention wasn't to direct attention to reasons why an operator may misuse Autopilot, or even fail to read the manual (RTFM), but rather to identify why, after three years, Tesla still struggles with this use case, as well as similar cases of stopping for stationary objects in the road, like the multiple cases of stationary red firetrucks.

Let us not forget, Elon believes that Tesla's destiny is to supersede human driving capability, and with a self-defined target of FSD by the end of 2019, they have a rather short runway left.

I believe it is fair to make some underlying assumptions here:

- It is essential to stop for large stationary objects, like red firetrucks.
- It is vital to stop for slow-moving objects, like trucks crossing the freeway.
- Since these events do happen, and with some frequency, they should not be written off as rare corner cases and ignored.
- Tesla has faced increasing pressure, from news reports and elsewhere, to solve these use cases for at least three years.

Based on this, we can conclude that Tesla has not ignored these use cases. I don't believe Tesla has failed to deliver because it never intended to solve this problem; rather, it has encountered real difficulty in doing so.

Identifying these reasons is what I was hoping to do here.

My curiosity is more about the technological barriers than the psychology of why humans make bad decisions. ;)

As an aside, I encounter red firetrucks at least twice a month, and because I drive near a USPS depot, I encounter trucks crossing a two-way roadway that is frequently driven at 50 miles per hour or more. It is a straightaway of approximately one mile that connects two major freeways, with the depot on one side and a major food distributor on the other.

Google Maps

 
You might want to reread the actual manual for a Tesla. I do agree that if Tesla thinks they are close to FSD anytime soon, they should solve all of your points. But for AP as currently deployed, the Tesla manuals SPECIFICALLY exclude the road this guy was on from the recommended use. NO CROSSING TRAFFIC are the words they use.
 