Oldschool496
Member
Another valid point.
"Autopilot is cruise control on steroids." I think I just read that here somewhere. FULL attention required with Autopilot, if not FSD.
Nighttime testing of M3 towards broadside of parked semi-trailer, no radar or visual ID:
What in the "he double toothpicks" are you doing?
Let's test it another way. How about a white sheet with images and some reflectors on it? Nobody tests cars with humans in them.
I'm drawing attention to a video someone else made which I consider relevant to this thread.
- The trailer was equipped with a side-guard.
Thanks for the help clarifying the video source and ownership. Still, that video is crazy. Now we will have everyone testing this out; that's my point.
That's not a side guard or underride guard; it's for aerodynamics only. It lets wind pass around the rear wheels to save fuel. It's a thin sheet of aluminum.
Good job. I totally agree it's a wall of something that might someday be detected. From the perspective of Tesla's sensors, however, I'd be curious to understand whether there is a practical difference between them.
Yes and no.
Yes, lidar would have the same issue. No, being farther away does not help: the radar/lidar signal needs to bounce back to the sensor.
If the flat surface starts above the sensor, it will not reflect back to the radar regardless of the distance from it.
Back to the mirror analogy: find a vertical mirror, note where your eyes are in the mirror when standing close, then back up. If the floor is flat and the mirror vertical, your eyes will stay in the same spot, and you will never see your eyes in the part of the mirror above the original spot.
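The mirror point above can be sketched as a bit of idealized geometry. This is purely my own toy model with made-up heights, not anything from Tesla's spec: for a perfectly flat vertical reflector, a signal bounces straight back to the sensor only if it strikes the surface at the sensor's own height, so a panel whose bottom edge sits above the sensor gives no specular return at any distance.

```python
def specular_return(sensor_height, panel_bottom, panel_top):
    """Idealized mirror-like reflection off a flat vertical panel.

    A ray bounces straight back to the sensor only if it hits the
    panel perpendicularly, i.e. at the sensor's own height. Distance
    doesn't enter into it at all.
    """
    return panel_bottom <= sensor_height <= panel_top

# Made-up numbers: a bumper-height sensor at ~0.5 m versus a trailer
# side panel whose lower edge starts 0.9 m above the road.
print(specular_return(0.5, 0.9, 2.5))  # no return, at any distance
print(specular_return(0.5, 0.2, 2.5))  # panel extends below the sensor
```

The point of the sketch is that distance appears nowhere in the check, which is exactly the "backing up doesn't help" argument.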
Sorry, I think I get your point; I thought it was you doing the testing. I have never seen testing like that before.
Radio signals scatter a lot more than light hitting a reflective surface, so that analogy isn't a very good one. A better analogy would be shining a flashlight up at a projection screen above your head that is tilted away from you slightly. You definitely see the light. And the angle of incidence matters a great deal as to how much of that light you see. If the light is almost lined up with your head, you see a much brighter reflection than if you skew the light source by a few feet.
It is a little concerning that, after three or more years of documented fatalities, this issue still exists. I understand there are complexities involved, and it would be interesting to hear some theories as to why Tesla has not yet released a solution for this case.
The issue is people who don't read the manual using AP in places it's explicitly not intended to be used.
AP does not handle cross-traffic well because it's not intended to handle that situation
You can't fix stupid. Even after 3 years.
That's what she said. Let me cut to the chase: it's ALWAYS the driver's fault.
You might want to reread the actual manual for a Tesla. I do agree that if Tesla thinks they are close to FSD anytime soon, they should solve for all of your points. But as currently deployed, Tesla manuals SPECIFICALLY exclude the road this guy was on from the recommendations. NO CROSSING TRAFFIC are the words they use.

My intention wasn't to direct attention to reasons why an operator may misuse Autopilot, or even fail to read the manual (RTFM), but rather to identify why, after three years, Tesla still struggles with this use case, as well as similar cases of stopping for stationary objects in the road, like the multiple cases of stationary red firetrucks.
Let us not forget, Elon believes that Tesla's destiny is to supersede human driving capability, and with a self-defined target of FSD by the end of 2019, they have a rather short runway left.
I believe it is fair to make some underlying assumptions here:
- It is essential to stop for large stationary objects, like red firetrucks.
- It is vital to stop for slow-moving objects, like trucks crossing the freeway.
- Since these events do happen, and with some frequency, they should not be dismissed as rare corner cases and ignored.
- Tesla has faced increasing pressure, from news reports and elsewhere, to solve these use cases for at least three years.
Based on this, we could conclude that Tesla has not ignored these use cases. I don't believe Tesla has failed to deliver because it never intended to solve this issue; rather, they've encountered real difficulty in doing so.
Identifying these reasons is what I was hoping to do here.
My curiosity is more around the technological barriers than the psychology of why humans make bad decisions.
As an aside, I encounter red firetrucks at least twice a month, and because I drive near a USPS depot, I encounter trucks crossing a two-way roadway that is frequently driven at 50 miles per hour or more. It is a straightaway of approximately one mile that connects two major freeways, with the depot on one side and a major food distributor on the other.
Google Maps