
NOA tries to sideswipe a truck.... how do I report this bug?

As you can see in this video, NOA nearly sideswiped a truck. I was coming up on a semi, so NOA decided it was time to change lanes (GOOD). It recognized the lane was not clear by showing red on the display (GOOD). It then decided to change lanes anyway (BAD). I pulled it back and let the truck pass (GOOD).

 

You can log in to your Tesla Account on the web; there's a form there to contact them. Please include the date and time of the event.

I wouldn't be disappointed if I didn't get a reply from Tesla on a beta product. It's not a final-quality product, so there are expected undesirable behaviors (such as deaths, injuries, and crashes) still to be worked out, and that's why a driver needs to be ready to take over at all times.
 
It's pretty well known that the current software doesn't process and respond to the side and rear cameras very well. That's one of the reasons why they are currently doing a big rewrite of the software.

I didn’t know they were doing a rewrite of the software. Thanks for sharing - do you have any links that refer to that, as I would love to learn more.
 

When asked why there's still no coast-to-coast FSD demo, Elon Musk said he could game the system by hard-coding the route for one specific trip, but he prefers to wait for the system to work everywhere.

Yes, when you complain about an imperfection such as the incident in this thread, Tesla could write some code to correct that specific scenario at that specific location, but that won't help with the next mile's sharp curve. Someone would then have to complain about the sharp curve, and Tesla could write still more code to correct that scenario as well.
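To make that scaling problem concrete, here's a purely hypothetical sketch (not Tesla's code; every name and coordinate below is invented) of what patching one complaint at a time would look like:

```python
# Purely hypothetical illustration -- not Tesla's code. Hand-written,
# location-specific patches: each complaint adds another special case.
SIDESWIPE_SPOT = (32.90, -96.80)  # invented coordinates for this thread's incident
SHARP_CURVE = (32.91, -96.80)     # invented coordinates for the next complaint

def near(pos, spot, radius=0.01):
    """Crude proximity check in degrees; good enough for the illustration."""
    return abs(pos[0] - spot[0]) < radius and abs(pos[1] - spot[1]) < radius

def lane_change_allowed(pos, lane_is_clear):
    if near(pos, SIDESWIPE_SPOT):
        return False  # patch #1: suppress lane changes where the truck incident happened
    if near(pos, SHARP_CURVE):
        return False  # patch #2: added only after someone complained about the curve
    # ...patch #3, #4, and so on, one per complaint, forever
    return lane_is_clear
```

The rule list only ever covers roads someone has already complained about, which is exactly why a learned, generalized approach is more attractive.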

Instead of relying on whichever drivers complain the loudest, I think Tesla relies on its Artificial Intelligence method.

I think Tesla subscribes to the NVIDIA approach described below:

"In contrast to the usual approach to operating self-driving cars, we did not program any explicit object detection, mapping, path planning or control components into this car. Instead, the car learns on its own to create all necessary internal representations necessary to steer, simply by observing human drivers.

The car successfully navigates the construction site while freeing us from creating specialized detectors for cones or other objects present at the site. Similarly, the car can drive on the road that is overgrown with grass and bushes without the need to create a vegetation detection system. All it takes is about twenty example runs driven by humans at different times of the day. Learning to drive in these complex environments demonstrates new capabilities of deep neural networks.

The car also learns to generalize its driving behavior. This video includes a clip that shows a car that was trained only on California roads successfully driving itself in New Jersey."
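For anyone curious what "learning to steer simply by observing human drivers" looks like in code, here is a minimal sketch in the spirit of NVIDIA's published PilotNet idea: one network maps a camera frame straight to a steering angle, trained only on logged human driving. All layer sizes and names here are illustrative assumptions, not NVIDIA's or Tesla's actual code.

```python
import torch
import torch.nn as nn

class EndToEndSteering(nn.Module):
    """Toy end-to-end model: camera frame in, steering angle out.
    Loosely in the spirit of NVIDIA's PilotNet; sizes are illustrative."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 24, 5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, 5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, 5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, 3), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # make the head independent of frame size
            nn.Flatten(),
            nn.Linear(64, 50), nn.ReLU(),
            nn.Linear(50, 1),  # steering angle; no hand-written driving rules anywhere
        )

    def forward(self, frame):  # frame: (batch, 3, height, width)
        return self.net(frame)

model = EndToEndSteering()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

def train_step(frames, human_angles):
    """One imitation-learning step on (camera frame, human steering) pairs."""
    optimizer.zero_grad()
    loss = loss_fn(model(frames), human_angles)  # imitate the human driver
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy batch standing in for frames and steering angles logged from human drives.
print(train_step(torch.randn(8, 3, 66, 200), torch.randn(8, 1)))
```

That's the whole pipeline: no cone detector, no vegetation detector, just enough example runs to imitate.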

If human programmers don't need to write programs anymore, then why doesn't everyone subscribe to that model?

The danger of letting the machine take over the work of human programmers is that the humans may not understand why the machine wrote those lines, which potentially means humans lose control of programming itself!

Tesla has been very aggressive in making this transition from human coding to the machine taking over the coding.

 
I suspect that Tesla did the 2016 FSD demo video by writing specific code for that route, probably thinking it would generalize, and then later switched to the machine-learning approach when they realized that hard-coding would never produce a generalized solution for autonomous driving.
 
Is it weird that I'm more bothered by the fact that NOA turns on the turn signal and then just sits there like an a**hole for 10 seconds, confusing the hell out of the truck driver, than by the fact that it nearly causes a collision? :p
No, that was horrible passing lane etiquette.
This is why I don't use auto lane change if there's someone in the other lane.
 