Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Auto Pilot Is Dangerous

It's very unfortunate that people don't appreciate how serious the limitations of a beta product can be.


"Yikes. I just watched your videos, and that is not normal behavior from Autosteer."

YES! This is how Autopilot behaves in my car too. When the lane lines become skewed, as in this situation, this is exactly what happens. It also happens when there are HOV lanes on the left, as in some parts of Washington DC, and the gates block the entrances when closed, and on the right side when an entrance lane merges into the main traffic. There are things that could mitigate these issues, like incorporating map data or using the car ahead as a guide (or both) when the lane lines skew, but the software clearly isn't at that stage yet, and it would probably need the compute power of HW3.

No one should use AP unless they know (or are actively learning) the limitations of the software. Until these things are ironed out, protect yourselves.
 
"getting AP to the point where it doesn't do such bad things."

That goes without saying! But until then, let's protect ourselves.

I finally saw the video, and clearly this situation pops up consistently. The car responds to the reflective lane paint, and if that's not there, it searches for it; that's why eyes should stay on the road. It also happens when cars enter the highway and the road is wider before converging into one lane.
I'm glad you finally watched the videos. As you now understand, no one is bashing Tesla or Autopilot. We as consumers are pointing out the faults and issues so that they can be addressed, making it a better product. The only issue arose when the company (Tesla) refused to be accountable for its faults.
 
*******update*****

In my last contact with the SC, I was informed that the "computer" is being replaced. I don't know which computer, or how that would affect AP, but I'm glad something is being done. Hopefully I can pick up T'Challa tomorrow.

Please let us all know of the outcome .. I'm curious if changing out the computer will fix this (since I'm somewhat surprised at this choice by the SC).
 
*******update*****

I was too mentally tired yesterday to update you guys, but after a long-fought battle, I was able to get the service manager to reach out to "headquarters" in California, where they actually approved the service. Fingers crossed; hopefully I don't get a call Monday saying they changed their minds lol... stay tuned.
 
The first three videos look familiar. I've seen that behavior occasionally in version 10 on HW2.5. However, it kind of sounds like you are experiencing those sorts of misbehavior frequently, which is not normal. The one where it nearly crashed into the wall on a nearly straight stretch of road with clear lane lines was where I just stopped watching and concluded that something is seriously wrong with that car.

The failure almost certainly cannot be any sort of camera or sensor issue, because the autopilot computer clearly knows that it is deviating from the lane, as evidenced by the lane visualizations, and at one point it even freaked out because the lane excursion was so severe. If this were a software bug, given how easily the original poster was able to reproduce it, that bug would likely be happening to a lot more drivers on a regular basis, which tends to rule out software, too.

There are two possibilities. One is that the two autopilot computers can't agree because of defective RAM/registers, and the wrong one is "winning". However, if they disagree that much, it should give you an autopilot unavailable failure, which suggests that this is probably not the case unless somebody missed a really critical regression test.
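The disagreement scenario above can be sketched with a toy lockstep check. This is purely illustrative (the real AP architecture is not public, and the threshold value is made up): the safe behavior when two redundant computers diverge is to declare the system unavailable, never to silently pick one side.

```python
# Illustrative lockstep check between two redundant computers.
# If their steering commands diverge beyond a tolerance, the safe
# response is to declare Autopilot unavailable, not to pick a winner.
# The tolerance value is hypothetical.

FAULT_TOLERANCE_DEG = 0.5

def lockstep_steering(cmd_a, cmd_b):
    """Return a steering command only if both computers agree,
    otherwise None (i.e. an 'Autopilot unavailable' fault)."""
    if abs(cmd_a - cmd_b) > FAULT_TOLERANCE_DEG:
        return None  # disagreement -> fault out
    return (cmd_a + cmd_b) / 2.0  # agreement -> use the average

# A flipped bit on one side shows up as a large disagreement:
assert lockstep_steering(1.0, 1.1) is not None
assert lockstep_steering(1.0, 5.0) is None
```

This is why the "defective RAM, wrong computer winning" theory is hard to square with the observed behavior: any sane comparison logic should fault out rather than steer.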

The other possibility — and the more likely one, IMO — is that the AP computer is unable to make small steering adjustments in your vehicle. This could be caused by any of:
  • A failing/sticking steering actuator
  • A loose connector/faulty wire between the AP computer and the actuator
  • An electrical fault in the AP computer's output
It is possible that the side of the AP computer that handles lane departure avoidance is working, and that this is why it ping-pongs. However, I think it is more likely that the actuator is sticking, and that small motions don't result in any actual motion, but when the computer commands a large change, shift happens.

Either way, I doubt any software fix will correct the problem. Chances are, you need new hardware, and I seriously doubt that the problem is the AP computer. My bet is that the actuator that pushes the steering back to the left is physically defective, and is sticking. Either that or the steering rack is sticking. One of the two. Of course, the computer is a heck of a lot easier to swap out than any of those components, so that's still the right starting point, just in case it's a loose cable on the back or a defective output on the AP board.
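The sticking-actuator theory can be shown with a toy simulation: under stiction, small corrective commands produce no motion, so lane error accumulates until one large command breaks the actuator free and the car lurches. All constants here are invented for illustration; this is a sketch of the failure mode, not a model of the actual vehicle.

```python
# Toy simulation of a sticking steering actuator under a simple
# proportional lane-keeping controller. Commands below a (hypothetical)
# stiction threshold move nothing, so error builds until a large
# command breaks free -- the "ping-pong" pattern described above.

STICTION = 0.8   # hypothetical: commands below this produce no motion
GAIN = 0.5       # proportional controller gain

def step(lane_error):
    """Return the actual steering motion for one control step."""
    command = GAIN * lane_error
    return command if abs(command) >= STICTION else 0.0

error, motions = 0.0, []
for _ in range(6):
    error += 0.3          # car drifts a little each step
    motion = step(error)  # actuator response (stuck for small commands)
    error -= motion       # any motion corrects the accumulated error
    motions.append(round(motion, 2))

print(motions)  # several 0.0 steps (stuck), then one large jump
```

A healthy actuator would produce many small motions instead of nothing followed by a lurch, which matches the "small motions don't register, big ones overshoot" hypothesis.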


My bet is that it's the steering rack.
This seems like the type of issue where they should buy back the car and ship it to Fremont to be examined. It’s really bad that the system can’t detect that there is a problem.

Yeah, and that part is downright terrifying, because if the failure in this particular car can occur without being detected, then the entire system is fundamentally unsafe, i.e. the thread title is correct. Tesla needs to give this person a new car, take this one back to Fremont, and have a team of a hundred engineers spend two weeks figuring out what the heck is happening. Then their R&D team should write a hundred different unit or functional tests that each catch some aspect of this failure, to ensure that it cannot possibly happen without triggering an AP fault, and, if possible, find a way for the AP computer to compensate for the failure until repairs can be made, so that when the same failure occurs in some other car, it won't be a crisis.
 

+1 .. we all agree/understand that AP is a beta, but that's a 2-way street: Tesla should also take customer (aka tester) feedback seriously, particularly in cases like this where something really bad happened. If the SC is just coming up with excuses or blindly replacing parts "to make it go away," then Tesla has detached itself from the only source it has for a truly robust AP/FSD system: us, the actual drivers.
 
Had a "dangerous" moment yesterday when NOA tried to merge me into a lane where a vehicle (pickup truck towing a trailer) was entering the freeway (the onramp lane was coming to an end, and they were engaging a very slow, gradual merge), and I subsequently had to wrest control away.
 
Sadly, Autosteer is dumb as a brick when it comes to merging. There seems to be no logic there whatsoever, and while we're waiting for proper cut-in detection and behaviour, we get some "happy accidents" where it coincidentally works.
 
So, since I updated to V10, I've noticed my M3 (LRDM) exhibiting some weird and dangerous behavior. For starters, I've lost about 6% of my max charge miles on a two-month-old car. But my biggest safety issue is Autopilot nearly killing me and my 7-month-old child. Putting too much trust in Autopilot can make you complacent, believing it's a finished product. While using Autopilot on a ride home, my car started to bounce around in the lane as if it could not center itself. Before I could react, the car drove out of the travel lane on a 4-lane highway and nearly drove off the road at 70 miles an hour. After regaining my nerves, I was able to reproduce the incident and capture it on video. The scary part is that the car now frequently behaves like this, which leaves me unable to trust the Autopilot system.

I have taken the car to the service center, and the tech told me there's nothing wrong with the car, stating that I should assume responsibility for any incidents that occur as a result, because I acknowledged the system was in beta when I purchased the vehicle. I don't know if this is satisfactory or even legal, but clearly there's something wrong with the car. I've owned Teslas since 2015, starting with my first MS85D, and I have never experienced anything like this before. I will try and post all the videos I have, and I'll continue to post future videos until Tesla fixes the issue.



Tesla Autopilot - Google Drive


Hello, I'm in regular contact with Tesla technicians, and when it comes to working with NOA I can give you some recommendations to avoid such situations in the future. :)

NOA and AP are both Level 2 driver-assist systems, which expect you to follow these basic rules:
- A. Keep your hands on the wheel at all times (the actions of the driver override the actions of the machine 100% of the time).
- B. Read the entire manual and understand the behavior and limitations of the system (i.e. what objects it recognizes and what it ignores, and how it reacts to different lanes, signs, types of pavement, and traffic patterns).
- C. Know what sensors the car has, their range and limitations, and where they are placed.

If you haven't done A, B, and C, please take the time to look them up in the manual or online; that's CRUCIAL knowledge.

### NOA Feedback Program
NOA behavior is weighted by human feedback. When you find a consistently wrong pattern, please report it as a defect by using the voice command "Report" or by holding the vehicle icon for 3 seconds. (This sets a checkpoint timestamp, so when you go in for service you can mention that you reported it, and they will find the issue more easily.)

#### Configuration for reliability
- Confirmation on lane change: YES; alert = Sound
- Traffic avoidance: MILD
- Speed: don't drive more than 5 mph above the speed limit

#### Human behaviours
- Always merge onto and off of the highway fully manually.
- Get to the middle lane (if applicable), then turn on Autonomy with your foot on the accelerator (take it away no sooner than 1 second after switching NOA/AP on).

##### Teaching the AP 2.0 hive mind
- The steering tech looks primarily at lanes. Make sure you hold and direct the steering wheel along the path you intend; as AP wrestles with you, it learns the human's desire and adjusts over time. It's okay to break out of AP and then re-engage it; this is you telling the machine that it did so much wrong you took over. Data like this helps it understand that the vector path it chose was wrong and that it should try another one, and it tells Tesla to review your actions and the machine's actions. The data is not tied back to the driver/owner.

#### Difference between geofenced autonomy and independent autonomy
There are autonomous driving machines in at least 3 nations besides the US. What sets Tesla apart is skipping 10 years' worth of evolution and jumping straight into AI-based independent autonomy, which was not expected until after our lifetime (we're skipping lidar driving, geofencing, cloud-based 5G remote navigation, and much more). Tesla may do something with 5G in the future for remote support during Project Robotaxi, but that's Autonomy Level 5, around 2030.

Once Level 3 is out, NOA will be nearing Level 4 confidence, and you will be able to confidently be distracted, because machine warnings will be based on projection and can come as early as 30 minutes ahead. It will also have better corrective navigation maneuvers than the current "go back to your lane" solution.

#### Update preference (newer features come with greater bug risk)
This goes without saying for engineers, but for non-tech people, here's the short version: sticking to the normal update cycle rather than Advanced keeps you on the stable, non-buggy experience. Advanced is basically a stable-beta release: even with most of the rough kinks ironed out, defects are to be expected, and some can be really troublesome, like software lag, connectivity problems, or infotainment crashes; but you'll never have driving interruptions. Recently, Tesla has been handling stable-beta better than last year, and the core driving software is untouched in minor releases.

If you want to enjoy your experience, avoid testing new stuff. I know it sounds mega lame, but I prefer stability over cutting-edge, especially with kids around. Yikes.

Beta-cycle content is deployed within a 30-day window unless a major hotfix is required.


Disclaimer: I don't work for Tesla. Some of my intel, especially around numbers, may be off.
 
Clearly you have not watched the videos and are just thread surfing. If there wasn't a serious issue with my car, why would Tesla's lead engineer tell the service center to replace the computer and send them the one from my car? People need to stop making stupid, pigheaded excuses for Tesla and all the other mega companies. We as consumers need to hold them accountable for the products they release to the public.

***update***
Just to update everyone who's been following my misfortune with Tesla: I was told they replaced the car's computer, and so far, after 5 days of driving, the car has not exhibited the Autopilot issue. Hopefully that can be put to rest. Oh, by the way, someone commented that the car should have given a warning about the Autopilot issue. Well, the SC said the same thing and said the Tesla engineers have added it to future updates, so that if Autopilot misbehaves, it will give the operator a warning.
 
##### Teaching AP2.0 Hive Mind:
- Steer tech looks primarily at lanes, make sure you hold and direct the steering wheel at the path you intend. as AP wrestles with you it learns the human desire and over time adjusts

There was a lot of weird stuff in your post, but this part is complete nonsense.

AP does not change its behavior at all based on direct driver feedback in a single car.

The programming only changes behavior when Tesla pushes out an update.

The neural network and other driving logic is updated at HQ, for everyone who gets a given version of the software.

Individual cars do not learn or adapt to individual driver behavior at all.

Doing so would make troubleshooting issues horrifically difficult.


"it's okay if you break out of AP and then set it back, this is just you telling the machine that it did so much wrong you took over, data like this helps it understand that the vector path it decided is wrong and it should try another one"

No, it really doesn't.

Again, software behavior in a specific car on a given SW release never changes within that version, regardless of driver feedback in that specific car.

Behavior changes globally when a new SW version is pushed.
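The point can be illustrated with a toy sketch (hypothetical code, not Tesla's actual architecture): the parameters of a deployed driving policy are fixed at build time, so driver overrides can only be logged for later fleet-wide training at HQ; they cannot change the installed version's behavior.

```python
# Toy illustration: a deployed driving policy is frozen per software
# version. Driver overrides are only recorded for later upload; they
# cannot change the output of the installed software. All names and
# numbers here are hypothetical.

class FrozenPolicy:
    def __init__(self, version, weight):
        self.version = version
        self._weight = weight      # fixed when the release was built
        self.override_log = []     # telemetry for HQ, unused locally

    def steer(self, lane_offset):
        # Deterministic for a given version: same input, same output.
        return self._weight * lane_offset

    def record_override(self, event):
        self.override_log.append(event)  # logged, never fed back in

policy = FrozenPolicy("v10 (hypothetical)", weight=0.7)
before = policy.steer(1.0)
policy.record_override("driver wrested control")
after = policy.steer(1.0)
assert before == after  # behavior unchanged until a new version ships
```

Per-car online learning would mean two cars on the same release could behave differently, which is exactly why it would make troubleshooting horrifically difficult.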


Currently once Level 3 is out NOA will be nearing Level 4 confidence and you can confidently be distracted because machine warning will be based of projection and as early as 30 minutes away, it will also have better corrective navigation maneuvers than the current (go back to your lane) solution.

...there's a lot of words here, but I'm not sure they make much sense in that order. "machine warning will be based of projection and as early as 30 minutes away"??



L3 already allows the driver to not be paying attention: L3 specifically means that, within the operational domains certified for L3, the driver does not need to actively pay attention to driving, and the car is in fact doing 100% of the driving task.

The human might be asked to take over at some point (so you need to be awake, but not actively monitoring the road).


Beta cycle content is deployed within 30 day window unless a major hot fix is required.


Disclaimer: I don't work for Tesla. some of my intel especially around numbers can be off.

... they are.
 
Disclaimer: I don't work for Tesla. some of my intel especially around numbers can be off.

Translation: I have no idea how any of this stuff works. I pasted multiple paragraphs of nonsense without reading the thread.


Interesting that they claim only to have replaced the computer, and that this seems to have fixed the problem. Maybe this means the faulty-RAM theory (however unlikely it seemed vs. a mechanical steering failure) could actually be correct. Unfortunately, since the computer was supposedly sent back to Fremont, we'll almost certainly never find out what actually happened in this instance.