Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Auto Pilot Is Dangerous

I am sorry that being factual might be seen as giving "a hard time".

It may not happen all the time but Autopilot is known to do things that we don't expect.

I see this as just one of the imperfections that Autopilot will grow out of; I do not see any hardware malfunction.

Autosteer command was activated and the icons lit up as designed.

Whatever the automation steered, it was all displayed on the instrument cluster faithfully and flawlessly.

When autosteer steered out of the right lane marker, the instrument cluster also displayed that very clearly and flawlessly.

I don't think there's any problem with the sensors or the computer, because the car faithfully reported what it was doing and what it intended to do: move out of the current lane and try to fit into a gore point or a road shoulder.

The problem is logic.

Why would it want to move out of a perfectly good lane and try to fit in a gore point or a road shoulder?

It's a logic immaturity problem that resulted in the 2018 death of Walter Huang.


Tesla does not have the luxury Waymo has of sending its fleet out to pre-map the roadway in high resolution: every gore point, every road shoulder, every light pole, every tree...

That might explain why Autopilot wants to squeeze into a gore point or a road shoulder in this thread. Autopilot is not mature enough to make this kind of decision just yet.

Tesla's method is different from Waymo's.

Tesla does not want to hard-code its way to a perfect Autopilot; instead, it relies on owners to make corrections so the whole fleet eventually learns from the mistakes too.

And that's what Consumer Reports says:

"Jake Fisher, CR’s senior director of auto testing, says consumers are not getting fully tested, consumer-ready technology. In essence, he says, Tesla owners are being enlisted as beta testers to help fine-tune the technology for the future—even though they’re paying $6,000 up front for the promised automation.

“What consumers are really getting is the chance to participate in a kind of science experiment,” he says. “This is a work in progress.”"

You're taking a single moment out of his entire video. Yes, we've seen AP get confused at times, not really in that situation, but granted, in that single moment it could. But in his videos it never recovers, while in my experience it always does when there are clean, unambiguous lines.

He also said it's doing it a lot. I give him the benefit of the doubt that he knows its normal capabilities. And everyone else agrees this is not normal.
 
This is not the first time the lane-keeping feature has gone astray. From the NTSB preliminary report on the fatal Autopilot crash of Walter Huang:

"As the Tesla approached the paved gore area dividing the main travel lanes of US-101 from the SH-85 exit ramp, it moved to the left and entered the gore area. The Tesla continued traveling through the gore area and struck a previously damaged crash attenuator at a speed of about 71 mph."



It's repeatable because:

"Walter Huang's family tells Dan Noyes he took his Tesla to the dealer, complaining that -- on multiple occasions -- the Autopilot veered toward that same barrier -- the one his Model X hit on Friday when he died."

That's how the beta works. It's safe if a driver is in control. It can be deadly if a driver gives up control and allows it to steer out of a desirable lane.

We all know this corner case that AP struggles with. This is not the OP's case.

It's not like AP just randomly goes sour. There are cases it can't handle. It also can't handle some cases of broken or miscalibrated hardware, which is probably what is going on in the OP's situation.
 
What you are experiencing in those videos is definitely not normal. As suggested above, show those to your service center.

It seems to want to favor the right side. The car seems to know this based on the screen visual, but it’s having trouble correcting.

Is the car pulling to the right under normal driving conditions (without AS engaged)?

I agree, that's far from normal, especially on seemingly perfect roads with well-defined lane markings. No rain or fog or hills. That should not happen. If the car had an alignment issue, I think that would have been discovered on a trip for service.
 
The lamentable case of the late Mr. Huang is a good example of Autosteer gone tragically wrong. However, that specific case was with the parsing of that specific section of that specific highway, before Autosteer was intended to be able to handle such things as interchanges or exit ramps. While relevant, it is in my opinion no longer germane, and indeed has several specific differences from the problems depicted in this thread's OP:

In Mr. Huang's case, it was a repeatable issue at that specific location, one that was also repeatable in other cars at the time. It was also not terribly difficult to see what was going wrong: as other videos of that location before it was corrected in the NN have shown, the car thought it was in its lane right up to the dividing barrier.

In the OP's case, there is no gore lane, barrier, exit/entrance ramp, interchange, intersection, lane line quality issue, traffic, or any other immediately obvious cause for the behavior and, more importantly, it is happening without any correlation to a specific location, time of day, or speed of travel. Autosteer in the OP's case is clearly and obviously behaving in an other-than-normal fashion, repeatably. The MFD shows that on at least some level, the NN knows that the car is departing its lane.

The third video looks as though Autosteer is in fact not steering- the car seems to be pulling slightly to the right at all times, and being "bounced" back into its lane in a manner consistent with the Lane Departure Avoidance system. It would be interesting to see if the car would just keep veering right rather than "bouncing" with that function turned off in the name of Science (albeit carefully).

The second video is the clearest example of something being awry: you can see the angle of the steering wheel icon on the MFD indicating it thinks it's steering the car left, back into the lane, just before it gives up and throws the "Take Over Immediately" red hands alert.


I should note that for the majority of the fourth video, where the cammer says "This is dangerous," the car is indeed bouncing between the lane lines thanks to Lane Departure Avoidance only, as TACC is enabled but Autosteer is not, as shown by the grey (not blue) steering wheel icon on the MFD for most of the footage.
 
It's very unfortunate that people do not appreciate the seriousness of the nature of a beta product.

This beta product requires a licensed driver who is competent in steering, braking and accelerating as needed.

If a driver does not know how to steer, brake and accelerate then please do not use the current beta Autopilot.

Autopilot is safe as long as there's a competent licensed driver who knows how to put basic driving skills in practice.

Clearly he knows how to steer! He had to keep steering left to keep the car on the road. I don't see a need to be insulting or condescending. That is not a beta issue. There is something wrong there. If not, it's a total waste of money purchasing the FSD feature. I'm very concerned Tesla service didn't do more here.
 
...There is something wrong there...

Wrong is the nature of beta.

It's just like wet is the nature of rain.

Suppose I go into a house that is not finished, close the door, and find myself soaking wet inside the unfinished house.

Something must be really wrong, so I look up: there's no roof! The rain just falls down hard on me. A house without a roof can soak me during a storm.

And that's what a beta Autopilot is like. The feature is neither perfected nor proven by any third-party tester.
 
Wrong is the nature of beta.

It's just like wet is the nature of rain.

Suppose I go into a house that is not finished, close the door, and find myself soaking wet inside the unfinished house.

Something must be really wrong, so I look up: there's no roof! The rain just falls down hard on me. A house without a roof can soak me during a storm.

And that's what a beta Autopilot is like. The feature is neither perfected nor proven by any third-party tester.

You're not getting it. Beta doesn't mean an absolute waste of money. Your analogies aside, if all Model 3s behaved like that, Tesla would quickly lose all FSD upgrade sales and the large dollar amount that goes with them. You seem to be the only one who won't admit there is more wrong with that guy's car than software tuning!
 
We all know this corner case that AP struggles with. This is not the OP's case.

It's not like AP just randomly goes sour. There are cases it can't handle. It also can't handle some cases of broken or miscalibrated hardware, which is probably what is going on in the OP's situation.
If this were the case, I'm sure the SC would have figured that out in the two days they had the car. It looks like Tesla doesn't want to admit something more nefarious is going on.
 
Wrong is the nature of beta.

It's just like wet is the nature of rain.

Suppose I go into a house that is not finished, close the door, and find myself soaking wet inside the unfinished house.

Something must be really wrong, so I look up: there's no roof! The rain just falls down hard on me. A house without a roof can soak me during a storm.

And that's what a beta Autopilot is like. The feature is neither perfected nor proven by any third-party tester.
My dear friend, you couldn't be more wrong. To keep with your analogy: beta would be more like buying that house before the ceiling fans are installed or before the newly seeded lawn is fully grown. Who would sell or buy a house without a roof? How long do you think a company would last if it sold houses without roofs? Other than in some third-world country, what government would give a company the green light to sell houses without roofs? Your logic is very flawed; just because you love a product does not mean you should render your logic and intellect irrelevant. I love the product, which is why I'm on my second Tesla and currently waiting for my wife's M3 to be delivered. Love, if anything, should not mean blind loyalty.
 
which leaves me unable to trust the autopilot system.

IT IS A BETA FEATURE, AND SHOULD NOT, UNDER ANY CIRCUMSTANCES BE TRUSTED. EVER.

I'm very sorry for coming off so harshly, but this circumstance warrants it, as we are talking about a possible life or death situation. But honest to God, you should NEVER trust it. Ever.

IMO, the only safe way to use autopilot is to "ghost drive" it... Actively drive the car the entire time the autopilot is on. As long as it's performing adequately, your inputs should be enough to keep the nags away, yet not hard enough to disengage the autopilot. And driving in this manner makes it a natural occurrence for you to increase your "ghost driving" inputs hard enough to disengage the autopilot should it start behaving poorly.

But do not, ever, ever ever ever, reduce your driving awareness below the level you would use to drive the car without the autopilot on. No texting, no taking your eyes off the road, no taking your hands off the steering wheel. Drive the car, but allow the system to ASSIST you in the process, as that is what it currently IS: an assistance device, and nothing more.

This is a serious case of needing to read the manual, and following the manual to the letter. Your life (and the lives of others) depends upon you using the system within the bounds of its current state of development.

I always try to not come off as a guy standing on a soap box preaching, or as some sort of Social Justice Warrior, but there are times when we (myself included) need a "wake up call," and this is one of them.

Drive the car, and allow the autopilot system to assist you in the process.

Eventually, Tesla (and others) will get autopilot to the point where you can just sit in the car and watch a movie while it safely takes you to your destination. We are NOT there yet. YOU MUST MAINTAIN POSITIVE CONTROL OF THE VEHICLE AT ALL TIMES.
 
IT IS A BETA FEATURE, AND SHOULD NOT, UNDER ANY CIRCUMSTANCES BE TRUSTED. EVER.

I'm very sorry for coming off so harshly, but this circumstance warrants it, as we are talking about a possible life or death situation. But honest to God, you should NEVER trust it. Ever.

IMO, the only safe way to use autopilot is to "ghost drive" it... Actively drive the car the entire time the autopilot is on. As long as it's performing adequately, your inputs should be enough to keep the nags away, yet not hard enough to disengage the autopilot. And driving in this manner makes it a natural occurrence for you to increase your "ghost driving" inputs hard enough to disengage the autopilot should it start behaving poorly.

But do not, ever, ever ever ever, reduce your driving awareness below the level you would use to drive the car without the autopilot on. No texting, no taking your eyes off the road, no taking your hands off the steering wheel. Drive the car, but allow the system to ASSIST you in the process, as that is what it currently IS: an assistance device, and nothing more.

This is a serious case of needing to read the manual, and following the manual to the letter. Your life (and the lives of others) depends upon you using the system within the bounds of its current state of development.

I always try to not come off as a guy standing on a soap box preaching, or as some sort of Social Justice Warrior, but there are times when we (myself included) need a "wake up call," and this is one of them.

Drive the car, and allow the autopilot system to assist you in the process.

Eventually, Tesla (and others) will get autopilot to the point where you can just sit in the car and watch a movie while it safely takes you to your destination. We are NOT there yet. YOU MUST MAINTAIN POSITIVE CONTROL OF THE VEHICLE AT ALL TIMES.
I'm not sure if you watched the videos. The Autopilot is unusable. With both hands on the steering wheel, the autopilot always disengages because it tries so hard to drive off the road or into another lane. I doubt that's the way Tesla made it to work, beta or not.
 
My dear friend, you couldn't be more wrong. To keep with your analogy: beta would be more like buying that house before the ceiling fans are installed or before the newly seeded lawn is fully grown. Who would sell or buy a house without a roof? How long do you think a company would last if it sold houses without roofs? Other than in some third-world country, what government would give a company the green light to sell houses without roofs? Your logic is very flawed; just because you love a product does not mean you should render your logic and intellect irrelevant. I love the product, which is why I'm on my second Tesla and currently waiting for my wife's M3 to be delivered. Love, if anything, should not mean blind loyalty.

I can't like this enough. Beautifully said. Tam is obviously a VERY dedicated Tesla enthusiast, to the point where they can't see any of its current flaws. It's basically like trying to have a conversation with a rock.
 
I'm not sure if you watched the videos. The Autopilot is unusable. With both hands on the steering wheel, the autopilot always disengages because it tries so hard to drive off the road or into another lane. I doubt that's the way Tesla made it to work, beta or not.


I think the comments earlier in the thread, noting how the little steering wheel graphic doesn't seem to match up with what the car is actually doing, may provide a clue to help get the service center to look in a new direction.

Is your steering wheel aligned (this has been a problem in the past)? If the little steering graphic often doesn't reflect what your actual steering wheel looks like, it might be a problem with those sensors.
 
I can't like this enough. Beautifully said. Tam is obviously a VERY dedicated Tesla enthusiast, to the point where they can't see any of its current flaws. It's basically like trying to have a conversation with a rock.

I have been expecting the kinds of Autopilot flaws and imperfections videotaped in this thread.

Current flaws that result in collisions, injuries, and deaths are undesirable, but they are the very nature of beta mode.

I too have experienced plenty of sudden swerving without any warning, with cars beside me and a steep cliff on my side. Not as often as in this thread, but often enough to remind me this is still beta.

As a matter of fact, even today, most of my passengers do not feel comfortable riding with Autopilot. They prefer a driver who drives conventionally and does not use Autopilot.

Yes, my passengers realize that Autopilot is still in beta mode even without me saying so.

As to the flaws in this thread, maybe no one has written hard-coded fixes for these stretches of road yet.

There are many locations where I experienced sudden swerving repeatedly for over a year; after that, it stopped doing that at those locations.


Elon said that Tesla could game the system by writing code for the route from LAX to NYC, but that would not benefit the rest of the general users.

Instead, Tesla lets owners make lots of driving corrections so the machine can learn the right way, rather than Tesla manually writing the program for particular stretches of road.

Despite all of the flaws (and I do say that Autopilot has flaws, as that is the nature of beta), I still feel I am much safer with it than without.

So, there! I admit it: flaws are the nature of beta, and I manage those flaws by using Autopilot according to its instructions.
 
IT IS A BETA FEATURE, AND SHOULD NOT, UNDER ANY CIRCUMSTANCES BE TRUSTED. EVER.

YOU MUST MAINTAIN POSITIVE CONTROL OF THE VEHICLE AT ALL TIMES.

That is the inherent contradiction of using Autopilot. It's not unlike air traffic control terminals, which work very reliably... but not so reliably that the FAA doesn't want controllers familiar with the backup methods. So once a month they require controllers to use the backup system (moving little pieces on a board). Tesla's Autopilot can be more reliable than that once-a-month FAA requirement implies, so when the thing starts misbehaving it can be tricky to recognize at first. I've had the car start to veer off the road before, but I've always been on top of it. However, it is very unsettling, and I am not comfortable for a while afterward. This is the sort of situation that definitely dampens enthusiasm for the car.

There are a handful of locations I regularly drive where the car does not handle the markings well. I don't think there is any problem with the markings; it is just the car being unhappy with the situation. Some were problems early on and eventually improved. Others have not improved, and there are some new locations. I like Autopilot, but I don't trust it further than I can throw it. But one of these days it may just lull me into a comfort zone and I'll forget that I don't trust it. Let's face it: people driving cars are not airline pilots and aren't able to pay attention like a pilot does. So Autopilot shouldn't have too hard a time being a better driver than we are. We'll see if it can rise above even that low bar.
 
I have made an appointment with another SC mobile service. The next available date is Nov 18th. If that doesn't go well, I'll make the long drive to Dedham, Mass.

I don't recall your mentioning where you took the car for service initially. In case you weren't aware, one opened in Warwick, RI recently. That's probably a bit closer than the Dedham, MA store. Of course, if you took yours to Warwick for the initial diagnostic, then going to Dedham (or maybe Mt. Kisco, NY) would be better. See Tesla's store list for phone numbers and addresses.

Also, you haven't mentioned what version of the Autopilot hardware your car has. Most of those sold in the last five or six months have the HW3 computer, but older ones have the HW2.5 computer. The two are significantly different, and it's possible that a flaw could affect one but not the other.

This is not the first time the lane-keeping feature has gone astray.

You're presenting straw-man arguments. Nobody here has argued that Autopilot's LKA feature is perfect. It does sometimes make mistakes, and as you and others have said, Autopilot is not an excuse to ignore the road or take one's hands off the wheel. Your criticisms of @Toppatop55's driving may be valid, but it's unclear from these short clips precisely how he's actually driving. It is clear from these clips how Autopilot is working, though....

The problem is that @Toppatop55 has documented his car behaving far less reliably than is typical of Autopilot. My own Model 3 has almost 8,000 miles on its odometer, and it's never done what @Toppatop55 has shown his car doing multiple times in a period of just over two weeks. Beta feature or not, that frequency of misbehavior is abnormal, even if it's well known that Autopilot sometimes misbehaves in similar ways. Several possible causes occur to me:
  • @Toppatop55's hardware (cameras, computer, etc.) may be faulty.
  • Autopilot software may be misconfigured for @Toppatop55's car. Perhaps it has HW 2.5 software on HW 3 hardware or vice-versa (if that would work at all; I suspect not, but who knows). Perhaps a software update or other problem left the software in a bogus state. Perhaps the cameras need to be recalibrated.
  • There may be a flaw in Autopilot's neural network that happens to manifest more strongly on the roads that @Toppatop55 drives than on other roads.
  • Something else quirky about the driving conditions (lighting, traffic around the car, etc.) may have caused an Autopilot neural network flaw to manifest more for @Toppatop55 than would be expected.
  • @Toppatop55 may have been extremely unlucky and happened to run into an issue that affects other people, but with unusual frequency.
In any of these cases (except maybe the last one), Tesla needs to be made aware of the problem. That's what beta testing is for, after all: alerting developers to problems with their software. Maybe Tesla already knows about the issue, because Teslas "phone home" with driving data. OTOH, maybe these serious Autopilot errors have gone unnoticed by engineers. If the problem is with @Toppatop55's hardware, or certain software problems, then Tesla should be able to fix it; but so far, he's gotten the runaround from Tesla.