Scary Experience with AP2.0 Autosteer on the Highway

I doubt you can label this as a "malfunction," because Tesla would say the system is a growing toddler that still needs to learn, that's all... Until subsequent firmware updates, I guess we just have to learn which scenarios it handles well and which it doesn't.
There are so many things wrong with your statement I don't know where to begin. First, why on earth does Tesla put a toddler in control of a 5,000 lb car? For god's sake, you can't get a driver's license until the age of 18 in most states but you're saying it's okay to put a toddler in control. Furthermore, it's up to the driver to learn what the toddler can or cannot do?

Is this a joke? I don't understand your viewpoint at all. This is a death wish. If I were the OP I would immediately report this incident to NHTSA. I think a driver knows if he or she is overcompensating or jerking the wheel around. There is no mistaking whether it's you or the machine.

There is so much fanaticism and "blame the victim" mentality. I used to enjoy participating in these forums, but the blind defense of indefensible actions by Tesla is alarming to say the least. I hope GM's Super Cruise wipes the floor with Tesla's Autopilot, because Tesla needs to be taught a cold, hard lesson that it cannot play with the lives of its customers, whether or not they agreed to use "beta" software.

Tesla doesn't seem to live in reality, but instead acts like we are all inside the Silicon Valley bubble where this kind of asinine behavior is acceptable.
I think it's similar to the Pennsylvania Turnpike case, where the driver blamed AP1 for swerving into the guard rails, but the logs showed that Autopilot was automatically disengaged when the driver took over the steering wheel, and the driver then over-corrected the steering, which landed the car on its side.
There you go again, blaming the victim. Maybe Tesla should change the amount of tension required to disengage Autopilot. That would be the prudent response. Yet around here, with some people (probably TSLA shareholders), it's always the driver's fault.
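For context on what "changing the tension" would mean in practice: takeover detection in autosteer systems is typically a comparison of the driver's steering torque against a threshold, and the suggestion above amounts to tuning that threshold. The sketch below is purely illustrative; the value, units, and function names are hypothetical and are not Tesla's actual logic.

```python
# Hypothetical sketch of torque-based takeover detection.
# The threshold and values are illustrative, not Tesla's real parameters.

OVERRIDE_TORQUE_NM = 2.5  # raising this would require a firmer, more deliberate tug


def should_disengage_autosteer(driver_torque_nm: float) -> bool:
    """Disengage only when the driver's steering input clearly exceeds the threshold."""
    return abs(driver_torque_nm) > OVERRIDE_TORQUE_NM


print(should_disengage_autosteer(0.8))  # False: a light bump stays engaged
print(should_disengage_autosteer(3.4))  # True: a deliberate correction takes over
```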
GM's approach with Super Cruise seems to address issues like this. GM geo-mapped the 160,000 miles of highway where the system will work. They also tested every one of those 160,000 miles with a Super Cruise-equipped car to validate the system before putting it on sale.
Bravo to GM! Tesla throws out beta software that can steer you right into a semi, yet somehow Tesla has no responsibility whatsoever? Tesla's ToddlerPilot system controls your vehicle, yet Tesla legally has no responsibility because it tells you that as the driver you must be in control at all times even when, clearly, you are not?

I think it's a pretty sure bet that GM's geo-mapping, which restricts their system to only work in areas it was designed for, will be more effective than Tesla's system. AP relies on the driver to know where the system will or will not work properly. That hasn't ended well on many occasions.

In my opinion, a system that automatically restricts its use in areas it was not designed for is a better design than relying on a typical driver to make those decisions.
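To make the geo-fencing idea concrete, here is a minimal sketch of an engagement check against a whitelist of pre-validated road segments. The map data, tolerance, and function names are invented for illustration and are not GM's actual Super Cruise implementation.

```python
from math import hypot

# Hypothetical whitelist of validated highway segments, each a list of
# (x_km, y_km) waypoints in a local map frame. Real systems use detailed
# survey maps; these points are invented for illustration.
VALIDATED_SEGMENTS = [
    [(0.0, 0.0), (1.0, 0.1), (2.0, 0.2)],
    [(5.0, 3.0), (6.0, 3.1)],
]

MAX_OFFSET_KM = 0.05  # hypothetical tolerance for "on the mapped road"


def on_validated_road(x_km: float, y_km: float) -> bool:
    """Allow engagement only near a waypoint of a validated segment."""
    return any(
        hypot(x_km - wx, y_km - wy) <= MAX_OFFSET_KM
        for segment in VALIDATED_SEGMENTS
        for (wx, wy) in segment
    )


def may_engage(x_km: float, y_km: float) -> bool:
    # The system, not the driver, decides whether this stretch is supported.
    return on_validated_road(x_km, y_km)


print(may_engage(1.0, 0.12))   # True: on a mapped segment
print(may_engage(10.0, 10.0))  # False: unmapped, so the system refuses to engage
```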
I completely agree. GM also has a driver-facing camera that can determine whether the driver is paying attention or is incapacitated. If incapacitated, the vehicle will safely pull over to the side of the road. If the system determines that the driver is not paying attention and is doing something else, for example texting or reading, then it will also act appropriately (a rough sketch of that kind of escalation is below).

Of course Tesla's system does none of this. For supposedly the highest-tech car in the industry, it's decidedly low-tech and deficient in such basics. Tesla would rather roll the dice and risk the lives of its customers than slow down, do things right, and implement the correct checks and balances. One day this is all going to backfire in a big way. It's only a matter of time.
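As a rough illustration of the escalation described in the post above: a driver-monitoring system typically moves from warnings to a safe stop as inattention persists. The states, timings, and actions below are assumptions made up for the sketch, not GM's (or anyone's) actual implementation.

```python
from enum import Enum, auto


class Action(Enum):
    NONE = auto()
    VISUAL_ALERT = auto()
    AUDIBLE_ALERT = auto()
    SLOW_AND_PULL_OVER = auto()


# Hypothetical escalation thresholds, in seconds of continuous inattention.
VISUAL_AFTER_S = 4.0
AUDIBLE_AFTER_S = 8.0
STOP_AFTER_S = 15.0


def escalate(seconds_inattentive: float) -> Action:
    """Map how long the camera has seen the driver looking away to a response."""
    if seconds_inattentive >= STOP_AFTER_S:
        return Action.SLOW_AND_PULL_OVER  # treat as possibly incapacitated
    if seconds_inattentive >= AUDIBLE_AFTER_S:
        return Action.AUDIBLE_ALERT
    if seconds_inattentive >= VISUAL_AFTER_S:
        return Action.VISUAL_ALERT
    return Action.NONE


print(escalate(2.0))   # Action.NONE
print(escalate(20.0))  # Action.SLOW_AND_PULL_OVER
```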
 
...There are so many things wrong with your statement...


Just like anything else, including medicines advertised on TV with all kinds of scary side effects, they are safe as long as you know how to take them and avoid those side effects.

I agree with you that it can be dangerous to release an unfinished product, as evidenced by the fatal AP1 Florida accident.

However, in responsible hands, AP2 is safe.

If an owner is unable to perform a basic driving skill, such as manually reducing the system's set speed to the posted limit on a curve, then the owner should not use the system.
 
...NHTSA...

Keef Wivaneff submitted it!


[attached image]