
How long until an Autopilot accident is reported, and what is the potential public backlash?

Autopilot is incomplete

Posting this to an enthusiast forum is probably not a good idea, but I will do it anyway.

It is not a good idea to release beta software for a car's steering and control system, and it is even worse to call alpha software beta. This post is based only on videos seen on the internet.

Here is one video titled "Autopilot saves the day":
Autopilot saves the day

The reason for the near crash is that Autopilot was speeding while the lane next to it was almost at a standstill. No human, or very few humans, would have driven at that speed in the dark and wet conditions. There are other videos too where Autopilot is doing 65 mph in one motorway lane while the two other lanes are at a standstill. It is like a crazy driver, and the "co-pilot", a.k.a. the driver, trusts Autopilot to do its job. Autopilot should be conservative, not drive like a maniac. One lane change at the wrong time and there is nothing even an autopilot with quick reactions can do to stop the collision.

The other video, where the driver is using Autopilot on a winding road with very good markings, is another good example of a bad autopilot system. Autopilot seems to get confused by the bright sunshine and the shadows cast by trees along the roadside. This should be an easy task for any autopilot working properly, but Tesla Autopilot v7 is still failing. You know, many roads do have curves in them. Don't launch an autopilot that can't stay on the road if it is not straight.
Tesla Autopilot tried to kill me! - YouTube

Autopilot should be 99.5% complete before calling it beta. To call something beta software when barely 50% of the use cases are complete is a lie. To release incomplete software that controls steering and other functions in a car at this stage is a situation where success is not one of the possible outcomes, as Elon would say. It is a very different thing to release software for a computer that is responsible for showing different colours on a monitor in various orders, in the hope that not too many pixels end up in the wrong order.

Think of a car as a loaded weapon. The purpose of the autopilot software is to keep that loaded weapon from going into the "fired" state. Right now the car fires here and there in the corner cases; the good thing is that nobody has been in the way yet. Someday somebody will be in the way, and it is going to be ugly.

Please withdraw Autopilot from production as soon as possible. It is far from ready and will cause trouble. When Autopilot is a better driver than a human, you can release it. Right now it is much, much worse.

For Elon: you cannot compare an aviation autopilot with a car autopilot; the environments are too different.
 
You do realize that AutoPilot != Autonomous driving, right?

- - - Updated - - -

Posting this to an enthusiast forum is probably not a good idea, but I will do it anyway.
Plenty of people here will pull Tesla apart over its known problems.

This post is based only on videos seen on the internet.
So you've never actually driven an autopilot car, right?

The reason for the near crash is that Autopilot was speeding while the lane next to it was almost at a standstill.
He's going less than 45 mph; the user sets the speed, not Autopilot, and he's going UNDER the speed limit.

No human, or very few humans, would have driven at that speed in the dark and wet conditions.
Again, you realize that the human sets the speed, right?

There are other videos too where Autopilot is doing 65 mph in one motorway lane while the two other lanes are at a standstill. It is like a crazy driver, and the "co-pilot", a.k.a. the driver, trusts Autopilot to do its job. Autopilot should be conservative, not drive like a maniac. One lane change at the wrong time and there is nothing even an autopilot with quick reactions can do to stop the collision.
Same could be said of a human driver. One wrong lane change, and the person is dead. What you're advocating is nanny controls telling the person when he can and can't use autopilot, and I completely disagree with that. It should be up to the driver.

The other video, where the driver is using Autopilot on a winding road with very good markings, is another good example of a bad autopilot system.
It's a bad autopilot system because the driver is using it on a road where it clearly states it should not be used?
The road should be a highway, with clear lane markings and a center divider, and your hands MUST be on the wheel at all times.
The driver was holding a cell phone and not paying attention.

Autopilot should be 99.5% complete before calling it beta.
I'd say on a long trip, I can easily use it 95% of the time, so it's definitely Beta or even better than Beta. It's not autonomous, and it never will be.

Please withdraw Autopilot from production as soon as possible.
lol, I don't expect they'd listen to a non-owner with no experience as to how to conduct their business.

It is far from ready and will cause trouble.
It "will" cause trouble? So far it "saved the day" when used properly, and didn't kill anyone when used NOT properly. I'd call that a win-win.

When Autopilot is a better driver than a human, you can release it.
And this is where your confusion lies. Autopilot is Level 2 autonomous. The driver is FULLY responsible for the car. It's an aid, not a driver replacement.
Your tone implies that you think it's a Level 3 highway semi-autonomous system. It's not. With the current hardware, it will never be. AP will get better, but it will never be autonomous.
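
For reference, the SAE J3016 levels being invoked here break down roughly as in this minimal Python sketch (the enum and the helper function are my own illustration, not anything from Tesla's software):

```python
from enum import IntEnum

class SaeLevel(IntEnum):
    """SAE J3016 driving-automation levels, paraphrased."""
    NO_AUTOMATION = 0           # human does all the driving
    DRIVER_ASSISTANCE = 1       # steering OR speed assisted, not both
    PARTIAL_AUTOMATION = 2      # steering AND speed assisted; driver supervises
    CONDITIONAL_AUTOMATION = 3  # system drives in-domain; driver takes over on request
    HIGH_AUTOMATION = 4         # no driver needed within the design domain
    FULL_AUTOMATION = 5         # no driver needed anywhere

def driver_fully_responsible(level: SaeLevel) -> bool:
    """At Level 2 and below, the human must supervise at all times."""
    return level <= SaeLevel.PARTIAL_AUTOMATION

# Autopilot sits at Level 2: an aid, not a driver replacement.
assert driver_fully_responsible(SaeLevel.PARTIAL_AUTOMATION)
```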

Once you know this, does this change your perspective as to what AP is and how it performs?
 
You should educate yourself on how it works before making requests based on false beliefs and information. For one, Tesla Autopilot does not determine the speed; the driver does. It is driver-assistance technology with limitations, and Tesla states those limitations explicitly. The car tells you to keep your hands on the wheel, and has reminders for curves and other more challenging situations. The driver makes corrections to Autopilot at times. It is not yet intended for use on highways without a center divide. You could help the world much more by focusing on real and far more common problems.
 
Autopilot should be 99.5% complete before calling it beta. To call something beta software when barely 50% of the use cases are complete is a lie.

I think you need to understand what "use cases" means. When testing software, "use cases" are scenarios set up to test the software to make sure it behaves as intended.

The current autopilot system is designed for highway driving, in good conditions, with clear lane markings and with the user holding the wheel and paying attention. It is designed to alert the driver when it cannot determine the safe action to take.

For every test I have done within the limits of that design, it has performed nearly perfectly.
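
To make the "use cases" point concrete, a use-case test for a lane-keeping aid might look like the toy pytest-style sketch below. The `steering_command` function, its confidence threshold, and the scenarios are all hypothetical, not Tesla's actual code:

```python
# Toy stand-in for a lane keeper: steer only when lane markings are
# visible and confidence is high; otherwise hand back control and alert.
def steering_command(markings_visible: bool, confidence: float):
    if not markings_visible or confidence < 0.9:
        return None   # disengage and alert the driver
    return 0.0        # hold the lane (no correction needed on a straight road)

def test_clear_highway_markings():
    # In-domain use case: clear markings, high confidence -> system steers.
    assert steering_command(markings_visible=True, confidence=0.98) == 0.0

def test_missing_markings_alert():
    # Out-of-domain use case: no markings -> system must bail out, not guess.
    assert steering_command(markings_visible=False, confidence=0.98) is None
```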
 
And lest we forget, there are many places in the US where HOV or other dedicated lanes are free-flowing at or near full highway speed while the adjacent lanes are gridlocked. In those situations, it is not uncommon to have a significant speed differential between adjacent lanes.

Safe or not, it is a fact of life.
 
I have a question for the poster who claims that Autopilot should do all the thinking:

Why don't you go argue that traditional cruise control should be banned? What if a human sets it at 90 mph in a 30 mph zone? What does cruise control do when it encounters stopped traffic? It ploughs right in!
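
The difference fits in one line of Python; a toy sketch (the function names and behavior are my own simplification):

```python
def conventional_cruise(set_speed: float, lead_car_speed: float) -> float:
    """Plain cruise control: holds the set speed regardless of traffic."""
    return set_speed                       # ploughs right into stopped traffic

def traffic_aware_cruise(set_speed: float, lead_car_speed: float) -> float:
    """Adaptive cruise: never outruns the car ahead."""
    return min(set_speed, lead_car_speed)  # slows behind stopped traffic

print(conventional_cruise(65, 0))    # 65 mph -> collision course
print(traffic_aware_cruise(65, 0))   # 0 mph  -> stops with the traffic
```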

How come I have not seen anyone bashing traditional cruise control?

If Tesla's AP is beta, what do you call Mercedes' system? In the software world, that would be a unit test.
 
Yes, cruise control set to a high speed will get the driver into trouble; most people know that. It just controls the speed. So why do the few drivers I have seen engage Tesla Autopilot trust the car to drive better, or take more risk, than they normally would? I don't have an exact answer at this time, but it probably involves human psychology and the complexity of Autopilot's functions. RTFM is a common term that comes up.

From what I have heard about the Mercedes autopilot, nobody trusts it. That is a good thing as far as safety is concerned.

My point is probably this: please don't spoil a good thing with unnecessary risks. The electric Tesla cars will do just fine without Autopilot for years to come. When it is ready, bring it on; not before.
 
Yes, cruise control set to a high speed will get the driver into trouble; most people know that. It just controls the speed. So why do the few drivers I have seen engage Tesla Autopilot trust the car to drive better, or take more risk, than they normally would? I don't have an exact answer at this time, but it probably involves human psychology and the complexity of Autopilot's functions. RTFM is a common term that comes up.

From what I have heard about the Mercedes autopilot, nobody trusts it. That is a good thing as far as safety is concerned.

My point is probably this: please don't spoil a good thing with unnecessary risks. The electric Tesla cars will do just fine without Autopilot for years to come. When it is ready, bring it on; not before.
Pound sand, tolppa. I love using AP. And it will only get smarter the more we use it. I know change can be scary but this ****'s legit.
 
Please withdraw Autopilot from production as soon as possible. It is far from ready and will cause trouble. When Autopilot is a better driver than a human, you can release it. Right now it is much, much worse.

Please provide links to your posts in the Mercedes, Audi, Infiniti, and Volvo forums where you also call for the withdrawal of their lane-holding systems. You do realize they offer the same feature, right? Just less accurate, since they don't do fleet learning.

Not complaining there suggests either that you are severely misinformed or that you have ulterior motives.
 
I've seen a number of reports on the Google self-driving cars saying they have been involved in several crashes, but that all of them were the fault of the other vehicle.

Hopefully we'll see the same treatment of Tesla.

BTW, I do know of one very minor event locally. Someone was trying to use AP to maneuver through a coned work zone, and the car ran into one of those orange barrels. I don't know about anyone else, but any time the situation gets at all sketchy, I shut it off immediately.
 
This post is based only on videos seen on the internet.
...
The reason for the near crash is that Autopilot was speeding while the lane next to it was almost at a standstill. No human, or very few humans, would have driven at that speed in the dark and wet conditions.

Give me a break! Where do you get off asking Tesla to retract a signature feature, and talking to everybody like children, when by your own admission you have zero experience with the system?

As for that video, yes, the reason for the near crash is excessive speed, which the driver in that instance said was completely appropriate and was the speed he would have driven manually. I think he's completely wrong that the speed was appropriate, but the point is that Autopilot was not responsible for the excessive speed! Without Autopilot, the exact same thing would have happened, except it probably would have ended with bent metal. The fault here is 100% with the driver, who thinks it's appropriate to drive 45 mph next to a line of stopped cars, and is completely unrelated to Autopilot.
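
To put rough numbers on why 45 mph alongside stopped traffic leaves no margin, here is a back-of-the-envelope Python calculation (the 30 m cut-in gap and the 1.5 s reaction time are assumed illustrative figures, not data from the video):

```python
MPH_TO_MS = 0.44704              # miles per hour -> metres per second

speed = 45 * MPH_TO_MS           # ~20.1 m/s
reaction_time = 1.5              # s, typical perception-reaction time
gap = 30.0                       # m, assumed distance at which a car cuts in

time_to_impact = gap / speed               # ~1.5 s to cover the gap
reaction_distance = speed * reaction_time  # ~30 m travelled before braking starts

print(f"time to impact:    {time_to_impact:.1f} s")
print(f"reaction distance: {reaction_distance:.0f} m")
# The gap is gone before an average driver even touches the brake.
```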

If you want to be taken seriously then go out and get some actual experience with the system you're criticizing. Otherwise you're just blowing smoke.
 
What happens if someone presses the accelerator instead of the brakes? It is highly irresponsible to put the two pedals so close to each other; that will confuse a lot of people.

I think we should ban cars with accelerator pedals.