My friend's Model X crashed using AP yesterday

Speed limit is only the maximum speed allowed - it's the driver's responsibility to adjust speed according to conditions and the driver's abilities. If AP is driving too fast, you are supposed to take over - no ifs about it.

Absolutely. If blame is to be assigned, the driver should receive the most.

Playing the blame game, though, is superficial. Let's ask the five whys. What is the real root cause here? Likely it's that some people are just not capable of chaperoning Autopilot on roads it isn't intended for. If that's indeed the root cause, what is the root fix? Improve Autopilot? Make all drivers sit through a training class? Or automatically turn off Autopilot in cases where it's known to perform worse than a human driver? Of those three, the latter is the most reliable and quickest to implement.
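
Purely as an illustration of that third option, here is a minimal sketch of what such an automatic gate might look like. Every name, attribute, and threshold below is a hypothetical assumption for the sake of argument, not anything Tesla has published or implemented.

```python
# Hypothetical sketch: gate Autopilot availability on road attributes.
# All names, attributes, and thresholds are illustrative assumptions,
# not Tesla's actual implementation.

from dataclasses import dataclass

@dataclass
class RoadContext:
    is_divided_highway: bool
    lane_marking_quality: float  # 0.0 (no markings) to 1.0 (crisp, fresh paint)
    speed_limit_mph: int
    has_shoulder: bool

def autopilot_allowed(road: RoadContext) -> bool:
    """Allow driver-assist only where it is assumed to outperform an
    attentive human; everywhere else, keep the human in full control."""
    if road.is_divided_highway:
        return True
    # On undivided roads, require good markings, a shoulder, and modest speed.
    return (road.lane_marking_quality >= 0.8
            and road.has_shoulder
            and road.speed_limit_mph <= 45)

# Example: a winding two-lane canyon road with no shoulder would be refused.
canyon = RoadContext(is_divided_highway=False, lane_marking_quality=0.6,
                     speed_limit_mph=60, has_shoulder=False)
print(autopilot_allowed(canyon))  # False
```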

(Also, let's keep in mind that in this crash Tesla has not even confirmed whether AP was on.)

Tesla is very slow to confirm bad press and very quick to announce good press. If we don't hear anything soon, I'm going to assume Autopilot was on.
 
The trouble with trying to make things foolproof is that there are a lot of very ingenious fools.

The problem with foolproofing anything is getting around the foolishness of idiots.
Getting a device to follow rules is amazingly straightforward and simple. Variables are what immensely complicate programming.

There is no end to what a foolish idiot can come up with when driving. You can't fit enough servers in a Tesla to hold the disk space for every possible action of a single foolish idiot driver, let alone a gaggle of them driving together.
 
Reactions: Pdub2015 (Funny)
At what point does paying close attention to make sure the a/p is driving safely become more effort than simply driving the damned thing yourself?
Robin
Answer - Before you turn it on. Until - You turn it off.

Full automation will never come to fruition because the human brain can't be adequately replicated except through procreation. There are many who may disagree with what I'm about to say....however it's an inescapable reality. Emotions are needed when driving in certain situations:
1. Feelings are needed when driving
2. Fear is needed when driving
3. Changes in heart rate are needed when driving.
4. In some cases....music is needed
5. You get my point.

Emotion is not programmable because there are too many variables involved in its existence.

Let me just go on out and say it. Nothing can fully substitute for the human body in a situation where self preservation is at risk.

Although fantastic....AP will never be a substitute.
 
Reactions: WhiteCap (Disagree)
Actually it works bloody well on well-marked two-lane rural roads
I've never seen a well-marked rural road around here. :) They're all poorly marked. And much worse in the winter! You are lucky to have *any* well-marked rural roads.

with traffic. You must pay attention to intersections
Those are every 500 feet or so, right? Driveways everywhere? So basically you have to take control whenever you see anyone in a driveway or cross street. OK.

and take over as needed, and it's speed limited, but it does very well indeed.
I guess I can see how lane-keeping could work on roads with very good lane striping. And it would free you to pay more attention to all the cross streets and driveways and deer and chickens and children wandering near the roadside, so that you could more readily hit the brakes.

Maybe lane-keeping should be allowed only on roads with clear center lines and edge lines (there are a lot of roads without, and it blatantly doesn't work on them). Unfortunately it apparently doesn't even work on all of the roads with good lines -- it occasionally misses sharp curves and seems to have trouble with hills. That means it's not ready for use on the wild and wooly rural road environment.

Auto braking clearly doesn't really work as implemented, given that it can't accurately spot *stationary objects in the road*, so it simply isn't ready for prime time either. I wish it were. Unfortunately it doesn't even seem to be implemented in a safety-conscious manner, as I noted before; if it's designed to brake slowly and allow my car to rear-end someone else I want nothing to do with it; I would rather be rear-ended and not rear-end anyone else since then I don't pay anything in liability.

Again, all the other auto manufacturers appear to be just as bad. I'm not singling Tesla out here.
 
More information here: Tesla Model X goes off the road and crashes in Montana, driver blames the Autopilot

This should be required reading for all new owners ... :cool:

[Attached image: upload_2016-7-12_10-4-49.png]
 
Full automation will never come to fruition because the human brain can't be adequately replicated except through procreation. There are many who may disagree with what I'm about to say....however it's an inescapable reality. Emotions are needed when driving in certain situations:
1. Feelings are needed when driving
2. Fear is needed when driving
3. Changes in heart rate are needed when driving.
4. In some cases....music is needed
5. You get my point.
I completely disagree with your points. Fully autonomous Level 4 driving vehicles already exist, built by Google, and they are proven to be significantly safer than cars driven by humans full of human emotions. In just a few years first Tesla, and then other manufacturers, will have fully autonomous vehicles as well. They will be available for sale. They will be legal.

You are welcome to ignore what Google has already accomplished. The driving revolution is proceeding without you.
 
I admit I'm a newbie here (I'm a lowly Model 3 Reservation holder) and found this thread on my Google News feed researching everything Tesla. But one thing that hasn't been mentioned yet, and can clearly be seen on Google Maps near the intersection of Rt 55 and Rt 2.... Crop Circles! Maybe it was aliens!! :eek::D

Google Maps

OK, so that probably wasn't too helpful. But hello everyone! Glad to have found this place, and hopefully (some of) my posts in the future will be a little more useful... ;)
 
Or automatically turn off Autopilot in cases where it's known to perform worse than a human driver?

There is no such thing as "a human driver"; there is a broad population of drivers that fit on a bell curve of skills/competence. Where on the bell curve do you benchmark "worse"? The best case is you irritate half of your owners, and it goes downhill from there. The thing you seem to be missing is that there is also a cost to excessively constraining the use of AP: there are going to be accidents and fatalities that could have been prevented or avoided if AP had been allowed to engage, but was not.

The quantitative question is whether the use of AP brings down the overall number of accidents/fatalities, even if it ends up being a contributing factor in specific instances, and whether the delta is enough for the technology to be a net benefit to society. The only other analog I can think of is whether to release a new antibiotic, knowing that a certain percentage of the population will have an adverse reaction to it.
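
To make that quantitative question concrete, here is a back-of-envelope sketch. Every number in it is a made-up, illustrative assumption, not a real accident statistic; the point is only to show how the delta would be computed.

```python
# Back-of-envelope sketch of the net-benefit question posed above.
# Every rate and mileage figure here is a made-up assumption, not real data.

fleet_miles = 10_000_000_000   # assumed fleet miles driven per year
human_rate = 2.0e-6            # assumed accidents per mile, human driving
ap_rate = 1.5e-6               # assumed accidents per mile with AP engaged
                               # (includes crashes where AP contributed)
ap_share = 0.3                 # assumed fraction of miles driven on AP

accidents_without_ap = fleet_miles * human_rate
accidents_with_ap = fleet_miles * (ap_share * ap_rate
                                   + (1 - ap_share) * human_rate)

print(f"Without AP: {accidents_without_ap:,.0f} accidents/year")
print(f"With AP:    {accidents_with_ap:,.0f} accidents/year")
print(f"Delta:      {accidents_with_ap - accidents_without_ap:,.0f}")
# If the delta is negative, AP is a net benefit overall even though it is
# a contributing factor in some individual crashes.
```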

Tesla is very slow to confirm bad press and very quick to announce good press. If we don't hear anything soon, I'm going to assume Autopilot was on.

Well, you are free to make that assertion, but it may simply be that they are going to wait until they have some actual facts in hand before making any announcements, especially since they are currently under the microscope. As far as I have heard, they still don't have the logs from the PA accident. If the MT driver did not have cell coverage, then Tesla probably doesn't have telemetry data, so they are likely going to have to send someone up to MT to retrieve data before they can make any assessments on what happened.
 
Here's the Google Maps view of the "Winding Road" sign with a recommended speed of 45 MPH, right where the wood-post "guard-rail" starts (OP says the Tesla was going 60 MPH):

Google Maps


Excellent detective work!

Wall Street Journal confirms:

"The most recent example is of a driver of a Tesla Model X SUV who told local authorities the feature was active when the vehicle crashed into railing wires along the side of Montana State Highway 2 near Whitehall Saturday."
 
Reactions: Mark Z (Love)
You don't need a Ph.D. to understand that you're ultimately responsible for driving the car. I've honestly reduced the amount I use AP because of the truck lust. I did that knowing that it was MY responsibility to drive my car in a safe manner, and that if I crashed due to truck lust it was going to be my fault. I felt like the infrequency of it happening, combined with the sinusoidal correction when I took control, meant that it exceeded what I felt safe with.

I felt that "truck lust" was a problem, too. I have only had the car 5 weeks and 5000 miles. As much as possible using/ learning AP. I find that my car stays in the center of the lane and, occasionally, a truck may inch closer creating the illusion of "truck lust". I have seen a big improvement in AP since I took delivery. In both cases I may be kidding myself, but those are my thoughts. And, I agree, AP is an aid and I am ultimately responsible for the safe operation of the vehicle, just as I was ultimately responsible for the safe operation of every airplane I flew for the past 40 years.

There appears to be a rather cavalier attitude in understanding and using the AP. Ninety percent of the people on the highway in any car apparently don't even know how to use cruise control. I would like to think readers of this forum are the exception to the rule when it comes to reading and understanding the owner's manual. The others having problems probably don't even know where to find the manual and their depth of understanding comes as a result of a test drive, or more worrisome, the delivery experience.

I have started engaging TACC first, followed by Autosteer. I realized on several occasions that by engaging the full system simultaneously, the speed selected immediately went to the last speed commanded. That led to several exciting experiences, including pulling toward a concrete barrier after entering the Interstate. Maybe it was just my car or suspension or whatever, but now I engage one channel at a time to ensure a milder experience.
 
At what point does paying close attention to make sure the a/p is driving safely become more effort than simply driving the damned thing yourself?
Robin

In such circumstances you disable the AP. I would never enable AP on roads without shoulders at speeds over 35 mph, as you need to allow yourself space and time to react once you recognize AP is not working as it should.
 
I completely disagree with your points. Fully autonomous Level 4 driving vehicles already exist, built by Google, and they are proven to be significantly safer than cars driven by humans full of human emotions. In just a few years first Tesla, and then other manufacturers, will have fully autonomous vehicles as well. They will be available for sale. They will be legal.

You are welcome to ignore what Google has already accomplished. The driving revolution is proceeding without you.
I agree with you to a point.

If everyone is following the rules, then the Google cars work great. I will find the article that describes what I'm talking about. However, the Google cars are still failing furiously when a person makes a right-hand turn from the left lane while someone is coming up behind them speeding. When Google tested the same multiple-issue scenario, humans reacted tremendously better in terms of steering, braking, etc. in the effort to save their lives - not so much the vehicle, but their lives. In other words, there were cases where humans were willing to pull out into cross traffic, in the direction of the traffic, and take the accident. For some reason the Google cars reacted in ways that left human injury much higher.
There was a Netflix movie about this. People were pulled into rooms and asked why they made this decision or that decision. Some answers were: "I knew that my daughter was in the rear seat and I would rather be in a crash where someone hit my right front driver door instead of her rear passenger door." Considerations such as the significance of the payload, as it pertains to the driver, cannot be ascertained by AP. There were tons of decisions made by humans where the end result in lives saved was vastly different when compared to the mathematical decisions made by Google's AP.

However.... I am not in any way disagreeing with the effectiveness of Google AP. Google AP does as good a job as, or a better job than, humans when you have sub-par humans driving cars - for instance, intoxicated humans or humans who are unaware of all of the rules of the road.

I'm saying that AP is not fully baked yet to the point where it can replace humans, and I don't think emotional decisions such as the one in the scenario I mentioned above can ever be made by it.
 
Tesla confirms "Autopilot" crash in Montana

""It's a winding road going through a canyon, with no shoulder," Shope told CNNMoney. The driver told Shope the car was in Autopilot mode, traveling between 55 and 60 mph when it veered to the right and hit a series of wooden stakes on the side of the road. Tesla confirmed that the data it has from the car shows it was in Autopilot mode, and that the driver likely did not have his hands on the wheel.

"No force was detected on the steering wheel for over two minutes after autosteer was engaged," said Tesla, which added that it can detect even a very small amount of force, such as one hand resting on the wheel.

"As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel," said Tesla. "He did not do so and shortly thereafter the vehicle collided with a post on the edge of the roadway."
 
Agree 100% that the person and computer need to work together, which is why I think it's odd to suggest that the computer, aka Autopilot, should be turned off on anything but divided highways.

A safety assist system should always be available no matter what kind of road you are on.

Am I looking at this wrong? Why wouldn't lane keeping always be available to potentially save the day if the driver screws up?

Because lane keeping isn't a safety feature; it is a convenience feature.

Safety features are kept on all the time; convenience features can be turned on and off.