Andyw2100
Funny how I could make almost the exact same statement!
Maybe because nobody has released a car to the public that has autonomous driving capability yet?
- - - Updated - - -
Do you believe he would have crashed had he been in a pre-autopilot car? If yes, then I suppose we could say that he couldn't have avoided it. But the AP makes no difference in this. The AP wasn't driving, he was. AP doesn't drive, it merely assists.
If someone does not leave themselves enough time to react, they are a dangerous driver. This is simple. He was driving, he failed to brake for the vehicle ahead, he hit them. AP is 100% irrelevant here; he's scapegoating. His "I couldn't have avoided it" is rubbish: he clearly states there was plenty of time to avoid it, but then blames the AP for not doing so. It's not the AP's job to do so; it's his.
I think that's actually a bit too easy to do, and if you do it (brush the brake pedal, say) there is no indication except that the TACC indicator switches color. No audible alert, nothing.
I realize that's the standard for traditional cruise controls, but with TACC and other adaptive cruise systems it seems like it should give you a little bit of warning. Does the Mercedes system do anything when you disengage Distronic?
Could you have switched off autopilot without realising? The logs should be able to confirm this.
Could you have at least skimmed this thread before posting?
OK, perhaps that is a little harsh, but it's a pet peeve of mine when a poster jumps into a long thread with some obvious comment that almost certainly has been made before and assumes it hasn't been made, and clearly hasn't taken the time to read the thread.
vchadha--without going back to check, I'm going to estimate that the possibility you just raised has been mentioned in at least thirty or forty posts in this thread, and, even more importantly, the OP has actually said in more than one post that he acknowledges that it is conceivable that that is what happened.
For other new owners, I suggest using autopilot with hands on the wheel and foot covering the brake, paying very close attention, for the first several hundred miles of use. Watch what it does, where it can get confused, etc. Then as you start noticing that you can anticipate how it will behave, you can relax--just a little bit.
What? You realize it still would actually be 0% on Tesla, right? Not even 1% on Tesla. 0%. Tesla was not driving the car. If the brakes failed or something when the driver went to stop his new car, maybe. But the driver just let his car hurtle into another car. That's not Tesla's fault no matter what.
Edit: Wow, I seem to be defending Tesla on some topics lately... what's gotten into me? Oh wait... nothing has changed. Logic still prevails.
It always amazes me how two people can look at the exact same set of facts and come to exact opposite conclusions, as we seem to be doing here.
How you can say that if the brakes fail and cause an accident it's Tesla's fault, but if AP fails and causes an accident it's not, is completely beyond me!
You know, I wonder if Tesla thinks putting "Beta" in front of AP relieves them of responsibility for it being a safe feature. I'm thinking maybe that's why some on the forum seem to take this attitude "if you're killed using AP Tesla isn't responsible". But I promise all of you, in court if the data shows Tesla's camera or radar malfunctioned causing AP to steer the car under an 18-wheeler killing a family of four, Tesla is in one hell of a bad situation - and they will pay out millions in lawsuits to the family of the victims. AND THEY SHOULD. Fear of financial penalties is the only thing motivating manufacturers to make sure their products are safe.
Here, if the logs show Tesla's equipment malfunctioned, causing this minor fender bender, Tesla should (if they want to continue to be the exceptional car company, better than the traditional car companies) pay for all damages to both cars. If they admit the logs show that AP failed and don't offer to cover all damages I'll be surprised, because any lawyer could walk into court and force them to. Again, if the brakes failed would anybody think lawyers would not be involved? If you were put in the hospital because your car's brakes failed, wouldn't you get a lawyer? Who wouldn't? AP is no different than brakes.
Given the number of automakers (and others!) working on autonomous vehicles, and the profusion of "driver assistance" technology that allows cars to take over more and more of the tasks associated with driving, it is insane to me that there is no federal safety standard on how these technologies are to be implemented.
Just for clarification: TACC numerical settings are not simple distances, nor car lengths. The numbers represent a time in seconds until the Tesla, at its present speed, will occupy the space now occupied by the vehicle in front of the Tesla.
To clarify your clarification: the number is a time setting in seconds, but "1" does not mean 1 second following distance at the present speed, and "7" does not mean 7 seconds of following distance at the present speed. (Not sure if that's what you meant--so just clarifying for others if they weren't aware).
(If 7 were 7 seconds, then at 70 mph I'd be over 700 feet behind the car in front of me, which is definitely not the case!)
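The arithmetic behind that estimate is easy to check. A minimal Python sketch (the function name is mine; note that, as discussed above, the TACC setting numbers are apparently not literal seconds, so this only shows what a literal time gap would imply):

```python
def following_distance_ft(speed_mph: float, gap_seconds: float) -> float:
    """Distance in feet covered at speed_mph during a gap of gap_seconds."""
    feet_per_second = speed_mph * 5280 / 3600  # convert mph to ft/s
    return feet_per_second * gap_seconds

# At 70 mph, a literal 7-second gap would indeed be over 700 feet:
print(round(following_distance_ft(70, 7)))  # 719
```

Since 70 mph is about 103 ft/s, a true 7-second gap works out to roughly 719 feet, which supports the point that the "7" setting can't be a literal 7-second following distance.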
The difference is one of us is actually using logical reasoning, the other is not. I'll leave it as an exercise to the reader to determine which is which.
There is a huge difference between the brakes failing and something related to autopilot failing. (First, it's worth noting that we don't even know if this had anything to do with autopilot at all, since the OP admits they may have disengaged it.)
In this case, autopilot did not fail. Have you read the manual and all of the caveats surrounding the functions of autopilot and TACC? I suspect you have not. It specifically states its limitations, and that it cannot be, and is not designed to be, relied on to control the vehicle in all circumstances.
Now, if you read that same manual and find me a caveat that states limitations on where the brakes will not attempt to stop the car when pressed, let me know. Until then, the failure of these two things are not comparable.
And again, the driver is in complete control of the vehicle at all times. If the driver lets their car run into a semi and kill their family, how is that any different than a driver of a Honda Civic doing the same thing with regular cruise control? The driver is still in complete control. It amazes me to no end that people are still trying to argue that this is not the case. Until my car has its own driver's license, I'm the driver.
- - - Updated - - -
I have to take this a bit further and again point out that if the driver is using autopilot as instructed and the car attempts to steer somewhere it shouldn't the driver's hands should be right there and not permit it. So the "steering under a semi" nonsense is nonsense.
Just because no one keeps their hands on the wheel doesn't mean this is how it's supposed to be used, and doesn't abdicate responsibility.
- - - Updated - - -
If I have autosteer/TACC engaged and...
I find a wire terminal I uncover on the car says "DANGER HIGH VOLTAGE", and I touch it after clearly being advised of the danger and die, it's Tesla's fault, right, because autopilot!
I get a flat tire and it makes it hard to control the car such that I drive off a cliff.... that's Tesla's fault, right, because autopilot!
I jerk the wheel to the side with my leg while fidgeting, drive off a cliff, and die... that's Tesla's fault, right, because autopilot!??!
I run over a pedestrian and kill them... that's Tesla's fault, right, because AUTOPILOT?!
I see that the car can't steer properly due to poor lane markings, and I let it, and we run off the road, fall down a cliff, smash into a small village and kill 47 people after ironically landing on top of a large oil tank... that's Tesla's fault, right, BECAUSE AUTOPILOT?!
I hit a deer... that's Tesla's fault, right, BECAUSE AUTOPILOT?!
I run over my dog... that's Tesla's fault, right, BECAUSE AUTOPILOT?!
Obviously I can go on, but I hope the sarcasm is dripping from this post and how ridiculous it sounds to blame autopilot for basically anything. It's not for replacing the driver. Just like cruise control, just like ACC, etc etc etc and any other feature. YOU'RE STILL THE DRIVER.
I had been puzzled by this - I've been counting seconds until my car passes the same spot the car in front of me just passed, and it never was 7 seconds - always less. Does anybody know what "7" is 7 of? It definitely isn't seconds, and it's not car lengths.
I'm sorry if I upset you.
I don't agree with you at all, but there's no point in discussing it further. I hope to God none of the nightmare scenarios takes place, and that Tesla keeps improving our current cars so that they are safer and safer with every release. I was definitely impressed with the improvements in taking curves and not diving for exits that were clearly made in 7.1.
Did Tesla ever get back to the OP after reviewing his logs? There seems to be a lot of side discussions to pick through here.
Electricfan, it is fair to point out that you were perfectly comfortable reading a book while your car was on autopilot on the highway, which is relevant to the argument.
But my feeling is, Tesla advertised and sold this car saying it would drive itself. The website doesn't use the word "autonomous", but all the definitions are really meaningless. They advertise the car will drive SAFELY on a divided highway and stay in its lane in full control of the brakes, accelerator and steering wheel. I'm trusting them to do that.