
Tesla refusing to show me my own logs?

On further reflection (and having just read the thread where people complain about being forced to install an update, and where some people stay on two-year-old software because they don't like the Autopilot nags), it seems to me that Tesla is actually incredibly generous, accepting a risk no other car manufacturer accepts, by allowing customers to use a feature that is still in beta. This is very common in the software industry (lots of people run pre-release versions of new operating systems) but unheard of in a car. So this really is a case of the original poster consciously choosing to enable a feature that is in test, and then being upset that it doesn't work flawlessly...
 
Bottom line (and people keep forgetting this): Tesla has publicly stated that the Autopilot software is not complete, is not completely tested, and has not been formally released. Drivers (including me and you) who use Autopilot have to select a button that enables it in the menu, and it explicitly states that this is beta software, meaning you are testing software that has not been finalized yet. As such, the fact that it may phantom brake, steer oddly, or do other things is totally in bounds (from a legal point of view), and YOU, not Tesla, are choosing to operate test software in a road environment, recognizing (and accepting) liability for things the car may do because it is under your control. There are many on this list who have complained that Tesla is keeping Autopilot in beta purely to protect themselves legally. While that may be true, I think the number of problems Autopilot still has shows that the software really is still in beta; I'd be more concerned if Tesla called the current software "Final".

As such, your comment that the phantom braking on the highway is a safety issue is correct, but the responsibility for that risk falls on YOU, not Tesla, because you are choosing to operate test software in that environment.

As others have said, the other issues you have with your car may well support a lemon law claim (none of us can judge, because you haven't shared them), but the phantom braking/Autopilot issues you are seeing are not grounds for a lemon law claim, because Tesla has not yet delivered a finished product that could be called defective... they are merely giving you the option of an advance preview/test of the current test software...

Thank you. This is the sort of constructive and informative response I was looking for. And you're totally correct that I am assuming responsibility when it's enabled; I won't debate that, and I fully recognize that this is beta. That being said, I will still maintain that we've progressed from odd and quirky behavior to downright life-threatening behavior.
 
I can understand why you feel that way, and can understand why others (and maybe even you) would choose not to use Autopilot until it is further refined. I use it every day on very crowded roads and highways at speeds ranging from a full stop to 70 mph. I do get the occasional phantom brake but do not feel it is a danger to me or others. I always have my foot on the pedal and am primed to either (a) goose the gas to overcome the phantom brake before the car has lost more than a tiny bit of speed, (b) hit the brakes, or (c) turn the steering wheel. I use Autopilot the same way I use an autopilot while flying a coupled approach in an airplane (and that's with certified/final software): one hand is on the yoke, one hand is on the throttles/power levers, and my feet are on the pedals, ready to respond in an instant if the autopilot does something it shouldn't. If people used the Tesla Autopilot that way, there would be no Autopilot accidents or all these stories about Autopilot trying to kill the drivers. The problem is that people have been seduced by the video of the self-driving Tesla and all these idiots on YouTube using Tesla Autopilot with feet out the window, not paying attention to the road, etc. In my mind, right now we have a driver problem, not a technology problem, because the technology simply isn't there yet for true "no driver attention" autopilot operations...
 
I totally agree. This car is for my fiancée, and while I feel I could manage some of the danger in the ways that you mention (I'm a very active, defensive, and alert driver), she may not. The effect you cite is most definitely real, and after seeing this dangerous coupling of human trust and beta software, it's enough for me to pull the plug on this. I'm wondering if it's at all possible for me to push for a refund on just the $5K Autopilot option in the event that my other claims don't trigger a lemon law buyback. At least then I'd just be stuck with a shitty car in the conventional sense.

 
For what it's worth (and again, none of us know what these other issues are), even without Autopilot the Tesla is a tremendous car, and there are tens of thousands of Tesla drivers out there who feel that way. I suspect you will have no luck getting a refund for the Autopilot money you paid... when you selected that option, there were disclaimers about Autopilot being in beta and not yet finalized.

You're new to this forum and these are literally your first posts ever. You're clearly upset about the car, and it may be that you are in a position to get your money back under a lemon law (none of us can judge that). If you're that unhappy and feel you have a valid case, then it sounds like you've made your decision... asking us to validate your plan of action without all the details is simply not something most of us are willing to do. What we have seen on these forums time and time and time again is people posting how Autopilot is trying to kill them, how they want to sue Tesla, and how these cars are deathtraps, when in fact the truth is totally different and we have a case of people not understanding what they are signing up for when they enable that Autopilot beta. Could Tesla do a better job of explaining it? Some would say yes; others would say Tesla is already being more conservative than they would prefer. As in most cases, the opinions vary depending on how people view the situation...

Either way good luck and I hope you end up either a satisfied owner (these really are incredible cars) or at least satisfied with how things turn out.
 
OP: if you think Autopilot is unsafe, don't use it. Simple.

You can ask for a prorated refund and maybe Tesla will entertain that claim.

For the vast majority of us, AP works great, and this phantom braking is only a minor concern. In 7k miles it has happened to me once, on a curve, when a truck in the next lane braked hard.
 
...You can ask for a prorated refund and maybe Tesla will entertain that claim...

I am sure Tesla can easily absorb that small cost, but the issue is the principle of responsibility, especially when safety is mentioned in this thread.

There's a little-documented lawsuit over a 2016 fatal Autopilot accident in China. This year, the family of the Apple engineer who died in the fatal Mountain View, CA Autopilot accident is threatening a lawsuit as soon as they can get the NTSB findings.

These families believe Autopilot is defective in a deadly way, that it caused their son/husband to die, and that Tesla has to pay the blood money.

My opinion is: Autopilot is not for everyone. That's why Google/Alphabet/Waymo went straight to developing a fully autonomous system and will not release it to the public until it's perfected.

On the other hand, there's no shortage of beta testers who would love to pay for that privilege, such as Comma.ai and Tesla users.

I think it would be very difficult for a beta tester, whether a Comma.ai or Tesla user, to convince a court that they didn't know what they were buying: a beta product!
 

The wife went on the news saying that her husband had told her, and even shown her (while driving), that Autopilot couldn't handle some roads, seemingly trying to make it look like Autopilot was not safe. However, in saying that, she inadvertently admitted that they knew Autopilot was not safe but continued to engage it anyway, torpedoing any chance they'd have of putting liability on Tesla.
 
I totally agree. This car is for my fiancée, and while I feel I could manage some of the danger in the ways that you mention (I'm a very active, defensive, and alert driver), she may not.
One factor you should consider, and should also verify on the road: in my personal experience, phantom braking is linked to TACC and not necessarily to Autopilot.
You should try disabling Autopilot in your car settings; TACC cannot be turned off and is automatically activated with the purchase of the advanced Autopilot feature, regardless of settings.
If you still experience the issue, your position will be much better: you're no longer using beta software under your own responsibility but are relying on released and (hopefully) supported car features.
 

Maybe in your case, you should ask Tesla for that Autopilot refund. I can tell you, I have grappled with the idea of putting my wife in a Model S or X for quite some time. Everyone is different; these two cars, for sure, are thinking person's cars.

All these systems and their beta issues require attention to detail: phantom braking (knowing why it might phantom brake at a given place before you get there), raising the suspension to park or to go over some speed bumps or to exit or enter a shopping center, parking without curbing the rims (swinging it wide, perhaps). It all requires attention to detail.

People make mistakes. I have engaged Autopilot by accident, and having it suddenly accelerate to a previously set speed can throw you for a second. It has never risen to the point of an incident, but it surprises you, and you have to figure out quickly what is happening. Some people (not all) panic in those situations.

Someone who is not attentive to phantom braking, to engaging a system by accident, or to something as simple as raising the suspension to park... eventually, something is going to happen. I like being engaged with my car as fully as that may be; others, not so much. So if this is your fiancée's car, I imagine she is not liking what is going on. I gather you're driving it a lot, though? Does Tesla see it that way?
 

Well said. In other words, it requires some basic level of IQ and common sense to use Autopilot. If someone is lacking that, they should not use it. For example, if you know from experience that AP gets a bit confused during lane splits, especially when the lane markers are confusing, then don't use it there, or watch the road. Don't be like the CA Apple guy.

For others this is an amazing product that makes driving much less of a chore - a great stress reliever - compared to driving without AP.
 
If any European users covered by GDPR see this post, I'd ask that you exercise your rights against the firm by demanding the entirety of your data from Tesla, including your logs, which you are entitled to by law (rightfully so, in my opinion). This is the only way I believe we will come close to seeing our own log data. I believe they have 30 days to provide you the data before violating the GDPR.

How to request your personal data under GDPR

The monetary penalty for Tesla, as I understand it, would be up to €20 million (roughly $23.6 million) or 4% of Tesla's annual global turnover, whichever is higher, per infringement. That should get their attention. Especially since they should be smart enough to know it'd be bad press to be described as "violating GDPR".
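To make the scale concrete, here's a minimal sketch of that "whichever is higher" rule from GDPR Article 83(5); the turnover figure is purely illustrative, not Tesla's actual number:

```python
# Sketch of the GDPR Art. 83(5) maximum-fine rule: up to EUR 20M or
# 4% of worldwide annual turnover, whichever is HIGHER.
EUR_CAP = 20_000_000
TURNOVER_FRACTION = 0.04

def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine for a given worldwide annual turnover."""
    return max(EUR_CAP, TURNOVER_FRACTION * annual_turnover_eur)

# Hypothetical EUR 10B turnover -> the 4% branch dominates: EUR 400M.
print(f"EUR {max_gdpr_fine(10_000_000_000):,.0f}")
```

So for any company of Tesla's size, the 4% branch, not the €20M floor, is the binding number.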

As a software dev, the thought that I can't view my own hardware/software logs is despicable. You're asking people to throw their lives into your beta program without offering them any intelligible feedback. My opinion probably wouldn't be so strong if Tesla didn't have a long history of throwing logs at users when it suits their needs, like after newsworthy crashes. We all know AP has plenty of issues. Many of us still use it sometimes anyway. Users would be safer if they had access to their own logs, giving them insight into the causes of AP errors and allowing them to avoid such scenarios in the future. This no-logs, no-release-notes, no-GPL-compliance crap has got to go.
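Tesla's actual log format is not public, so purely as an illustration of the kind of insight owners could get, here's a sketch assuming a hypothetical CSV export with timestamp, speed, and Autopilot-state columns (the schema, column names, and find_phantom_braking helper are all made up for this example):

```python
import csv

def find_phantom_braking(path, decel_threshold=8.0):
    """Flag samples where speed dropped sharply between consecutive rows
    while AP was engaged. Assumes a hypothetical CSV schema with columns:
    timestamp, speed_mph, ap_engaged ("1"/"0")."""
    events = []
    with open(path, newline="") as f:
        prev_speed = None
        for row in csv.DictReader(f):
            speed = float(row["speed_mph"])
            if (row["ap_engaged"] == "1" and prev_speed is not None
                    and prev_speed - speed >= decel_threshold):
                events.append((row["timestamp"], prev_speed, speed))
            prev_speed = speed
    return events

# e.g. find_phantom_braking("drive_log.csv") -> list of (time, from, to)
```

Even something this crude would let an owner correlate braking events with locations and avoid the trouble spots, which is exactly the feedback loop the beta currently denies us.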
 
You're asking people to throw their lives into your beta program without offering them any intelligible feedback.

Nobody is putting a gun to your head to use AP. Don't use it if you don't like it.

All these logs, which will make no sense to an outsider without the relevant context, will only be used to spin the situation against Tesla, with an aim to shut down AP or bankrupt the company. As long as Tesla is working diligently to make it better and safer, we should just let them do their job without all the fake lawsuits and the exaggerated, made-up negativity from the tabloids (NYT, BI, WSJ, CNBC, ...) that will be inevitable if they release the logs.
 
I totally agree. This car is for my fiancée, and while I feel I could manage some of the danger in the ways that you mention (I'm a very active, defensive, and alert driver), she may not.

Ugh :rolleyes: If dudes across these forums are really concerned about the safety of their girlfriends, fiancées, wives, etc., then why are you even hesitating to act? Swap cars immediately and stop blaming it on IQs, tech capacity, or other cognitive limitations. I think many of these comments about female skills across the forums are dudes dealing with their own fears, incompetencies, inadequacies, etc. and blaming them on their better half, who simply isn't enthused about these cars. The cars do not discriminate, and they can overwhelm or disinterest many.

I've not seen evidence of a lemon law claim from the OP yet, but I do see a dissatisfied customer, which is unfortunate. I also see hints of a regrettable purchase decision trying to inappropriately lay blame elsewhere. Also unfortunate, if true.

To the OP: there are several options within your immediate control to remove your safety concerns.
I hope you act quickly; we want your fiancée safe.
 
@josephjah - Interesting thread you have started; I just find it odd that you are more focused on getting people's opinions on how you can press a claim against Tesla than on asking for technical help. Maybe there are certain triggers on a stretch of road you use that make TACC/AP become cautious and apply the brakes. It's also odd that you withhold the other reasons you intend to use the lemon law.

@mmmk - Ah, GDPR, the EU at its best. If they are not regulating the curvature of bananas, they are limiting the maximum power of our vacuum cleaners... Nevertheless, even if you take GDPR at face value as a good thing, I really wonder if Tesla is breaking any laws if what they are withholding is simply non-personal technical data from a machine. Also, why are you whacking that poor horse? Animal cruelty is way worse than withholding data logs.
 
Let's be clear: there was no (or negligible) phantom braking with AP1. It's a significant and chronic problem with AP2. I'm surprised a class action hasn't gathered steam over it and some other things. The only concluded class action I'm aware of related to AP2 is the one about the delays relative to the promises and statements made: a whopping $280 max per eligible owner. Don't spend it all in one place. Although it could be spent on one tire :).

Meanwhile, there's been enough legalese and CYA verbiage wrapped around AP now that I'd be surprised if phantom braking alone gets anyone a lemon law win. The same car without AP is, ironically, one of the safest cars made, and it's not close.

Presumably they'll fix phantom braking before claiming FSD is... I hesitate to use the word "done".
 
I doubt you'll have any success with a lemon law complaint, as phantom braking is a known issue with TACC/AP on cars with AP2 hardware; it's not something that's wrong with your specific vehicle.

Heck, it's so bad that the insurance institute (IIHS) mentioned it when they did their study of various lane-keeping systems. Your best bet is to see if you can be reimbursed for EAP. Obviously you don't want the TACC/AP feature, as it doesn't meet your needs, so you shouldn't have to pay for it.

It's a software-only feature that can be turned on after purchase, so I fail to see why it can't be turned off. There seems to be only a handful of people who hate it to that degree.