Another fatal autopilot crash - China

What exactly is so proprietary about accessing vehicle log data? Is that supposed to be some big secret? There is no reason at all for Tesla to refuse an investigation by an independent third party. If someone wanted access to a Tesla, all they would have to do is buy one. Nothing truly proprietary there.
The data itself may not be proprietary (though it could be, if it's encrypted or encoded and decoding it would reveal details about how Tesla's system works beyond what reverse engineering can show), but the tools definitely are. If it were something anyone could easily pull from the car and analyze, the family would have done so already. It seems the family doesn't want Tesla getting access to the logs at all, but does want Tesla to provide all assistance beyond that so a third party can access them. That's how both statements turn out to be somewhat true even though they seem to contradict each other.

And this being China, confidentiality agreements limiting the tools to this one investigation (and not being copied and leaked to unrelated third parties) are probably worthless in practice.
 
Yeah, the notion that anything is simple here is laughable.
 
Anyone know if the seat belt was used? From the limited pics, it doesn't look like there was significant intrusion into the driver's side. Wondering if anyone else has seen more pics showing the driver's-side damage. It seems to me the accident replicates IIHS's small-overlap crash test, so I'm surprised someone would die in a Tesla in this type of accident if he were strapped in. (And I think I read it was at ~60 mph.)
 
Don't think we have that info. I wondered about that myself.
 
...I fail to see the distinction between AEB and TACC/AP if AEB doesn't override user input. The fact that Tesla talks about AEB separately from TACC/AP suggests to me that AEB is intended to override manual input. Now, that doesn't say how well it works; just that it *should* work that way.

Hopefully, someday Tesla's AEB can prevent crashes from drivers' unintentional acceleration. (In this case, the driver wanted to park, not to do a Ludicrous demo into a gym.) A toy model of the two possible AEB policies follows below.


[GIF: tesla-gym-crash-1.gif]


Tesla Model S crashes into a gym, driver claims autonomous acceleration, Tesla says driver’s fault
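
To make the AEB vs. TACC/AP distinction above concrete, here's a toy model of the two possible policies: one where AEB brakes whenever an impact is predicted, pedals be damned, and one where heavy throttle is read as deliberate driver intent. This is a hypothetical sketch, not Tesla's actual control logic, and the 80% throttle threshold is invented:

Code:
# Toy model of the two AEB policies being debated above.
# Hypothetical illustration only -- not Tesla's actual control logic.

from dataclasses import dataclass

@dataclass
class VehicleState:
    collision_imminent: bool  # forward sensors predict an impact
    throttle_pct: float       # driver pedal position, 0-100
    autopilot_engaged: bool   # TACC/AP currently driving

def aeb_overrides_driver(s: VehicleState) -> bool:
    """AEB independent of TACC/AP: brake whenever impact is predicted,
    regardless of what the driver is doing with the pedals."""
    return s.collision_imminent

def aeb_defers_to_driver(s: VehicleState) -> bool:
    """AEB that treats heavy throttle as deliberate intent
    (invented 80% threshold): a floored pedal wins."""
    return s.collision_imminent and s.throttle_pct < 80.0

# The gym crash is exactly the case where the two policies diverge:
floored = VehicleState(collision_imminent=True, throttle_pct=100.0,
                       autopilot_engaged=False)
print(aeb_overrides_driver(floored))  # True  -> car brakes anyway
print(aeb_defers_to_driver(floored))  # False -> car obeys the pedal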
 
It'll probably be a while before Tesla or any manufacturer allows their car to completely override driver input. The sensors aren't perfect yet, the software certainly isn't perfect yet, and someone getting into a crash because the car wouldn't obey the driver would not be a good look. The car wouldn't even have to be wrong about the driver causing a crash. Maybe the driver sees a semi barreling at him from the other side of the road and rightly figures it's better to crash into the car in front of him and try to push his way out of the truck's path than to be nailed by the semi. Or, remembering the pretty high-profile case of that group of motorcycle douches assaulting the Range Rover driver a few years back: maybe you're surrounded by a bunch of assholes and worried for your safety and the safety of your family.
 
1) I don't know whether you can reprogram Tesla's current hardware, but others have implemented AEB that overrides a driver's acceleration to avoid a collision, against the driver's action or will (rough sketch of the idea after this post).

@Spidy posted a demo below in which the driver manually accelerated from 0 to 60 km/h (~37 mph), and you can hear the noisy manual acceleration of the ICE:

[embedded video]
2) If you do not want the car to act against your will, Tesla gives you a choice of manually disabling the automation before the drive. That way, you can run over obstacles without the car's interference.
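
For point 1, the behavior in the demo roughly amounts to a time-to-collision trigger that fires no matter what the driver's foot is doing. A minimal sketch, with the threshold and sensor values invented for illustration:

Code:
# Minimal sketch of an AEB trigger that overrides manual acceleration,
# roughly what the demo video shows. Threshold and inputs are invented.

TTC_BRAKE_THRESHOLD_S = 1.4  # assumed full-braking trigger, seconds

def time_to_collision(distance_m: float, closing_speed_ms: float) -> float:
    """Seconds until impact at the current closing speed."""
    if closing_speed_ms <= 0:
        return float("inf")  # not closing on the object
    return distance_m / closing_speed_ms

def aeb_should_brake(distance_m: float, closing_speed_ms: float) -> bool:
    # The driver's throttle position is deliberately NOT an input here:
    # braking fires against the driver's action or will.
    return time_to_collision(distance_m, closing_speed_ms) < TTC_BRAKE_THRESHOLD_S

# Driver accelerating at ~60 km/h (16.7 m/s) toward a stopped car 20 m ahead:
print(aeb_should_brake(20.0, 16.7))  # True: TTC = 1.2 s, brakes engage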
 
Seems like that's a different situation from the typical pedal mistake during parking. In your video, the car was accelerated to a normal road speed and then AEB took over after detecting a car or pedestrian in front.

In the parking example, the car is going at slow parking speed toward the parking spot with no barriers (only some plants) when the accelerator pedal is floored (I guess the driver thought it was the brake). I'm pretty sure the Mercedes would not override driver input if the pedal were floored in the same situation. And even if it reacted, it would be unlikely to brake in time (back-of-envelope numbers after this post).

If there was a wall in front while at a stop, that's a different case (the Tesla does override things there). That feature has been complained about as very dangerous in the case of a false positive:
"Obstacle detected" - blocking acceleration - very dangerous!
 
Anyone know if the seat belt was used? From the limited pics, it doesn't look like there was significant intrusion into the driver's side. Wondering if anyone else has seen more pics showing the driver's-side damage. It seems to me the accident replicates IIHS's small-overlap crash test, so I'm surprised someone would die in a Tesla in this type of accident if he were strapped in. (And I think I read it was at ~60 mph.)

What happened was that the truck's wheel went up and over the hood, into the windshield and roofline, and into the cabin.
 
The claims in that article from the owner and father of the victim are ludicrous:

In particular, he pointed to a conversation he had with Yaning after purchasing the Model S. Yaning, he said, explained that a Tesla salesperson told him that Autopilot can virtually handle all driving functions.

“If you are on Autopilot you can just sleep on the highway and leave the car alone; it will know when to brake or turn, and you can listen to music or drink coffee,” Jubin said, summarizing the salesperson’s purported remarks.
Anyone who has used Autopilot in a Tesla for 5 minutes would realize that any such claims by a salesperson are ridiculous. He undoubtedly knew those claims were false, as he would never have seen the car "handle all driving functions" or "know when to brake or turn".

I could understand him wanting to return the car after finding out that it couldn't do what was claimed; but, in any case, he would have been quickly disabused of such notions. How, then, can he claim that, after experience with the actual product, he or his son were under any such impression at a later time? Did he tell his son these false stories before letting his son drive it? They seem to be convenient claims to make after his son dies, in order to blame someone.
 
Face-saving lies. A cultural/social thing. It won't yield a legal remedy, but it will achieve the social end sought within their group: manufacturing a facade to cover the embarrassment that their family member negligently crashed and died.

A less prevalent phenomenon in the West, but still evident in things like Trump claiming the election was rigged to explain his popular-vote loss.