Welcome to Tesla Motors Club

[UPDATED] 2 die in Tesla crash - NHTSA reports driver seat occupied

I'm not sure his example is relevant:


Notice that Elon said standard AP requires lane lines. That person paid for FSD, so it has different features/requirements.
I think it is a problem that AP (and TACC) can be activated in strange places and at strange speeds (sometimes I get a 45 mph speed limit in a residential area, and sometimes 25 mph on a five-lane divided major street whose actual limit is 45 mph). My take on this is that if I turn AP on on city streets, I must be extra vigilant, as it is NOT yet intended for city streets, as the manual clearly states.

Regarding our big Ford fan S.Rodriguez and his video: 1) I found the place, and on Google Maps it actually seems to have remnants of what look like old lane markings; 2) I wonder why, out of all places, he chose that specific street; 3) I wonder if his experiment is reproducible on any good number of similar streets; 4) I wonder if the Model X or S has a lower tolerance than the Model 3/Y for picking up lane markings for AP.
 
Something else I thought of in relation to this snap:


What if those are not skid marks but acceleration marks? Someone not familiar with regen might be lulled into thinking their foot is on the brake if they are applying light pressure to a pedal and the car is slowing down; so they press the wrong pedal? The photo isn't high-resolution enough to determine which way the grass is lying (forward for braking or back for acceleration), but looking at the right track, more dirt appears to be kicked up off the right side of the track, indicating a sharp left turn - trying to stay on the road. The marks are also grooved a bit sharply. That could indicate that the front wheels were already turned significantly to the left when the car entered the grass, and reduced traction caused the car to plow straight ahead even with the wheels turned left. At that point, the driver may have tried to avoid a direct collision with a tree and hence ended up wedged between two trees instead.

Whatever happened, it has signs of excessive speed and loss of control as factors. I don't think any scenario with AP would have led to those marks/ruts. Those look like driver panic.

Mike
That's an interesting idea about acceleration. I'm sure they'll be checking that as a possibility. I'm still curious why you want there to be a driver when they have said there wasn't one. They seem to be sure.
 
I'm suggesting they created the situation through careful and unfortunate manipulation of behaviors in the Tesla's systems. Behaviors that other Tesla drivers have suggested do indeed exist.
Oh please. Are you suggesting two old suicidal hackers? If they unknowingly created such a situation, then they were SUPER extremely unlucky that the driver, for some reason, got out of the driver's seat and jumped into the back seat. If they did it knowingly, what exactly was their plan to stop the car?
 
...What if those are not skid marks but acceleration marks?... I don't think any scenario with AP would have lead to those marks/ruts. Those look like driver panic.

Mike
Good point. They ARE acceleration marks indeed because there is no skid mark on the concrete curb.
 
...I think it is a problem that AP (and TACC) can be activated sometimes in strange places and at strange speeds...

I am now accustomed to clicking twice and I forgot that I could still click once to activate TACC alone.

When Elon said Autopilot cannot be on when the road is laneless (YouTube proves that it can be engaged on certain laneless roads, as the system will pick up the shoulders as lane markers), he didn't say anything about TACC. TACC doesn't need any lanes!

So, Elon's statement might be true, but there might be additional info that wasn't disclosed, so we need the whole car logs to make sense of his statement.
 
...I forgot that I could still click once to activate TACC alone... he didn't say about TACC. TACC doesn't need any lane!
Well, but if they used TACC, what was their plan? After all, they could take a 1999 Toyota Camry, turn on the cruise control and die in a similar way.
 
SOMETHING happened. I'm suggesting they created the situation through careful and unfortunate manipulation of behaviors in the Tesla's systems. Behaviors that other Tesla drivers have suggested do indeed exist.
Oh please. Are you suggesting two old suicidal hackers? If they unknowingly created such a situation, then they were SUPER extremely unlucky that the driver, for some reason, got out of the driver's seat and jumped into the back seat. If they did it knowingly, what exactly was their plan to stop the car?
No, not that at all. I'm suggesting the owner wanted to drive from the house down the street on Autopilot. He wanted to show how his car could drive all by itself. He fully expected the car to take the corner and drive down to the end of the street whereupon it would stop at the stop sign automatically. In fact, [he said?], it will even drive with him in the back seat. It may have required the seat belt to be attached, perhaps a speed dialed in, or a destination inputted. He then enabled Autopilot or the front passenger did by clicking down twice. Perhaps this has worked for him before. He fully expected the car to drive, steer, and stop. Their main mistake was him not being in the driver's seat. They did not expect to crash.


[I'm only conjecturing a theory to explain the situation. These are all suppositions. I don't know their intentions or actions.]
 
Well, but if they used TACC, what was their plan? After all, they could take a 1999 Toyota Camry, turn on the cruise control and die in a similar way.

Their plan was revealed by the wives: test the car's Autopilot capability. Maybe they were trying to prove that if the driver jumps out of the driver's seat, the system will shut down. They were both engineers, after all!

Maybe they couldn't activate Autopilot by double-clicking on the laneless road, but maybe something went wrong and the single click for TACC worked with a set speed.

Under the "Autopilot" menu on your car's screen there is also "Automatic Emergency Braking"; whether you bought Autopilot or not, it is automatically on by default at each drive.

TACC is part of Autopilot and FSD. Other cars' smart cruise control is not a foundation to be built upon to become Autopilot, Enhanced Autopilot, and then FSD.

So, in order to provide a safe robotaxi system (FSD was scheduled for the end of 2019 and robotaxis for 2020), Tesla needs to prove in 2021 that its basic automatic systems--TACC and its accompanying Automatic Emergency Braking--are able to brake for obstacles.
 
Here's a video of Tesla owner Sergio Rodriguez activating AP on a minor street with no lines. AP sets itself to 45 mph, which he says is too high. AP proceeds to accelerate and fails to react to the curve until he stops it. Whether or not this video has any bearing on the crash in Texas, it shows interesting allowable behavior in the car. It seems to do what we are told Teslas on AP cannot do.

Let's say the car "backed from the driveway" and started driving, as we were told in witness statements. One guy in the back, one guy in the front. Some form of defeat devices for the seatbelt and/or seat-weight sensor - whatever was required. Let's suppose the starting position is halfway up the driveway at #2 Hammock Dunes Pl for the sake of argument, where there is a reddish car on Google Maps. Not sure if that is the correct house, but let's start there.
They back out of the parking spot and drive down the driveway, through the cul-de-sac, and down the street. Total distance is 450-550 ft. Somehow they have enabled Autopilot - as Sergio Rodriguez was able to do in his video. The Tesla "should" not allow this, but does anyway - as Sergio's car did. Let's say it accelerates to 45 mph, either given a nudge from the passenger's foot or by itself. Perhaps they entered a higher speed or resumed a previous set speed. Again, the Tesla "should" not do that, but perhaps they discovered that it actually does - as Sergio Rodriguez did. Perhaps they get it up much higher than 45 mph.

Next, either AP fails to make the turn - like Sergio's car, or they nudged the wheel by mistake, disabling AP and it doesn't even try to turn. If they "nudged it off" then AP would show as not engaged, as Elon said.

45 mph is not very fast, but perhaps fast enough to crash into the tree and injure the occupants. Perhaps they were not belted. Perhaps running over the curb/drain and hitting the tree was enough to cause an instant fire. Perhaps the fire was so sudden that, though only injured, they were incapable of escaping. Perhaps the passenger's door was wedged against a tree, the battery had failed, and the rear passenger didn't know the escape procedure. Perhaps 45 mph was fast enough. Lots of "perhaps," I know, but it fits the scenario.

Just reconciling the witness statements with the list of possible reasons they might have tested the car in this manner. They wanted to see the car navigate on AP on their little street (maybe the owner had previously found that it would do it, even though it's not supposed to be able to do it). It seems AP can activate on streets with no lines. It seems the car can accelerate to an unsuitable 45mph on a small street. It seems the car on AP can fail to turn whatsoever on those streets (thanks to Sergio Rodriguez's video).

Would not be good for Tesla if AP could/did actually do all these dangerous things that Tesla claims it would never do.
Er, to continue your use of the words "suppose" and "perhaps", why don't we just suppose that perhaps you are wrong?
 
I could be wrong, but the only two scenarios I can see are speeding, crashing, and trying to get out through the back door, or completely circumventing all the safety features Tesla has in order to take a joyride in the back seat.

Neither of these results in me feeling bad for them, only glad that they didn't hurt anyone else. Play stupid games, win stupid prizes.
 
...I'm suggesting the owner wanted to drive from the house down the street on Autopilot... He fully expected the car to drive, steer, and stop. Their main mistake was him not being in the driver's seat. They did not expect to crash.

The problem with trying to pin it on TACC or AP is the need to have it activated.

To activate either one you have to be moving, as it won't engage from a stop unless the car is stopped at a stop sign or a vehicle is in front of it; then the passenger can reach over and hit the stalk. But you need weight on the seat, along with the seatbelt engaged.

I think this is one of the cases where the simplest explanation is likely the correct one.

The simplest explanation is whatever a guy would show off to a buddy in that situation. They may have been talking about AP, but why would the owner of an AP vehicle show off AP in an area where he knows AP isn't going to work well? It simply doesn't make any sense to me.

The owner of the vehicle didn't have FSD so why would he care all that much about autonomous driving? He lived in a mansion, and owned a Performance Model S. The defining feature of a Performance Model S is ludicrous mode.

The defining nature of the accident was high speed - such a high speed that TACC/AP likely wouldn't accelerate hard enough to achieve it.

Given all that, what makes sense to me is a quickness demonstration. It's a pretty simple demonstration, and you don't expect anything to happen, but stuff happens. Maybe a cat darted across at the worst moment, so they swerved to avoid it. Sure, that explanation requires the driver moving from the front to the back after the crash, but people have survived Tesla crashes only to be burned alive before they could escape or be rescued. It seems like both doors were blocked by trees, so what other option did he have?

Millions of guys show off their toy(s) to friends every day, and it's inevitable that a few times things go wrong. Given Murphy's law, things that work perfectly fine all the time will fail the second you show them off to a friend.

Now this does give a lot of the benefit of the doubt to the two individuals who died. One of them was a doctor, and the other one was an engineer. These are the very people we talk to every day on TMC. They're basically us. I fail to see how their combined brain power would ever conclude that activating AP/TACC from a rear seat was a good idea. If Darwin were going to get them, it would have gotten them in their 20s. As an engineer myself, my entire experience is built around the expectation that things aren't always going to work.

The other aspect of it is the nature of what it's like to be a human male. Once you've gotten into your late 50s/60s, it's less about Darwin and more about fighting deterioration. Heck, I say this as a 40-something, because the fight is real. It seems like your big prize for beating Darwin is getting cancer or heart disease.
 
...They may have been talking about AP, but why would an owner of an AP vehicle show AP off in an area where they know AP isn't going to work well? It simply doesn't make any sense to me.

The owner of the vehicle didn't have FSD so why would he care all that much about autonomous driving? He lived in a mansion, and owned a Performance Model S. The defining feature of a Performance Model S is ludicrous mode.

I'll need the car logs to make sense of what happened.

But as for your question: AP may work well on some laneless roads. It's possible that it had been working so well - approaching that curve at a high speed that could scare a passenger, then slowing down appropriately and never crashing there before - that it made for a good scare in the past. So why not do another validated test?

It's possible that when Elon mentioned that the log said the car wasn't on Autopilot, he forgot to add "at the time of the crash." That doesn't rule out the log having recorded that the drive started on Autopilot and that it was somehow off later - say, the driver crawled out of the driver's seat and accidentally over-torqued the steering wheel with a kicking foot, which disabled Autosteer while TACC kept working.

It's all speculation and there's a host of unproven theories when there are no full car logs provided for the public to see.
 
The driver/owner was a board-certified anesthesiologist and his friend was an engineer. These weren't some teenagers making a TikTok video. The wives said they went for a test drive and moments later heard the crash. It seems far-fetched to think the doctor went through all the gyrations necessary to fool the system, convinced the friend to go along, and crashed within moments.

A far more plausible scenario is that the doctor gave a demo of the car's performance. Maybe the older engineer sat in the back to feel safer or to see the display better. The doctor launched the car, lost control, crashed, and crawled into the back seat in an attempt to save his friend. It's highly unlikely Elon would lie about the car not having AP engaged or not being equipped with FSD when those statements could be disproven if false.
 
^And that is what confuses me. Were they both knocked unconscious, or did they die on impact? It seems they would have to have had their seatbelts on if they were still in the upright position after that incident. And no attempt to escape the fire? Strange as it may be, my vote still goes to a murder cover-up.
 
^And that is what confuses me. Were they both knocked unconscious, or did they die on impact? It seems they would have to have had their seatbelts on if they were still in the upright position after that incident. And no attempt to escape the fire? Strange as it may be, my vote still goes to a murder cover-up.

If they were not belted, they could have been ejected through the front windshield in a frontal impact.

If they were conscious and trying to escape, their positions would have changed from sitting upright.

Most likely, they were unconscious and securely belted, which would explain that same upright sitting position after the impact.
 
...Seems far-fetched to think the doctor went through all the gyrations necessary to fool the system... A far more plausible scenario is that the doctor gave a demo of the performance... crashed and crawled into the back seat in an attempt to save his friend.
I agree; this is what I felt was the obvious and likely explanation. But I think maybe you meant to say the passenger was riding in the back, and the driver simply moved over to the passenger seat in his attempt to open the car - a failed attempt, as the car was wedged in by trees on both sides, and they both succumbed to the smoke and/or injuries before they could get the car open.

The responders don't think that anyone could have moved from the driver's seat into the back, but this scenario doesn't require that, and it would explain the "no driver" mystery.

As I posted before, the fact of it being a Tesla could easily create some confounding speculation about driverless features, and this questionable theory quickly overtook the proper method of a deliberate, assumption-free, evidence-driven crash investigation. The media, of course, wasn't about to play the "Just the Facts" angle after being gifted a sensational theory by at least one authority figure.