Fatal autopilot crash, NHTSA investigating...

According to Fox News this morning, the Tesla driver was watching a movie on his screen and it was still playing when the paramedics arrived. How did he do it? I am fairly sure that it was a violation of the law.

That came from a single local paper. The truck driver claimed he saw the Tesla driver playing a Harry Potter movie.
He later (I have heard) recanted. Rightfully so as the center display will not play videos.

You are missing the fact that having AP engaged to handle the steering allowed the driver to not give 100% of his attention to the road ahead of him.

Drivers have always been able to give less than 100% of their attention to the road. AP specifically warns you to keep your hands on the wheel and pay attention. The car will start beeping at you if you let go of the wheel for a period of time (while muting the radio/entertainment volume).
This was a tragedy, but the fault is primarily with the truck driver who didn't yield.
IF the Tesla driver was not paying attention (which is likely), his fault would be secondary.
The AP failed to detect and prevent or mitigate the accident, which is a shame, but in no way, with the information we have now, caused the accident.

Which is a failure of some component of the system (hardware, software, Tesla decision making, etc.).

There is a clear difference between an overhead sign that's 15-20 ft(?) in the air and a truck that's 4 ft off the ground that a Tesla can smash into.
How about one that is 12 ft? 10? 8?
There was no failure; the system is not able to identify everything with 100% accuracy.
Unless you define lack of perfection as a failure?

As others have noted, Mobileye has said that "lateral" traffic detection simply doesn't exist yet.
Will AP get better? Yes.
Is it perfect now? No, of course not.
 
First things first... from a legal perspective, Elon and Tesla should not be commenting on their system, environment, sunlight, colors, or anything else. Commenting on such things cements them in without an out when the court proceedings happen... and y'all better believe they'll happen.

My father is a lawyer who works in the insurance field...and a happy model s owner. After speaking with him, he assured me that the lawyers are salivating right now. In his words, the vultures are circling.

He implied lawyers would easily run circles around and punch holes in Tesla's AP warnings and driver consent form. I won't go through some of the accident cases he cited to me, but he makes some compelling points from a legal standpoint.

After thinking about all of this objectively and taking off my fan glasses, I think we should brace for a very rough ride pertaining to AP and tesla in general.

I love tesla and everything they stand for. I support their vision and admire their innovative posture against tremendous odds. But I have to tell ya, this one hurts.

One thing my dad said that stood out to me was to take who's fault it was out of the equation. Lawyers will focus on AP...not the drivers. What's left is a beta version of highly innovative technology. Did the tech do its job according to its capabilities...yes probably. Was it a good decision to release this beta tech with its capabilities knowing that people's lives are at stake? I have a feeling we're about to find out.

My gut tells me, in hindsight, AP probably needed more refinement before being released into life-and-death scenarios.

He said if it were me in that accident, even if I were at fault or the truck driver were at fault, he would state his case on why this system should not be in this car... under the current disclosure by Tesla.

Made me re-think all this from a neutral standpoint instead of as a Tesla guru.

I feel like crap. My heart hurts...


This is why there are so many nag screens in cars. I have to agree to drive safely every time I start my Hyundai. It's stupid. Some lawyer will pat himself on the back that he has improved safety when, in actuality, he has caused more deaths and pushed back the advancement of the automobile. I see it all the time in aviation. A company that made an instrument was put out of business and several hundred employees lost their jobs. The NTSB said the instrument was working fine. The instrument had nothing to do with the crash. The lawyers showed the instrument had had issues in the past. That was enough for the jury.
 
Tesla is saying the car can see these objects but ignores them because they think they're overhead signs.

They can't have it both ways here.

Sensors are only part of the system. They see what they see. The interpreting is done by software residing in the computer.

Tesla is not having it both ways.

The statement is by Mobileye, the supplier of the camera/computer chip and image-recognition software to Tesla.
 
The Mobileye statement refers to this as a laterally crossing vehicle that no current autonomous braking system can detect. This is not an Autopilot failure, folks. It is a current limitation of existing auto-braking technology. Repeat: it is not a Tesla Autopilot problem. This has always been an emergency braking problem. No current cruise control or emergency braking system can solve for this particular issue. Mobileye says that in 2018 they will be rolling out new systems that can fix this. Autosteer, which most people equate to the Autopilot function since TACC is fairly standard on today's luxury cars, appears to have been working fine. If no TACC system would have stopped that car, I'm not sure what solution there is. These features make us safer, but photon shields are still a few years out.

The Tesla and Mobileye problem is that we only learn about this AP shortcoming after an accident has already happened. I personally would not expect the car to stop, but I can see how some may see it as an obvious obstacle and expect AP to react appropriately... until it may be too late to react.
 
That came from a single local paper. The truck driver claimed he saw the Tesla driver playing a Harry Potter movie.
He later (I have heard) recanted. Rightfully so as the center display will not play videos.

The truck driver also said that the Tesla was driving so fast, he did not even see it.

Apparently the only thing he saw was the Harry Potter movie, watched by the driver of a Tesla that was moving so fast he did not even see it... :rolleyes:
 
This is a tragedy, and my heart goes out to the family.

Last summer I drove from the East Coast to Colorado in a 2013 S without Autopilot. In Colorado, I purchased a 2015 S so that I could have all-wheel drive for the Colorado winter. This new car has Autopilot and I recently drove it from Colorado back to the East Coast.

For a distance drive like this, I found clear benefits and a possible risk to autopilot. Using autopilot as it was designed -- with my hands on the wheel and paying attention to the road ahead -- I found driving to be less taxing than if I was steering myself. I was less fatigued at the end of the day. I think this adds a clear safety benefit -- albeit one that's hard to quantify. As well, the passive safety features that are in place when one is driving on one's own are a clear benefit.

At the same time, I can see the temptation to rely on Autopilot too heavily and to use it in a way that is not recommended by Tesla. It seems as if the driver may have succumbed to this temptation, but it's hard to know for sure. The system is certainly not infallible, and so long as users don't forget this, it will add to road safety, particularly as it steadily improves.
 
As for a court settlement, Elon Musk and Tesla are not going to abandon one of the key technologies in their cars just to make a lawsuit go away.

Your statement doesn't make sense to me but perhaps I don't understand what you are trying to say.

When an action is settled, the standard clauses in the Release and Settlement Agreement are that the settlement is not an admission of liability and, further, that the denial of liability is specifically maintained.

As to liability, it's not always on one party or the other in these types of accidents. If it went to court and was decided by a trier of fact, there would likely be an apportionment in accordance with the Negligence Act of the jurisdiction. But most of these types of cases don't see a courtroom, and no one really cares about liability apportionment when it comes to the terms of settlement. They only need to agree on the dollar figure -- not apportionment of liability. The plaintiff's lawyer will accept, say, 10% more liability in exchange for 20% more damages. It's damages that need to be determined, or at least agreed upon, and while liability plays into the calculation of damages, it need not be decided to arrive at the settlement amount. If it goes to court, then obviously liability must be decided, but again I just don't see that happening in this case.
 
Regardless of whether the skids left a continuous mark on the roadway or a broken pattern, if it is determined that there were no skid marks, then I would conclude that either the driver put too much faith in the abilities of the TACC or he just didn't have eyes on the roadway ahead of him.

My question was merely a technical one. Put another way, does the lack of skid marks imply no braking by the car? Is it possible that the collision system applied the brakes but the stopping distance was inadequate? Will anti-collision systems apply the brakes strongly enough to leave skid marks? There seems to be a foregone conclusion that no braking was done. Based on what I know about single-radar systems, I strongly suspect that is the case. Tesla logs should tell whether the brakes were applied, but I don't think a lack of skid marks means no braking.
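For a rough sense of why "brakes applied" and "stopped in time" are very different questions, here is a minimal back-of-the-envelope sketch. The speeds and the 0.8 g deceleration figure are assumptions for illustration only, not numbers from the crash report:

# Back-of-the-envelope stopping distance (illustrative assumptions only;
# none of these numbers come from the crash report).
def stopping_distance_m(speed_mph: float, decel_g: float = 0.8) -> float:
    """Distance to stop from speed_mph at a steady deceleration of decel_g * 9.81 m/s^2."""
    speed_ms = speed_mph * 0.44704      # mph -> m/s
    decel = decel_g * 9.81              # assumed hard braking, ~0.8 g
    return speed_ms ** 2 / (2 * decel)  # v^2 / (2a)

for mph in (55, 65, 75):
    print(f"{mph} mph -> ~{stopping_distance_m(mph):.0f} m to stop")
# 55 mph -> ~39 m, 65 mph -> ~54 m, 75 mph -> ~72 m, before any reaction
# or brake-actuation delay is added.

So even if the logs show the brakes were applied in the last second or two, there may simply not have been enough road left, and ABS-modulated braking often leaves little or no visible skid mark either way.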
 
Mobileye quote:
This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon.

Good grief! Laterally crossing vehicles are extremely common. This is like saying "This document involved the letter 'e', which currently designed word processors are not designed to handle."

One time my father was driving me to school, which was in a downtown city. Lots of traffic on a 6-lane boulevard. We're moving along in the right lane of 3 lanes at about 35, and guess what, some bozo from the opposite direction, sitting in a left-turn lane on their side of the median, decides to go for it and make a left turn across 3 lanes of busy traffic. We plowed right into them at near full speed (my father had noticed the car at the last split second, and did two things: a, slammed on the brakes and b, put his right arm across my chest to help prevent me from flying into the dash or windshield). As it was, my knee went right into the dash and left a huge V indentation. I was 12-13 at the time.

This was a classic accident involving a laterally crossing vehicle that should not have been crossing. It happens all the time.

Just another reason why you could not PAY me to rely on autopilot and/or AEB, let alone trust my life and my passengers' lives with it.

I can only wonder what *other* use cases Mobileye's system isn't designed to handle.
 
Months ago on this forum there was a member who mentioned that he would engage autopilot on the highway and begin reading a book (I won't mention his screen name).

I and others repeatedly mentioned that doing so was very dangerous, and that such behavior could ultimately lead to a tragic accident. I think that member finally got the message. If the story about using the laptop is true (complete speculation at this time), then unfortunately that prediction ultimately came true. And it will happen again unless drivers finally acknowledge that this is not an autonomous system, and shouldn't be treated as such.

Picked up my S90D on Wednesday and on the way home (with autopilot engaged in stop and go traffic), witnessed a Tesla driver in the next lane reading his Kindle! Got to agree with Todd Burch that I love the idea of having a second pair of eyes as a backup system, but I'm certainly not going to abdicate personal responsibility in favor of a beta program. My hand will stay on the wheel as advised by the manufacturer.

That said, regardless of the circumstances regarding the Joshua Brown tragedy, my heart goes out to the family. RIP JB.
 
Even understanding the limitations of the current Autopilot System this incident will still make me take stock and pay closer attention.

The system is very good on the interstates in its current form and can lull you in to a false sense of security. I have used it more than 20,000 miles and overall I find it to be a fantastic driver assistance feature.

RIP Joshua Brown and I hope by your tragedy many more of us are safer.

Mike
 
Tesla is saying the car can see these objects but ignores them because they think they're overhead signs.

They can't have it both ways here.

It isn't having it both ways. It is the issue of separating the "noise" from the valid data. Radar measures how fast something is moving toward or away from the car, based on the changing distance to the car. That is why stationary objects (signs) and objects moving laterally (whose distance changes at the same rate as the road's) are difficult to separate out. Their Doppler shift is the same as that of the road surface and the street signs. You receive the signal; you just have trouble separating it out. Imagine a blue sheet of paper on a blue wall. Both are exactly the same color. You see the paper, in that your eyes receive the reflected light. You just have trouble separating it out.
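A toy sketch of that separation problem, with made-up numbers (this is not Tesla's or Mobileye's actual algorithm): a stationary sign and a trailer crossing perpendicular to your path both close on the car at roughly the car's own speed, so a naive "ignore anything that looks stationary" filter discards both.

# Illustrative only: assumed ego speed and a crude clutter filter.
EGO_SPEED = 29.0  # m/s, roughly 65 mph (assumed)

# (name, target's own speed component along the radar beam, toward the car)
returns = [
    ("overhead sign",    0.0),    # stationary, so no motion of its own along the beam
    ("crossing trailer", 0.0),    # moving fast, but nearly perpendicular to the beam
    ("slower car ahead", -20.0),  # pulling away along the beam at 20 m/s
]

for name, beam_speed in returns:
    closing = EGO_SPEED + beam_speed               # range rate the radar actually measures
    looks_stationary = abs(closing - EGO_SPEED) < 1.0
    verdict = "filtered out as stationary clutter" if looks_stationary else "tracked as a moving target"
    print(f"{name:16s} closes at {closing:5.1f} m/s -> {verdict}")

# The sign and the crossing trailer produce the same range rate as the ground,
# which is the blue-paper-on-a-blue-wall problem described above.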
 
...The Tesla and Mobileye problem is that we only learn about this AP shortcoming after an accident has already happened...

I believe they have known for a long time that there are shortcomings in their system, and that's why they pass the responsibility on to owners.

I think there are two groups:

Google and others feel that, in addition to the basic sensors of cameras, ultrasonics, and radar, there is a need for LIDAR as well.

Tesla and Mobileye feel they can match the performance of the group above without the need for LIDAR.
 
Correct me if I'm wrong, but isn't Autopilot in its current state meant to only be used on a center divided highway like an interstate, and not roads that have intersections?

The Tesla blog said the accident happened on, in its words, a "divided highway."

So whether a high-speed road has intersections or not, it still fits the stated criterion of "divided highway."
 
It appears that both parties would have some sort of fault in this situation.
I spend an enormous amount of time in the vehicle and Autopilot is activated around 90% of the time. I'm very familiar with what triggers its braking.
I'm wondering if there were other vehicles in the left lanes as the car approached the intersection, possibly limiting the view of opposing traffic in the turning lane about to come into view while also masking the approach of the Tesla in the right lane.

Not seeing the trailer because it was too high I can understand and have experienced, but how did it miss the tractor pulling in front first? That type of situation slows my car every time. Very curious...
NEED MORE INFORMATION: speeds, surrounding traffic, the sun's effect on visibility.

He has proven to have a dash cam in the past; I wonder if that will help shed some light on the circumstances leading up to it.
I drive over 1,000 miles a week, mostly on Autopilot, and I know you need to pay attention and help "train" the Autopilot system. It comes with the territory of beta testing, which, from his obituary, was something he was savvy about.

[Attached images: car view of the accident site; truck view from the left-turn lane before the accident]
The intersection in question:
Google Maps

Excerpt from the Police report below-

In a separate crash on May 7 at 3:40 p.m. on U.S. 27 near the BP Station west of Williston, a 45-year-old Ohio man was killed when he drove under the trailer of an 18-wheel semi. The top of Joshua Brown’s 2015 Tesla Model S vehicle was torn off by the force of the collision. The truck driver, Frank Baressi, 62, Tampa was not injured in the crash.

The FHP said the tractor-trailer was traveling west on US 27A in the left turn lane toward 140th Court. Brown’s car was headed east in the outside lane of U.S. 27A.

When the truck made a left turn onto NE 140th Court in front of the car, the car's roof struck the underside of the trailer as it passed under the trailer. The car continued to travel east on U.S. 27A until it left the roadway on the south shoulder and struck a fence. The car smashed through two fences and struck a power pole. The car rotated counter-clockwise while sliding to its final resting place about 100 feet south of the highway. Brown died at the scene. Charges are pending.

