Fatal autopilot crash, NHTSA investigating...

Only for those makes and models that would drive on roads in which another car may enter the same road from a side street.

That seems to be a rare occurrence and so maybe will cut down on the applicable cars.

So it seems like you would not drive a car that was equipped with standard cruise control, because it will not brake under any circumstances; further, you would recall all vehicles so equipped. That's a pretty narrow point of view.
 
Not having his hands on the steering wheel is not the issue. I have driven many thousands of miles on AP on all sorts of roads and rarely have my hands on the steering wheel. The issue is that apparently the driver did not have his eyes on the road ahead of him. You can drive the car safely with AP engaged with your hands in your lap, but you cannot drive safely on AP while taking your eyes off the road.
Except that studies have shown that it takes between 5 and 8 seconds to react when your hands are off the wheel, while Tesla has stated unequivocally that their Autopilot may suddenly need a response in less than 2 to 3 seconds. Best to keep your hands nearby; otherwise your car could be out of control for up to 6 seconds. Precious time in a critical situation may very well mean the difference between life and death.
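For a rough sense of what those numbers mean in distance, here is a quick back-of-envelope sketch. The 65 mph figure is just an assumed highway speed and the reaction times are the ones quoted above; none of this is crash data.

```python
# Back-of-envelope: how far a car travels while the driver is still reacting.
# 65 mph is an assumed highway speed; the reaction times are the figures
# quoted above, not measurements from this crash.

MPH_TO_FTPS = 5280 / 3600  # 1 mph = ~1.467 ft/s

def distance_during_reaction(speed_mph: float, reaction_s: float) -> float:
    """Feet traveled before the driver regains control of the car."""
    return speed_mph * MPH_TO_FTPS * reaction_s

for t in (2, 5, 8):
    print(f"{t} s at 65 mph -> {distance_during_reaction(65, t):.0f} ft")
# 2 s -> ~191 ft, 5 s -> ~477 ft, 8 s -> ~763 ft
```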
 
Today's autopilot in commercial aircraft can land the plane. The only things it can't do are taxi and takeoff/initial climb.

Not true in general. Large airplanes are CAT 2 & CAT 3 capable. Smaller commuter planes may not be. CAT 1 capability will, in general, get you down to 200' AGL, at which point the pilot takes over and lands the plane. Also, APs won't maneuver to avoid traffic. True, TCAS will give the pilot a conflict resolution, generally involving a climb or descent, but as far as I know that isn't connected to the AP. Tesla's idea of an AP, something that keeps you on course during the cruise stage of the trip, fits with the AP in my plane. Actually, Tesla will slow down for traffic, which my AP won't do. No pilot would confuse AP with autonomous control.
 
Except that studies have shown that it takes between 5 and 8 seconds to react when your hands are off the wheel, while Tesla has stated unequivocally that their Autopilot may suddenly need a response in less than 2 to 3 seconds. Best to keep your hands nearby; otherwise your car could be out of control for up to 6 seconds. Precious time in a critical situation may very well mean the difference between life and death.
I hate it when people make up numbers like this. I might believe the numbers if they had decimal points in front of them. But I know from my airplane autopilot training that I can take over from a malfunctioning autopilot, including disabling it, in less than two seconds.
 
Except that studies have shown that it takes between 5 and 8 seconds to react when your hands are off the wheel, while Tesla has stated unequivocally that their Autopilot may suddenly need a response in less than 2 to 3 seconds. Best to keep your hands nearby; otherwise your car could be out of control for up to 6 seconds. Precious time in a critical situation may very well mean the difference between life and death.

A drunken tree sloth would have a faster reaction time than 5-8 seconds. If you were watching the road, with your hands in your lap, reaction time to grab the wheel would likely be under 1 second.
 
Except that studies have shown that it takes between 5 and 8 seconds to react when your hands are off the wheel, while Tesla has stated unequivocally that their Autopilot may suddenly need a response in less than 2 to 3 seconds. Best to keep your hands nearby; otherwise your car could be out of control for up to 6 seconds. Precious time in a critical situation may very well mean the difference between life and death.

If you look at automatic emergency braking systems, you will see that it is this delay they intend to mitigate, not the full braking action. Distronic Plus only uses 40% of max braking, and some systems won't brake to a full stop. As I interpret things, they want to avoid creating an accident where there is a false positive and the car brakes so heavily that the car behind rear-ends it. Similarly, they don't want a false positive to bring a car to a total stop. So the idea is to start the braking process to mitigate the driver's reaction time, with the driver then taking over.
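To see why a partial-braking strategy still buys something, here is a minimal sketch of the kinematics. The 40% fraction is the Distronic Plus figure quoted above; the 0.8 g full-braking deceleration, the 65 mph speed, and the 2-second driver delay are illustrative assumptions, not manufacturer numbers.

```python
# Sketch: partial AEB braking scrubs speed during the driver's reaction delay,
# even if the system never brings the car to a full stop on its own.
# Assumptions: ~0.8 g full braking, 65 mph initial speed, 2 s driver delay.
# Only the 40% braking fraction comes from the discussion above.

G = 32.2                     # gravitational acceleration, ft/s^2
MPH_TO_FTPS = 5280 / 3600    # 1 mph = ~1.467 ft/s

def speed_after_braking(v0_mph, brake_fraction, seconds, full_decel_g=0.8):
    """Speed in mph remaining after braking at a fraction of max deceleration."""
    v0 = v0_mph * MPH_TO_FTPS
    v = max(0.0, v0 - brake_fraction * full_decel_g * G * seconds)
    return v / MPH_TO_FTPS

print(f"40% braking for 2 s: {speed_after_braking(65, 0.4, 2.0):.0f} mph left")   # ~51 mph
print(f"Full braking for 2 s: {speed_after_braking(65, 1.0, 2.0):.0f} mph left")  # ~30 mph
```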
 
I hate it when people make up numbers like this. I might believe the numbers if they had decimal points in front of them. But I know from my airplane autopilot training that I can take over from a malfunctioning autopilot, including disabling it, in less than two seconds.
You can take over in less than 2 seconds because you are paying attention. Someone not paying attention, as is suspected in this case, will take significantly longer to react.
 
I'm not sure I understand your point. Are you suggesting that the driver in this case, as an acclaimed user and promoter of AP, was fully aware of the limitations of Autopilot? Evidence suggests otherwise. I see he posted a YouTube video of a trip from Boston to Orlando where he had his hands off the wheel 90% of the time.

When I say "the message isn't getting across," I mean it's not getting across to many (most?) Autopilot users, who drive with their hands off the wheel at least some of the time. This is directly in contravention of Tesla's warning: "Always keep your hands on the wheel. Be prepared to take over at any time." There are people who drive with their heads down, checking email and whatnot. It's certainly not getting across to the media or the public at large, who discuss this incident as an accident with a "self-driving car."

Thank you for clarifying, Breezy. The DMV also requires us to drive with two hands on the wheel at 10 & 2 or 8 & 4 and to take avoidance action at any time too. Is most of America not getting it? Comprehension is very different from free will/judgment/choice, whatever you wish to call it. Yes, I believe he was not only aware of the limitations, but actually acknowledged them in publicly posted footage, yet still endorsed the future promise of the technology.

Contrary to your sweeping statement, my point was to express that TSLA does indeed provide balanced disclosure and warnings of the tech limitations and its beta state, validated by my personal order/delivery experience. Did you not find that to be the case during your buying/testing experience?
 
I own and operate a P85D and use the driver assistance features extensively. That feature is not perfect but the vehicle is much safer with it than without it. I also regularly fly fixed-wing and rotorcraft with autopilots. I also own a software company that has 8 product lines so I fully appreciate the value of beta releases to users and makers. I am biased in favor of the technology making us safer in the long run.

This is all about the intersection of the accident. I just visited the site, which happens to be near where I park an airplane. These are personal observations of the site, along with comments from one person who came upon the accident while traffic was being diverted and from people who have lived very nearby for more than 10 years.

The Tesla driver was traveling east on a straight stretch of Route 27 at approximately 2:40 pm (sun was behind him) and was approaching an intersection with about 480 yards (measured) of line of sight from the crest of the hill to the intersection with 140th CT. That road does not cross to the north side of 27. The only structure at that intersection is a BP station, which sits right on the southwest corner of 27 and 140th. The station is so close that, when pulling out of the gas station onto 140th to continue south on 27, if one does not make an effort it would be easy to block any vehicle turning off of 27 onto 140th from being able to clear the intersection. Additionally, there is a dip off the shoulder of 27 onto the surface of 140th, so braking onto that surface is probably a normal thing to do, making it take that much longer to clear the intersection.

One long-term resident said there have been many accidents at that intersection over the years because of the hill to the west, the speeds, and the structures. There is no signage or other alert west of the intersection warning that there is an intersection just over the hill. Vehicle debris reportedly covered another quarter of a mile on 27 past the intersection (there was no clear evidence, from observation, of where it left the road).
At the posted speed limit of 65 mph, that 480 yards is covered in about 15 seconds. If the Tesla was doing 110, as one law enforcement officer reportedly said to a person passing by the scene, it would have been about 9 seconds, but it is really unclear how he would know the speed at that point. If he is basing it on distance traveled compared to other accidents and the lack of skid marks, he may be way off without accounting for the greater mass of the Tesla and whether or not post-accident braking (at whichever level the driver had selected) took place with the driver assistance disengaged. (Obviously Tesla will have some numbers.)
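The timing above is easy to check; the 480 yards and both speeds are the figures from this post, and the snippet only does the unit conversion and division.

```python
# Sightline timing check: 480 yards from the crest of the hill to the
# intersection, at the two speeds discussed above.

YARDS = 480
FEET = YARDS * 3             # 1,440 ft
MPH_TO_FTPS = 5280 / 3600    # 1 mph = ~1.467 ft/s

for speed_mph in (65, 110):
    seconds = FEET / (speed_mph * MPH_TO_FTPS)
    print(f"{speed_mph} mph -> {seconds:.0f} s to cover {FEET} ft")
# 65 mph -> ~15 s, 110 mph -> ~9 s
```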
If the tractor trailer was stopped waiting for traffic to clear and the driver then turned his attention to the maneuver, one can see that it could take the trailer more than a couple of seconds to clear the intersection. Nothing coming. No need to rush, even if he could. And his attention is devoted to the turn. He reportedly saw nothing but just felt the impact. It would appear to me, given the nature of the intersection, that his comments could be consistent with a Tesla traveling at the speed limit or faster.
 
Thank you for clarifying, Breezy. The DMV also requires us to drive with two hands on the wheel at 10 & 2 or 8 & 4 and to take avoidance action at any time too. Is most of America not getting it? Comprehension is very different from free will/judgment/choice, whatever you wish to call it. Yes, I believe he was not only aware of the limitations, but actually acknowledged them in publicly posted footage, yet still endorsed the future promise of the technology.

Contrary to your sweeping statement, my point was to express that TSLA does indeed provide balanced disclosure and warnings of the tech limitations and its beta state, validated by my personal order/delivery experience. Did you not find that to be the case during your buying/testing experience?
I could be wrong, but based on @Breezy's profile he/she doesn't own a Tesla.
 
He reportedly saw nothing but just felt the impact. It would appear to me, given the nature of the intersection, that his comments could be consistent with a Tesla traveling at the speed limit or faster.

Several reports indicate the truck driver saw the Tesla and thought it was okay to proceed. Some of the confusing discussion included comments that the car was moving from lane to lane. Some reasons the car might do those things while on AP would be driver initiation, hills/dips, poor lane markings, or obstruction of sensors (via light or other).

As several in the thread have mentioned, the truck driver most likely made the assumption that the car would yield or change lanes to prevent an accident.
 
Not having his hands on the steering wheel is not the issue. I have driven many thousands of miles on AP on all sorts of roads and rarely have my hands on the steering wheel.

Autosteer is a driving aid, designed to help you stay in your lane. If you have your hands off the wheel, that's against Tesla's guidance and you're using it improperly.

Contrary to your sweeping statement, my point was to express that TSLA does indeed provide balanced disclosure and warnings of the tech limitations and its beta state, validated by my personal order/delivery experience. Did you not find that to be the case during your buying/testing experience?

I don't own a Tesla at this point. But obviously many people who have and use Autopilot don't understand how to use it properly. Maybe this gentleman was a risk taker. Are all people who drive without hands on the wheel risk takers? I don't think we need any more of those on public highways. A closed track is the proper place to explore the limits of your vehicle.
 
A couple of things.

1 - I have an S65 with Distronic Plus. It doesn't see stationary objects at all. The system has saved me by properly calculating stopping distance when I misjudged the situation, and it prevented a rear-end collision when traffic slowed faster than I expected. However, having used it extensively, I know for a fact it would not have reacted at all to the trailer across the road, because it can't see stopped traffic even when it is directly in front of the car. As an example, the car with Distronic Plus active would slam right into a line of stopped cars on a freeway. We as drivers are always responsible. These systems are assistance systems to make driving a little easier. Having driven the Tesla system, it's way beyond what Distronic Plus can do.

2 - Check out this article showing what appears to be a LIDAR-equipped Model S. TESLA OWNER
 
Autosteer is a driving aid, designed to help you stay in your lane. If you have your hands off the wheel, that's against Tesla's guidance and you're using it improperly.



I don't own a Tesla at this point. But obviously many people who have and use Autopilot don't understand how to use it properly. Maybe this gentleman was a risk taker. Are all people who drive without hands on the wheel risk takers? I don't think we need any more of those on public highways. A closed track is the proper place to explore the limits of your vehicle.
Yes, people should have their hand(s) on the wheel. That's what Tesla says to do, drivers know that, and even if it just saves a fraction of a second in reaction time, that could be the difference.
 
A couple of things are clear to me:

Terrible tragedy.

Terrible intersection.

And my fresh perspective, after catching up on the 40+ pages of this thread, is that the reason this is about Autopilot and not just about TACC is that everyone will want to know precisely "Why Didn't The Robot Car Save The Man," as all of AP was active for the entirety of the event.

I imagine there are investigations ongoing on behalf of many parties. How/why it was not typical front-page Tesla news for this long is certainly odd, all things considered. But for their part, Tesla now seems to have revealed what the telemetry says (or what they're allowed to say about it).

Side eco-panels for trailers probably pay for themselves in fuel savings (we already have them on most trailers here), or, if needed, an advertising deal would get them installed on all road-worthy trailers at no cost to the industry.

Intersections are built-in confrontations. This one appears much worse -- a coffee/truck stop that should really only be accessible to one side of traffic. Was the truck driver heading down this local road, or stopping at the stop?

A roundabout or traffic circle here would have eliminated the confrontation, as well as relocating the incorrectly placed Gas/Coffee stop to something slightly off the roadway, accessible to both sides, and still visible from the road. But that costs more.
 
More accurately, wearing a seatbelt makes you feel more secure so you drive faster and corner faster until your perceived level of risk is elevated to your risk tolerance level. Airbags don't affect the driving experience so they don't change perceived risk. Hence airbags and crumple zones don't come into play under risk homeostasis.

I am not sure I agree with that. I heard a similar talk where the speaker suggested something similar: if you wanted safer drivers, put a six inch spike in the middle of the steering wheel. His point was that airbags reduce the cost of screwing up, so they increase risk tolerance, so if you want safer drivers, do things that lower risk tolerance. Of course, the whole thing is an academic discussion, but it is important to drive home the point that we need to take human nature into account when assessing the impact of technology.
 
No, check out this video.
Now consider that the truck driver in question is not at a light but instead on a divided highway, which allows him to pull forward all the way.
Let's count from when the driver in the video says "and we start turning left" at about 12-13 seconds in. One... two... three seconds, and his trailer is already crossing the two lanes of oncoming traffic off screen to his right.

Three seconds is all it took.
Three seconds from being lined up with your turn lane to being perpendicular to the oncoming traffic.

Since police estimated the car was travelling at 65 mph, even if we give the truck 5 seconds, that means the car was less than 500 ft away when he turned in front of it.

I timed that turn in the linked video at 12 seconds from the time he started into the intersection (that's when the maneuver commences) until he said the trailer was straight (not just the tractor). Remember, the tractor in the crash had already turned, crossed the Route 27 median, and crossed both eastbound lanes; only his trailer was still across the eastbound right lane when the impact happened.
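Putting those timings next to the police speed estimate gives a feel for the gaps involved. The 65 mph figure and the 3, 5, and 12 second turn durations are the ones discussed above; the snippet just multiplies them out.

```python
# How far away an oncoming car is when a turn starts, for the turn durations
# discussed above (3 s for the tractor to cross, a 5 s allowance, ~12 s for
# the whole trailer to straighten). 65 mph is the police estimate quoted above.

MPH_TO_FTPS = 5280 / 3600    # 1 mph = ~1.467 ft/s

def gap_needed_ft(speed_mph: float, turn_seconds: float) -> float:
    """Distance an oncoming car covers while the truck occupies the intersection."""
    return speed_mph * MPH_TO_FTPS * turn_seconds

for t in (3, 5, 12):
    print(f"{t} s turn -> oncoming car needs to be ~{gap_needed_ft(65, t):.0f} ft away")
# 3 s -> ~286 ft, 5 s -> ~477 ft, 12 s -> ~1144 ft
```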
 
I don't own a Tesla at this point. But obviously many people who have and use Autopilot don't understand how to use it properly. Maybe this gentleman was a risk taker. Are all people who drive without hands on the wheel risk takers? I don't think we need any more of those on public highways. A closed track is the proper place to explore the limits of your vehicle.

You're a good guy, Breezy... please go test drive a Tesla. You might enjoy the experience, learn a ton, and help further the public education you seek. Unfortunately, knowledge doesn't cure all bad judgment, but it's a necessary foundation. I've also attached a link, Understanding the fatal Tesla accident on Autopilot and the NHTSA probe, which offers a different perspective from yours on the driver's competency, in his own words. I do hope you become a Tesla owner soon.
 