
Model X Crash on US-101 (Mountain View, CA)

I will. Don't think it's a coincidence I drive the route regularly. I was not aware I could do that...
The "Bug Report" feature doesn't collect any useful Autopilot data; it's only good for UI/infotainment/other car-state debugging.

Additionally, it's not automatically collected from the car, only when you log some other concern with the car by scheduling an appointment or some such.
 
More information today from Electrek... Tesla under scrutiny again over fatal crash on Autopilot - Electrek

Tesla is under scrutiny again over a fatal crash on Autopilot in 2018. Now almost two years later, the National Transportation Safety Board announced “its intention to hold a board meeting February 25, 2020, 9:30 a.m. (EST), to determine the probable cause of the fatal crash of a Tesla in Mountain View, California.” NTSB wrote in a press release:

WHO: NTSB investigative staff and board members.
WHAT: An open to the public board meeting.
WHERE: NTSB Boardroom and Conference Center, 420 10th St., SW, Washington, DC.
WHEN: Tuesday, Feb. 25, 2020, 9:30 a.m. (EST).

Previously released phone data about the accident also showed that the driver was playing a game on his phone around the time of the crash — though it’s unknown how engaged he was with the game, or if he was even holding the phone. Tesla’s involvement in the new public hearing is not clear at this point.
 

Time has been moved to 1 PM EST. Here is a link to the webcast:

NTSB Live
 
Watching this now. NTSB stated that they made recommendations to 6 car manufacturers and 5 responded. Guess who didn't? Seems they are pretty pissed about Tesla not responding to their request so now Tesla is under the microscope. Right or wrong, probably could have been avoided by responding.
 
Thanks for the link to the live feed. Tuning in about 45 minutes late. Hope the report is published afterwards. I see the text is being captured.

Is this web streaming of a review typical or is it focused on Tesla? Any idea how long this meeting will go on? Sounds like the Vice Chairman wants to shut down AP.

Report on Mt. View (Model X) and Delray Beach (Model 3) will be released in about 15 days.
 
NTSB is focusing on 3 AP failures and not considering the data that shows AP miles are safer. Falls into the "perfection is the enemy of progress" bucket.

Wouldn't be surprised if everyone involved in this hearing had copies of the questions ahead of time and most of this event is well scripted. Seems they have an agenda here; show Tesla that there are repercussions to ignoring requests from the NTSB.

My takeaway is that the NTSB wants to shut AP down to teach Tesla a lesson, not save lives. Otherwise they would use more than 3 data points.
 
Here is part of the transcript so far:

[missed first hour or so]

On page 42 of the accident report, we talk about the probable cause of the Delray Beach crash, and the third sentence in the probable cause reads, "further contributing to the crash was the failure of NHTSA to develop a method of verifying manufacturers' incorporation of acceptable safeguards for vehicles with Level 2 automation capabilities that limit the use of automated vehicle control systems to the conditions for which they were designed," which we recommended following the Williston crash in 2016, but there was no action on that.
Unfortunately, NHTSA has taken a hands-off approach on regulating AVs. NHTSA has informed us it plans to ensure safety through its enforcement authority and surveillance program. So I read NHTSA's enforcement guidance bulletin on safety-related defects and AVs. It says, "NHTSA is commanded by Congress to protect the safety of the driving public against unreasonable risks of harm that may occur because of the design, construction, or performance of a motor vehicle or motor vehicle equipment and to mitigate risks of harm, including risks that may result from automated safety technologies."
So, Doctor, in your opinion, has NHTSA protected the safety of the driving public against unreasonable risks of harm that result from automated safety technologies, and what does the draft report say about NHTSA's approach to the oversight of AVs?
>> So, no. I believe NHTSA has not taken the approach that is best for safety in this situation. In the draft report we say it is a misguided approach, and the Culver City crash -- I mean the Delray Beach crash that you referred to -- I think, for my staff and myself, there's nothing more disappointing than investigating a crash, coming up with a good solution, having no response from Tesla -- and in NHTSA's case, "we don't need to do that" -- and it not happening.
>> To clarify, the Delray Beach report will be out when?
>> The Delray Beach report will be released as a companion to the Mountain View report, which is about 15 days from this board meeting.
>> We find in this report that NHTSA's approach is misguided because it essentially relies on waiting for problems to occur rather than addressing safety issues proactively. I'd agree.
Thank you.
>> Thank you very much. Member Graham.
>> Thank you, Mr. Chairman. I appreciate the staff's work on this, this being my first board meeting for a crash like this. And I can see why this crash is hearing-worthy and why we have it on the docket here, because this is unfortunately a classic distraction crash that included emerging technology that maybe lacks some standards as far as development, and maybe some regulation.
Okay. I'd like to start with your findings that you pointed out early on, specifically one of the first ones there that says none of the following were factors in the Tesla driver's actions in this crash: one being driver licensing or qualification, and two, familiarization with the vehicle and roadway.
With that being said, what's required to get a license? A driver's license?
>> In order to get a driver's license in this case, in California, it's required that you pass a written test as an initial step, and you have to take a driving test. Once you have accomplished that and passed the physical requirements such as eyesight and other medical requirements, you have a driver's license.
>> Okay. So that's a one-time deal, maybe early on, maybe in high school?
>> That's the initial step. Obviously, you have to get renewals, and sometimes you need renewals of eyesight tests and that type of thing.
>> There's no requirement. Most of a driving test is manipulating the controls of a vehicle. There's probably no testing on automation of the vehicle, cruise control, anything like that? Is there any of that requirement for that?
>> No, there's not for that. However, for different types of vehicles -- if you're going to be driving a motorcycle, there's a different licensing regime. For a large commercial vehicle, you have additional requirements to meet.
>> Okay. Thank you. So there is no required training for this autopilot system, autosteer, anything like that? And is there any across the board for any vehicle, a passenger vehicle like this that has this kind of technology?
>> No. The only type of training that comes in is whatever the specific dealership decides to do as far as providing training on the new technology that's provided in the car when you're purchasing a new vehicle.
>> Okay. So the dealership is providing training maybe, would you say? It's not required to? And there's no requirement or regulation out there that requires somebody to receive training on this new technology or automation.
>> No, there's not.
>> Okay.
>> Under the autopilot limitation the vice chair had brought out earlier, it talked about beta. You basically stated that beta is still in development. So it's -- beta means they're still testing it. Is that correct?
>> That's correct. And Tesla has also advised that they plan on referring to it as beta for the foreseeable future, because there are constantly changes being made. One thing of interest that's not necessarily mentioned in the report: one of the abilities of the car is, if you're driving down the road and, for example, the car steers into the gore, drivers have an opportunity to hit a button on their steering wheel and file a bug report.
The bug report is you say there's a problem at this location, and that information goes directly to Tesla and Tesla tries to look for remedies for that. Again, that's part of the development stages that Tesla uses.
>> Okay. So they're using over the air to get the data back from the drivers to see there might be a bug in their software?
>> Yes, that is correct.
>> Okay. So we're basically requiring the consumer in this case to be a test driver, and I don't believe they have received any training on how to be a test driver -- like, I was a test pilot and I actually received test training. Is that correct?
>> That is correct.
>> Okay. I -- it's my understanding, too, that the firmware is passed over the air to update the software. Is that correct?
>> Yes, that is correct.
>> How does a driver know there's been an upload or update to the software?
>> There are notifications actually made on the screen, and more recently, Tesla also provides mobile phone notifications to drivers every time they have to conduct the update.
>> Okay. But they don't know what was updated -- in TACC, for example. Is that correct?
>> No. For every update, there is information as to what's in the update, whether a driver actually reads all the specifications is a different question.
>> Thank you. Thank you, Mr. Chairman.
>> Thank you very much. Member Chapman.
>> Thank you for your excellent work on this. It's much appreciated. I want to join in expressing disappointment at the lack of leadership which NHTSA has shown in the area of addressing issues related to the safe development and deployment of vehicles with driving automation systems. I naturally tend to view NHTSA's efforts in light of what I know about aviation. FAA seeks to manage the safe integration of drones into the national airspace system; it continues to prohibit the operation of drones over people and does so by regulation. A typical drone weighs no more than a few pounds, but FAA has maintained this limitation as drone technology further develops. In contrast, highway vehicles weighing thousands of pounds and equipped with powerful yet still-evolving automation are operating on highways based, at best, on mere guidelines. When the technology fails or is misused, the risk to occupants and others on the roads is potentially catastrophic, as we've seen in this case.
As I understand, the NHTSA guidance is for SAE automation levels 3 through 5, and there is no such guidance for level two vehicles. Is that correct? And who decides if a given vehicle is a level two?
>> That would be correct. NHTSA's AV guidance focuses almost entirely on higher-level automation systems, and it is essentially the manufacturer that decides whether their vehicle is Level 2 or a different level of automation. That said, currently on the roads, only Level 2 automation vehicles exist.
>> And given that, is there any incentive currently for a manufacturer to identify its vehicle as a Level 3?
>> I'd rather not speak as to the incentives the manufacturers might have. But there may be potential concern about calling their vehicle Level 3, as then they would be responsible in case of a crash. But maybe more importantly, there's also a lack of proper -- call it a standard or guidance -- from NHTSA to determine what makes a vehicle a Level 3, and at which point does the responsibility of the automated system stop and the driver's begin?

>> And, again, at this point there are no vehicles on the road that are classified or considered level 3.
>> None that are production-level systems.
>> Within the report, there's an indication the driver of the Tesla may have experienced his vehicle drifting into the same highway gore area during previous drives on Autopilot along the same route. You talked about the bug reporting system, which I don't think is a feature on all Level 2 automobiles. I know I have an automobile with some automation; I don't have that option that I'm aware of.
Is there a means of reporting similar experiences which may be the result of some road condition anomaly more broadly? Is there a NHTSA complaint database, for example, or some other means of reporting?
>> Yes. For every driver of a vehicle experiencing an ongoing problem, there is the NHTSA complaint database, and that is what NHTSA is supposed to be using to look for trends in these complaints to determine what they should evaluate and what they shouldn't.
>> And is it well known to most owners of level two vehicles how to access that database or report to that database?
>> I don't have information to answer that question.
>> I'm going to suggest not. But that's probably something we should look at. Then finally, Appendix C cites the Tesla owner's manual, and it details something like 50 warnings about the limitations of the Tesla Autopilot system.
From the available research, what do we know about the extent to which people read owner's manuals or heed consumer warnings in general?
>> Regarding the more advanced driver assistance systems, there is recent research, maybe by Consumer Reports or even AAA, that shows that drivers do lack understanding of what their specific system does, its limitations, and how it works. In the four crashes being investigated, the lack of knowledge was really not an issue. Maybe the issue was adherence to the system.
>> Thank you. Thank you, Mr. Chairman.
>> You're quite welcome. Thank you. Before I get into the questions directly to staff, I think the public needs to know, we generally think our reports should come out within 12 to 15 to 18 months. Next month will mark the two-year anniversary for this one. So I know you've had a lot going on -- the last two years have been very busy for the Office of Highway Safety.
How has this impacted the ability to bring this one to the board? Talk about that.
>> Thank you, Mr. Chairman. About the time we took on the Mountain View investigation -- in that same week, actually just prior to the Mountain View investigation -- we launched on the bridge collapse investigation in Florida, and we also launched on the Tempe pedestrian crash involving the Uber. So that created some complexities in running more or less on the same schedule. We've had board meetings in October, November, and now February, back to back. There's a limit to how much my staff can do at a time. In addition, as we were concerned about how widespread some of these issues are with regard to Autopilot or Level 2 systems, we had the opportunity to pick up a case in Delray Beach, Florida, that we've talked about today. That's less than a year old. We needed to finish our investigation of that and incorporate it.
Those are some of the reasons.
>> You've got a lot going on. To be very clear, you do very good work. As you pointed out: March 15, the pedestrian bridge collapse; three days later, Tempe; and five days after Tempe was this crash. Just right in there, in eight days. Plus you've got a limousine crash with 20 fatalities. You have a lot going on. I think it's important the public understand we're not just sitting around doing nothing.
I do want to talk about cell phones. There are a number of areas I will explore in my line of questioning. There's been a lot of research showing that cell phones, PEDs, smartphones can be potentially even addictive. I know I sure wouldn't want to do without mine. I'm somewhat addicted to it.
There's an article that further explores how addictive these devices can be. People want that dopamine fix of knowing that somebody responded to a text message or posted something on a social media site. So I think about Apple and the other smartphone developers in this case. I want to talk specifically about Apple. They've created something that people are, in some cases, literally addicted to. In fact, Apple is the biggest manufacturer of smartphones in the U.S.
So given that, I'd say that Apple has yet to recognize its own responsibility as an employer; that they have failed to say, of their over 135,000 employees, that we care about you, and we don't want you to go out and kill yourself or others on the roadway. Apple has failed in that respect, and Mr. Carol, I thought I would ask you about this, but in the interest of time, I won't.
But in the docket, there's an e-mail to you from the Network of Employers for Traffic Safety, and they do point out in that e-mail that banning cell phone use resulted in measurably fewer crashes and a lower percentage of vehicles involved in those crashes.
So it is very important for organizations to adopt strict PED policies, and Apple has fallen down in that role.
We've seen several crashes where people have been on phones in some form or fashion. I won't even try to enumerate them right now, but I will say we've seen at least five crashes where people have been operating Level 2 automation and they've been fiddling with a cell phone or something. We couldn't quite figure out a few of them, actually. But yeah, Tempe was our last board meeting, where the driver of -- not a Tesla, but an Uber ATG, automated technology... whatever the ATG stands for... group -- that driver was watching a movie. We found that -- NHTSA found, in a 2019 study, that nearly 10% of drivers are using PEDs at any one given moment. So this is a problem. It requires a lot of work. We have a long way to go. Vice Chairman, all yours.
>> Thank you. I'll follow up a little bit on that line of questions.
Do humans multitask well?
>> Without going into specifics of attention and tasks, a good rule of thumb would be that when we perform two or more tasks at the same time, our performance on each of them is worse than performing them individually. So there is a cost, for example, when talking on a cell phone while driving. We may be able to maintain the position within a lane and a constant speed relatively well, but may miss an exit we intended to take or be too slow to respond to a vehicle ahead.
>> So is there science -- do we have science to back this up that says that, you know, this multitasking thing that people say -- they used to put it on resumes; hopefully they're learning that's not a good thing to do now. Do we have a scientific basis for this?
>> Yeah. There is a lot of research. Multitasking on a resume is different.
>> So I think in the Tempe crash, you coined the term "automation complacency." Talk a little bit about that.
>> Well, I can't take credit for the term, but automation complacency in driving -- the driving task, outside of controlling the vehicle, includes sustained attention to the environment. And when we automate vehicle controls, it becomes more challenging to monitor that environment, especially when the automated system performs pretty well and we learn to trust it. At that point, we might be more likely to start mind wandering -- so unintentional complacency -- or purposely engage in a distracting task such as checking our phone. With automation complacency in driving, you know, this is an expected risk.
>> So it would be fair to say that this is a foreseen problem with level two vehicles?
>> Correct.
>> Okay. How effective is autosteer at lane keeping? Specifically in a Tesla, in this case. I mean -- and I don't intend this to be a rhetorical question. In this case, obviously, it didn't work. But --
>> Well, overall, autosteer does a very good job at lane keeping and centering most of the time, and that is where the problem arises and ties in with automation complacency and so forth. Because it kind of fosters a false sense of security that the system's going to work, and it results in an over-reliance on a system that shouldn't be over-relied upon.
>> It kind of sounds like we have one foot on the dock and one in the boat here in this automation business, which you know what that leads to.
Does the quality of the paint marking make any difference?
>> Yes, it does.
>> And when the system senses that it's losing it, is there some way for it to inform the driver that, hey, you got it, I'm out of here?
>> Yes. If there's not enough data from the camera or the sensors, it does sometimes allow you to receive an alert about autosteer being temporarily unavailable. After this crash, in an update to the firmware that Tesla supplied, they added an alert for circumstances where, if they detect that the hands are not on the steering wheel and it loses lane line acquisition, it will provide more of an immediate alert. So they tried to come out with a fix that would address this specific crash.

>> Interesting. Okay. I'll defer until the next round. Thank you, Mr. Chairman.
>> Thank you very much.
>> Thank you. Just picking up on the autosteer issue. The driver of this vehicle was part of a Tesla Model X Facebook group, which I joined just to see what was being discussed in the group. I'm probably now being removed from the group. But I wanted to see what the discussion was on autopilot and autosteer, and there seemed to be some issues with autopilot and autosteer, with, unfortunately, people taking videos of their vehicle leaving the road and them having to bring it back.
So my first question is, Doctor, do you think people understand the technology and the limitations of the technology?
>> So I think people understand conceptually what the technology does. I think on the limitations of the technology, that's something they don't fully understand, and those limitations, how they will impact the safety, they don't fully understand.
>> Yeah. And that seems to be clear from a lot of the discussion in the group. There were a lot of questions, and a lot of people didn't really know that there were areas where they were operating that they shouldn't be. So I do think there are some educational issues. And Member Graham had asked about whether you get notifications when your vehicle is updated. I wonder how many people actually read that. I know when I update my phone, which is very different from a vehicle, I don't read what the updates were on my phone. So I wonder how that translates when you're dealing with a vehicle as well.
I think after this board meeting, there will be some people who say we're just picking on Tesla, and this was obvious: this was somebody who was distracted; they were gaming; the person experienced a similar situation with the vehicle moving toward the gore area twice before; and California should have fixed their attenuator.
So I'd ask you, Doctor to return to the first. Are we just picking on Tesla, and second, why do we look at other issues beyond just what happened?
>> So the reality is our interest in automated vehicles on the highway began in 2014, as we were being briefed by Google -- now Waymo -- about their systems. I think a crash happened in '72 where the board found the operator's over-reliance on the system can be manifested in such a way that as they grow more confident, they become more reliant, and then they'll be willing to do other things. It's something we've known for a long time with automation. We started there. It just so happens that of the crashes we've investigated, four have been Tesla, one has been Uber, one has been Navya. In each of those, we're finding some of the automation issues we've seen in other modes here. So of the recommendations we made, basically one of them -- a couple of them -- are to Tesla, because they've got a specific problem they need to fix.
The reality is that recommendations to NHTSA and to the other auto manufacturers are to show this is not a unique problem to Tesla. This is a problem related to automation and how people work with automation.
>> Great. Tesla did make some changes to the vision system processing software after this crash. Would that have -- would those changes have prevented the Mountain View crash? In other words, can the new system detect unusual or worn lane markings?
Second, did Tesla and NHTSA implement our safety recommendations and what is the status of those?
>> From our understanding, in discussing the updates that Tesla has made regarding their vehicles, they have added some features. So, for example, if it loses lane line acquisition or is unable to predict a lane and your hands are off the wheel, it will give the driver a warning. That hopefully will help. They've improved some of their detection ability with gore areas. So those are good things.
But as we've talked about before, fixing problems after people die is not a really good approach to highway safety. And then also, with regard to both Tesla and NHTSA, you know, we've received no response from Tesla regarding the recommendation to limit the use of their vehicles to their operational design domain. Then we received word from NHTSA that they don't intend to take action on our recommendation.
>> Thank you. And just one last point. In this report we are considering redesignating those as open unacceptable action for both Tesla and NHTSA.
>> So in the report, for the NHTSA recommendation, the current recommendation status is unacceptable. We are changing it for the data recorder recommendations. For the other two, they were already unacceptable.
>> Okay. Thank you.
>> Thank you. Member Graham.
>> Thank you, Mr. Chairman. I just want to make a comment following my last comment about the training, or lack of, and qualifications for vehicles. Driving is the most popular form of transportation in the United States. And it's also the mode that kills more people than any of the other modes of transportation. It's ironic that it's become acceptable that those of us who get behind the wheel are probably the least trained when compared to all the other modes and the vehicles we operate in those modes. I'll switch here a little bit and talk about the lane markings and everything like that, since the Vice Chairman had brought it up.
I want to talk about the gore area. What is a gore area?
>> The gore area is, in this case, a triangular-shaped area, usually the dividing section between the main line of a freeway and the exit lane of the freeway. It's a separating area.
>> And how is a gore area typically demarcated?
>> It's typically demarcated with -- well, I can speak for California. It's generally an eight-inch-wide white reflective line, to separate it from the other lane lines, which are generally four to six inches wide.
>> Okay. So there's no standard across the U.S. for a gore area, then, for marking?
>> Yes, there are standards. It's in the Manual on Uniform Traffic Control Devices as to what is required for lane lines at all locations, including a gore, as far as the minimum specifications.
>> Okay. So the markings are strictly visual, then. Is that correct?
>> They're visual markings. There's also, out of a previous crash that we investigated in San Jose under very similar circumstances to this, where a Greyhound bus impacted a crash attenuator -- we recommended that all gores at left exits have supplemental cross-hatching or chevrons implemented in those locations.
>> Still that would be just a visual indication, correct?
>> That's correct.
>> Okay. With that, all the lane markings and everything out there, I assume, have been designed over the years for the human eye. Would that be a correct statement?
>> Yes.
>> Okay. Now the autosteer function of this Tesla and, say, other vehicles that have a similar technology, are they using pretty much a visual system to keep within the lane?
>> It really depends on what manufacturer you're talking about, but generally, with autosteer and the Autopilot system, it relies primarily on the vision system to predict the path of travel.
>> Okay. So the technology's been designed basically along the lines of the human eye is what we're looking at here.
>> Well, there are other manufacturers that have built in additional redundancy, such as having the area fully mapped with GPS mapping, so that if you're out of the lane it will tell you that. Some manufacturers use lidar to actually map the area, so they have higher-definition information as to the area the vehicle is passing through.
>> Okay. So, thank you. Not everything is visual as far as the technology goes. So maybe our road designs and markings need to be looked at and new standards set for them.
I know I had a discussion with staff in our pre-meeting, and everything about that area was a visual indication that you were outside the line of traffic, and there was actually fading on the striping, or the line itself, that probably caused the vehicle to steer into the gore area. It's kind of unfortunate that maybe we don't think outside the box here a little bit and recommend some grooved rumble strips or something like that to get the attention of a driver who's not paying attention. Maybe that might have helped in this case or in other cases, but I think it's something worth looking toward in the future. Thank you, Mr. Chairman.
>> Thank you, member Graham. Member Chapman.
>> With regard to driver distraction, one of our colleagues here at NTSB recently showed me how to activate the do not disturb while driving function on my mobile phone. I don't think my teenage son would describe me as tech savvy, but I do think of myself as relatively proficient with technology. And I was frankly unaware I had such an option on my phone. And that, of course, is the point. It makes more sense to turn on such a feature as the default setting to better ensure that users are aware that such an option exists. Taking that approach, using it as the default setting would require that users make a special effort if they choose to turn the function off rather than requiring them to make a special effort to turn the function on.
Have we spoken to any of the cell phone manufacturers specifically about the idea of locking users out while driving? Are they amenable to this approach?
>> Staff has reached out and had some discussions with Apple prior to this meeting, and the primary discussion was related to the innovations that Apple has initiated, such as Do Not Disturb While Driving and their CarPlay in-vehicle system. We did address issues such as the need to automatically lock it out.
We are also concerned that this is a voluntary application; a study done by the Insurance Institute for Highway Safety found that only one out of five people have it set to automatically lock out.
So that is why staff is proposing a recommendation for an automatic lockout mechanism being installed on all portable electronic devices as a default setting.
>> Appropriately, the draft proposes that NHTSA and SAE International should work cooperatively to develop performance standards for driver monitoring systems to help minimize driver disengagement, prevent complacency, and account for foreseeable misuse of the automation.
Do we have evidence there are systems more effective than steering wheel torque and specifically, what do we know about how well eye-tracking systems work?
>> As to the evidence, because there are no standards or guidance, we cannot say that there are. There are too few vehicles with different types of systems to make an inference based on crashes, something that IIHS might do.
That said, most vehicle manufacturers use some type of steering wheel interaction as a metric to infer driver engagement. After the Williston crash investigation and, again, today, we stated we do not believe steering wheel torque monitoring is an effective metric for inferring driver engagement. For that reason, we cannot say a hands-off period of zero seconds is good enough, simply because hands on the steering wheel may not necessarily indicate the driver is attentive.
Most vehicle manufacturers, as I said, use some type of steering wheel interaction for engagement. One uses a camera that tracks head position and movement. As far as I know, the systems that might be used in production vehicles at an affordable price track only head position rather than gaze.
>> One last question I want to circle back to, the discussion of road markings. We know state transportation departments often struggle with funding to keep roads safe and maintained. To what extent can we expect roads to be designed to accommodate automated driving system technology, and is it realistic to think state DOTs will be in a position to respond?
>> As part of this investigation, we saw that this crash may be an opportunity for recommendations in this area -- even, you know, whether you need to have traffic control at intersections. Because of the action being taken there, we basically proposed a finding in this area, and we are obviously going to try to monitor what's proceeding, because we do believe the infrastructure plays a critical role in the performance of the automation.
>> Thank you.
>> You're quite welcome, member Chapman. Thank you. I want to follow up on something that member Chapman said.
In 2011 -- I think we're proposing to close that one out, because apparently we never received a reply from them. So we are going to close that as no longer applicable and then create a new recommendation directly to the cell phone manufacturers, I believe.
But once again, here's a recommendation where some organization, some entity, decided they didn't even need to write us a letter. Is that basically correct?
>> I'd say there's a slight difference. They did actually respond to us, but they didn't inform their members of the issues that we outlined in the recommendation, as the other recipient did. They talked about some other things in their letter to us, which is not appropriate. That's why it stayed open in await-response status. We didn't actually get a response that was on point; it was tangential to the recommendation.
>> Bottom line is yes, we have weighed in on that and yes we are going to pursue it in this board meeting. We will entertain a recommendation to further get that point out directly to the cell phone manufacturers.
If you could pull up slide 27.
While we're pulling that up, I'm curious about the role -- on slide 27, it appears the lane marking, the right-hand marking of the gore, the one that does not have the red arrows on it, looks very faded. Is it because of that ill-painted, mostly faded line that the camera system of this Tesla SUV thought there was no line there and started hugging what is the left line, the one with the arrows?
>> We looked at a number of factors because we wanted to answer that -- why the vision system steered it into the gore. While we think that was potentially part of the issue, we also looked at the effect of the bright light. We looked at the firmware that was in place. In this case, there was a lead vehicle about 80 feet ahead, potentially obstructing the vehicle's view of the lane lines. The one thing we do know is that the vision system -- the cameras in conjunction with the computing software -- is what makes the prediction of where it goes. We know, because of the limitations in the system, that's what caused it to make a left steering motion. One other item we looked at: we removed the autopilot computer system from the Tesla, and myself and a recorder specialist from research and engineering went to Tesla with their engineers to try to extract video imagery and so forth from it. Because of the catastrophic nature of this crash, we weren't able to get that. If we had been able to get information from it, it may have told us more as to why the vision system did what it did.
>> So we can't say for certain why it veered into the gore. I'm putting my money on the faded lane lines. But we know it was a persistent problem in this particular area. I think it was Member Chapman who said this driver had reported a number of issues in this particular area. It's not written that way in the draft report, but it is in the docket that, in interviews with the family and friends, about seven out of ten times when he was traveling in the vicinity of this particular gore his car would take a dive towards the barrier. In fact, just a few days prior to this fatal crash, on Monday, March 19th, four days before, he stated in a text message to a friend that that morning the autopilot almost led him to hit the median at the I-85 separation.
So there were numerous occasions when this vehicle of his would attempt, at that location, to head towards the barrier. And you would think that, with the knowledge that this is something my car is going to act pretty squirrely on, even if I'm playing a game through the rest of my ride, maybe I should at least be more vigilant at this particular location.
This kind of points out two things to me.
Semiautonomous vehicles can lead drivers to be complacent, highly complacent, about their systems, and it also points out that smartphones, and manipulating them, can be so addictive that people aren't going to put them down. We've been in here two hours. We're going to take a 15-minute break. We will come back, let's say, at 3:10. We are in recess.


(In Recess.)
 
Live coverage should resume shortly. My impression is that they want to kill all driver-assist functions, stop people from using a phone in the car (seeing as that falls to phone manufacturers -- guess that would include passengers, and how do you differentiate? -- and in-car connections), and require state departments of motor vehicles to test everyone on how to use their car and its features. Given the number of car manufacturers and features, I don't see how that's even possible for the testers. Basically, from what I've heard so far, anyone who uses a driver-assist feature can fall into complacency. So how do they address simple old-fashioned cruise control? You're on the highway with cruise control set, you start to relax your legs, look off to the side, and you can fall asleep since you don't have to have your foot on the gas pedal. Isn't that complacency?

Highway road markings were discussed, including whether there is a national standard for them and how there would be funding to mark everything better or improve the markings. Where they were talking about assisted-driving functions tagged by GPS, etc., no one brought up what happens when there's construction on that highway, or other factors that could influence performance. I don't see many questions, if any, being asked of the board members in response to their questions; it seems like they are going with their own line of thought.
 
Starting up if you want to listen in.

I don't know how some of these guys expect to compel owners of any car to read the manuals, etc., and even if they do read up or get tested on their cars, how you make them drive to the Board's level of whatever. People switch cars all the time, at home or when they travel and rent a car. You could be driving any handful of cars over a number of years, all with somewhat different features.
 
Sounds like they blamed everyone as far as the Mt. View crash goes.

I'll have to read through the list when it's published, as it went by fast. However, they are even calling on employers to have their people not use their phones in the car they are driving. Specifically, they called out Apple because the Mt. View driver was an Apple employee on an Apple phone (he was playing a game on his own; I'm sure Apple didn't ask him to do that).

As I understand it, regarding phones used in cars, there isn't a way to determine whether the driver or a passenger is using the phone. And what about hands-free in-car use? I don't think that was addressed.
 