Now we're using Uber as a proxy for Tesla? SMH.
Tesla is not releasing any data on its FSD program (safety or otherwise), so we have to look to analogs to see what problems Tesla may run into with its current plan of limited public (non-employee) testing and its future wide-release testing program (i.e., the topic of this thread). Uber is an appropriate analog because a fair amount of data is available on what is essentially a worst-case scenario. From there, we can look at the circumstances of that accident and see whether Tesla's testing program may run into the same problems.
The report I will be referencing can be found here:
Collision Between Vehicle Controlled by Developmental Automated Driving System and Pedestrian (the PDF download link is on the right side of the page).
Starting on page 22, we get some background information on the driver. A quick summary:
44-year-old female
No alcohol or other drugs in blood
In the 10 years preceding the accident, she had 4 traffic violations, the last one in April 2016 for speeding.
Worked the day before crash
Slept for 7 hours the night before crash
Started her shift at 7:30 pm.
Crash happened at 9:58 pm on March 18, 2018.
She had worked in Uber's autonomous vehicle division, the Advanced Technologies Group (ATG), since 2017.
I would say that her profile is pretty typical of a Tesla owner. If a hypothetical Tesla owner were doing some testing on FSD, it would probably happen after work, around the same time of night. We have already watched videos of non-employee testing happening at 3 am.
Now let's look at the training the operator received before participating in testing.
The training program was three weeks long, with the first two weeks in Pittsburgh. The first week was three days of classroom instruction and two days of familiarization with the vehicle. The second week was closed-course and on-the-road training. Training included encounters with aggressive drivers and jaywalking pedestrians, and during some of the training, motorized dummies were used to simulate pedestrians. Drivers were also trained on the limitations of the system. The final week of training was at the driver's home base with a mentor. Once that was complete, the driver was approved for testing duties.
Now let's consider Tesla. What sort of training program are the Tesla employees receiving? Do the current non-employee testers receive any training? I have not seen anything on either the employee or the non-employee testing, so my assumption is that there is no training. Uber trained its drivers for three weeks and still had an accident after the driver had gained a year's worth of experience. A rhetorical question at this point: is Tesla handling this better or worse than Uber?
About six months before the crash, Uber went from a two-safety-driver model to a single-driver model. Previously, one operator sat in the driver's seat watching for problems, while a second operator recorded any issues where the car did not do something correctly. When Uber consolidated to a single operator, that driver was tasked with both watching for problems and recording the issues encountered.
Prior to the crash, the driver had driven this same route 73 times. She was watching a video on her cell phone and was looking down when the pedestrian walked in front of the car. She claimed she was interacting with the display in the dashboard, but evidence shows she was looking at her phone. She looked up one second before the crash and began steering away 0.02 seconds before impact. She hit the pedestrian at 39 mph.
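To put that one second in perspective, here is a back-of-the-envelope calculation of mine (the 39 mph, one second, and 0.02 seconds are from the report; the distances are just unit conversion):

39 mph × 5,280 ft/mile ÷ 3,600 s/hour ≈ 57 ft/s

So in the one second between looking up and impact, the car covered roughly 57 feet, and in the 0.02 seconds of steering input, barely a foot. Once she looked up, there was effectively no room left to recover.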
This was not her first day on the job; she had traveled this same route many times. It was her fault the pedestrian died. However, the automation was a contributing cause, as the NTSB report says:
"Research pertaining to automation monitoring and operator interaction with automated systems is comprehensive. Across domains, automation complacency is identified as a critical consequence of automation—a decrement in performance that results from less-than-adequate monitoring of an automated system by a human operator."
Now, as that relates to Tesla's automation testing plan, how do we expect untrained non-employees to react to FSD testing on public roads? I'm sure everyone on this forum is an above-average driver and will not succumb to complacency when using the FSD beta software, but think about the people who do not follow this forum. When FSD shows up in their cars, how prepared will they be? What is the potential for tragic accidents? What would the government response be to several such accidents? How good does FSD have to be before that happens? Is it good enough already?
Just some things to think about as we all eagerly await our chance to try this software out.