Why is the NTSB trying to embarrass Tesla in public? Here's how I see it.
The NTSB got involved with Tesla when investigating the 2016 traffic death of Joshua Brown. Brown was driving 9 mph over the speed limit on a divided highway in Florida with the old Autopilot Convenience Feature engaged. Brown wasn't watching the road when a truck pulled across in front of him; the car didn't stop, and Brown was killed. NTSB investigated and ruled that Tesla Autopilot worked as designed. But in the wake of that incident, NTSB sent letters to several automakers asking questions.
Part of NTSB's objection is that Tesla allows Autopilot to operate in situations it thinks the system wasn't designed for. The flaw in that view is that Autopilot was designed for the type of road where Brown was killed. NTSB's position seems to be that any roadway where Autopilot doesn't prevent every accident is one where it shouldn't be allowed to operate; in other words, all Level 2 autonomous driving systems should be disallowed. NTSB is only picking on Tesla because they are a visible target and the main player in the self-driving space.
NTSB has no regulatory authority, so all they can do is try to embarrass Tesla in public into no longer selling Autopilot. That sounds extreme, but it's essentially what they are doing, since Autopilot is a Level 2 system that will inevitably have accidents when drivers abuse it.
Tesla's response is that in a Level 2 system, it is the driver's responsibility to decide what roads are suitable for self-driving features. (I will note, however, that there are roads Autopilot will not activate on, like ones with no lines, and Autopilot restricts speed significantly on undivided highways. Smart Summon only works on private roads.)
Some automakers, like GM, define "compatible roads," and their "Super Cruise" won't drive anywhere else. I suppose NTSB likes the fact that those cars won't drive on 95% of the paved roads in the US. That's a lot less Level 2 autonomy on the road.