Idea to Counter Autopilot FUD

One problem with trying to express the safety benefits of Autopilot is that you usually only hear about the bad news--accidents and crashes blamed on Autopilot. You generally don't hear about the "saves" because--well--no accident, so nobody's interested.

Because of Tesla's connectivity, a thought occurred to me.

Based on statements from Elon/Tesla, Autopilot is always running on Autopilot-equipped cars, even when it's not engaged and controlling the car. This is one way of testing new versions before they're activated, as well as collecting massive amounts of data.

Since Tesla has this information, it should be able to estimate (to some extent) the number of accidents Autopilot avoided. Of course, you can't say definitively that an accident was avoided when it never occurred in the first place, but you can give an idea of cases where there probably would've been one.

Massive swerves on the interstate to avoid a car trying to change lanes into you could be logged (at least one of which was captured by dashcam and posted on YouTube, ironically, by the driver who later died in the AP accident).

Hard braking to avoid a frontal collision could be logged.

Emergency braking when a car pulls in front of you could be logged, as famously documented in another YouTube video.

With all of this data logged, Tesla could present some reasonable information on accidents possibly or probably prevented by Autopilot, and the vehicle logs would give an idea of which ones had at least a possibility of being fatal. A sketch of how such events might be flagged follows below.
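
To make the idea concrete, here's a minimal sketch of how such "save" events might be flagged from vehicle logs. Everything in it (the record fields, thresholds, and labels) is my own invention for illustration; Tesla's actual telemetry format and criteria aren't public.

```python
# Hypothetical sketch: flag probable "saves" from simplified vehicle log records.
# All field names and thresholds here are invented for illustration; Tesla's
# actual telemetry format and criteria are not public.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LogRecord:
    speed_mph: float      # vehicle speed when the event occurred
    decel_g: float        # peak longitudinal deceleration (positive = braking)
    lateral_g: float      # peak lateral acceleration (swerve severity)
    ttc_seconds: float    # estimated time-to-collision at intervention
    ap_active: bool       # whether Autopilot was engaged and controlling

def classify_save(rec: LogRecord) -> Optional[str]:
    """Label a record as a probable accident-avoidance event, or None."""
    if not rec.ap_active:
        return None
    if rec.lateral_g > 0.5 and rec.speed_mph > 50:
        return "evasive swerve (highway)"
    if rec.decel_g > 0.7 and rec.ttc_seconds < 1.5:
        return "emergency braking (imminent frontal collision)"
    if rec.decel_g > 0.4 and rec.ttc_seconds < 3.0:
        return "hard braking (cut-in or sudden slowdown)"
    return None

# A hard swerve at interstate speed would be flagged as a probable save, and
# the speed field gives a rough proxy for whether it could have been fatal.
event = LogRecord(speed_mph=68, decel_g=0.2, lateral_g=0.7,
                  ttc_seconds=1.2, ap_active=True)
print(classify_save(event))  # -> evasive swerve (highway)
```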

Not bulletproof, but it would help Tesla build its case, with real-world data, for the advantages of the system.

No other automaker has that, because no other automaker gets all of that real-world data.

Thoughts? If this info is in Tesla's upcoming blog post re: Autopilot, it could be brilliant.
 
Tesla should be able to use that data in a presentation about AP performance that goes way beyond crude fatality counts. Since there has only been one fatality, that number doesn't mean much either way.

The deeper reason for Tesla to think AP is safer is that the data on all the other collisions and safety issues, including traffic-law violations, over the millions of AP-driven miles compares favorably to owner-driven miles. It helps that the data can compare the same driver with AP engaged and not.

If the critics' arguments about driver distraction are valid, it should show up in the data on near misses and non-fatal collisions, not suddenly pop up in a fatal crash after 150 million miles.
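
As a back-of-the-envelope illustration of that comparison, here's a tiny sketch computing incident rates per million miles for AP-engaged versus driver-only miles. The function and every figure are made up purely to show the math; only the 150-million-mile number echoes the post above.

```python
# Hypothetical sketch: compare incident rates per million miles for the same
# fleet with AP engaged vs. driver-only. Every number below is made up purely
# to illustrate the comparison; only the 150M-mile figure echoes the post.

def rate_per_million_miles(incidents: int, miles: float) -> float:
    """Incidents (near misses, collisions, violations) per million miles."""
    return incidents / (miles / 1_000_000)

ap_miles, ap_incidents = 150_000_000, 900            # AP-engaged miles
manual_miles, manual_incidents = 600_000_000, 5_400  # same cars, AP off

print(f"AP engaged: {rate_per_million_miles(ap_incidents, ap_miles):.1f}/M mi")
print(f"Driver only: {rate_per_million_miles(manual_incidents, manual_miles):.1f}/M mi")
# If the distraction critique held, the AP-engaged rate for near misses and
# minor collisions should already look worse, not just the fatality count.
```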
 
Referring to AP as a suite of DriverAssist features wouldn't hurt either, especially in the short term. The car is at Level 2 autonomy today, arguably Level 3 once it can manage traffic lights and stop signs. And that's it for now, at least until the incorporation of what's been colloquially referred to as AP 2.0.

Driver Assist implies assisting the driver, which is exactly what autopilot does for an aviator or the skipper of a floaty thing. And aviators and skippers/captains understand this clearly: no reliance *ever* upon a sole source for navigation, and so forth. See the tragic case of the Aegean off the Baja coast, and countless other cases, for why.

But the general public? They need all the help they can get. I wouldn't trust some of these people to drive a blender - and that's on a daily basis here in SoCal. "Autopilot, a collection of driver assist features, can and does save lives...." "Autopilot, otherwise known as DriverAssist..." and so it goes.

Absolutely, the future view is Autopilot - no question. But as long as you've got yahoos uploading video from the back seat, the Great Unwashed will misconstrue DriverAssist less than Autopilot, period.

The high road in the meantime goes to Tesla for even attempting to embark upon a massive education campaign for essentially the entire world at this point.
 
Tao, I'd expect the first actual Level 3 skills would be on the highway/protected roads. Level 3 means there are some circumstances under which it's OK to take a nap/watch a movie/etc. I'd say it's already almost there on Interstates. It's got a long way to go off the highway, at least in the Northeast.
 
No, not a nap. L3 requires a "fallback-ready driver." She can watch a movie, though.
 
It was interesting to see that 7.5 years ago, AP was considered (1) misnamed and (2) Level 2 autonomy.

In 7.5 years, nothing has changed.

Interesting how much 'buzz' can twist people's impressions. Most of the general public and a lot of owners don't consider AP misnamed, yet it still doesn't do what they expect it to do (like allow the driver to text while driving). And most of the general public and a lot of owners would agree with the statement "Tesla is a leading-edge tech company." Seven and a half years is a long time to sit on the edge and not make it to Level 3 autonomy. But don't worry, in two weeks when V12 is released.......
 
Haha, so true, Dewg. The way I've come to view this entire situation is that effective FSD is really hard, and Elon may really have believed they had a chance to conquer it in the time frames he used to give. I still believe (not through any direct evidence, so maybe I should say I "opine") that Tesla's approach, the Elon way of failing your way to success, now pivoted to real AI and not just pretend AI (300K lines of C++), is the approach most likely to succeed first.

I would not be surprised if, after the dust settles and Isaacson writes volume II (or whatever), the method to the apparent madness will reveal itself, and I have my theories on what that method was meant to be.

But I knew, and probably some others figured as well, that when the FSD people started jumping ship 2-3 years ago, it was because they could not stick around knowing full well that Elon's time-frame predictions were FOS. And I suspect Elon had a pretty good idea of it too, because he's Elon and probably doesn't do all that much self-deception. You just don't achieve the kind of great results he has without a reasonably firm grip on reality (which is not the same as having a firm grip on how your actions will be judged in the court of public opinion).