
Tesla Autopilot Survey (from MIT)

My name is Lex Fridman. I'm leading an effort at MIT aimed at analyzing human behavior with respect to Autopilot by using computer vision to analyze 300,000+ miles and billions of video frames of real-world driving that we collect in Tesla vehicles. We're finding some very interesting and exciting results, and would like to supplement those results through a large-scale survey of Tesla owners to check whether the results hold across the larger population. If you're an owner of a Tesla, please consider taking this survey (it should take 3-15 minutes, depending on whether you have Autopilot enabled on your Tesla). Here's the link:

Tesla Survey

If you do take the survey, I would really appreciate it if you could give advice on things that should be clarified, changed, rephrased, expanded, deleted, etc. You can always email me [email protected] or direct message me here if you would like to take the conversation offline. Here's a video introducing our study:

 
...human behavior...

Google found out quite early that their own employees would place so much trust in an imperfect, experimental partial-automation system that interior video recorded all kinds of driver distractions:

Texting, fumbling around to find a way to charge a cell phone, applying makeup, and, of course, sleeping (interior-camera screenshots).

They then decided to skip research on Advanced Driver-Assistance Systems, because they don't trust humans to pay attention to the road.

Instead, they have concentrated on developing a fully autonomous system, without a steering wheel or brake pedal, to bypass unreliable human drivers.

Is there any dispute over what the raw Google video data shows about distracted human drivers?

How will new studies attempt to find anything different from what Google found?
 
How will new studies attempt to find anything different from what Google found?
We have been hard at work for the last 2+ years to run exactly such a study, and I believe we found some new, profound results. Every system is different. You can't just say that human beings are flawed so there's no way to build a successful collaboration between human and machine. It's hard, but not impossible. Good engineering solves exactly such problems.
 
We have been hard at work for the last 2+ years to run exactly such a study, and I believe we found some new, profound results. Every system is different. You can't just say that human beings are flawed so there's no way to build a successful collaboration between human and machine. It's hard, but not impossible. Good engineering solves exactly such problems.

A fully autonomous highway, in which every vehicle is autonomous and all are communicating with each other, is not that tough a problem to crack. Airliners have had such a system, TCAS, since the early 1990s. Small planes still aren't in the network, but the larger aircraft are all talking to one another in the tightest airspaces around airports.

Airliners are a fairly limited number of vehicles, and they are only in danger of running into each other in limited spaces; when they are at risk of collision, there are three dimensions available for moving the planes apart. Any highway solution has to be approached knowing the first entrants will operate in a space where the vast majority of other vehicles are controlled by humans, and considering how long cars stay on the road, autonomous cars will need to deal with human-driven cars in all situations for at least a decade or more.

It's the transition from all manually driven vehicles to self-driving that is the toughest thing to code. As it appears your study found, many users tend to get a false sense of security, assuming the technology is better than it actually is. It appears the accident in the Bay Area a couple of weeks ago was caused by a driver not prepared to take over when the car got confused. Same with the Uber car that killed someone a few weeks back.

I've heard there are now Autopilot defeat devices that make Autopilot think your hands are on the wheel when they aren't. AP can be annoying with the nags, but they are there because people let their attention drift. I'd be interested to hear your ideas on how to keep drivers' attention on the road while on AP without annoying them so much that they figure out how to shut off the warnings.
 
Completed 17 MS75D w/AP2.5 firmware 10.4
 
I did the survey and later realized I should have suggested a possible improvement.
Better late than never.
You had us choose what made us disengage autopilot. More options should be provided. For me, driving on the freeway, here is why I disengage, in rough order of decreasing frequency.
  • Need to take off ramp.
  • Need to change lanes in heavy traffic, so want to control my speed to seek a gap between cars in the next lane
  • A car pulls in front of me going slower than me. When the car is close, I don’t wait to see if AP will brake, I do it manually.
  • Passing a large truck and I want to be on the left side of my lane to keep my distance. Tesla has stated it does this automatically, but many times it doesn’t.
  • I approach a construction zone.
  • My car gets too close to the edge of my lane.
 
As it appears your study found, many users tend to get a false sense of security, assuming the technology is better than it actually is.
No, that is the conventional thinking; I never said this. If I had found results that merely confirmed it, I wouldn't say the findings are exciting. The results we found are very interesting and counter-intuitive. Stay tuned for their release. I am passionate about saving lives, and sometimes that may require learning how to step toward danger and build systems that successfully manage it, as opposed to running away from it. Successful human-machine collaboration is difficult, but I believe it can be done well.
 
Where did you get the 300,000 miles of video footage from? Did Tesla provide it?
We instrumented 21 Teslas (and growing) for 1 to 2+ years of video recording in each car. Watch the video in the first post for details. Tesla doesn't have this kind of data, because it's not just the external scene, but also two HD cameras on the driver face and driver body.
 
I tried filling out this survey, but found it frustrating because TACC isn't acknowledged.

The Survey needs to clearly distinguish between TACC, and AP.

TACC is absolutely wonderful because it reduces the stress of driving while also being extremely predictable. I use it the vast majority of the time that I'm on a highway/freeway, and I can't imagine going without it. I tend to use a following setting of 3 to 5, even though it results in a larger following distance than I'd normally drive with. I do this to give myself time to react if it doesn't respond how I expect; I'm convinced this makes me safer, as it doubles up on safety. I'm also well aware that the radar system (the radar plus how the data is processed) filters out stopped objects, so I won't be running into any fire trucks anytime soon.
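To make the reaction-buffer argument concrete, here is a minimal sketch of the arithmetic. Note that Tesla does not publish how its follow settings map to a time gap in seconds, so the gap values below are purely illustrative assumptions:

```python
def following_distance_m(speed_kmh: float, time_gap_s: float) -> float:
    """Distance traveled during the chosen time gap, in meters.

    Tesla does not publish how its follow settings (1-7) map to
    seconds, so any time_gap_s passed in here is an assumption.
    """
    return (speed_kmh / 3.6) * time_gap_s

# At 105 km/h (~65 mph), a 1.0 s gap is roughly 29 m of buffer,
# while a 2.0 s gap is roughly 58 m: a longer follow setting
# doubles the distance (and time) available to brake manually.
short_gap = following_distance_m(105, 1.0)
long_gap = following_distance_m(105, 2.0)
```

The point of the sketch is simply that the buffer scales linearly with both speed and gap, which is why a higher follow setting matters most at highway speeds.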

AP is one of those technologies in their infancies that needs to be treated with extreme caution.

It's extremely problematic for a lot of reasons.

Technical and UI issues:
1.) Tesla doesn't tell us how AP map tiles work or whether the map tiles are loaded. They don't tell us whether the AP map tiles cache gets cleared or not. I'm of the opinion that the whole HD maps thing is a complete joke.
2.) All the indication of AP status/warnings is on the IC, but my eyes are on the road.
3.) The entire hand sensing/wheel torque thing doesn't work very well.
4.) The blind-spot monitoring doesn't work, which means AP can't adjust its position in the lane according to the cars next to you.
5.) It doesn't know anything about potholes and can't detect debris in the road.
6.) It doesn't care about a massive rut in the road; it just wants to travel in the center of the lane regardless.
7.) It is constantly adjusting; I can feel it constantly changing the steering. The Tesla engineers seem to have no idea about hysteresis. It's like a PID loop being corrected on a very tight cycle, which isn't how people drive.
8.) It still sometimes will dive for the exit because it follows the lines.
9.) If it loses a line, it will hunt. It can't just leave things alone for a few seconds to let the lines come back.
10.) It was prone to "truck lust" (drifting toward adjacent trucks), though not all the time; maybe 1 in 200 trucks. It was especially annoying on I-5 between Seattle and Portland, where trucks essentially own the road because there are so many of them.
11.) The entire AEB/FCW thing is a complete joke. Even a Subaru has a better system. I think it's important to back up some advanced assistive driving technology with an equally sophisticated AEB/FCW warning. The technology needs to consider the user.
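The steering complaint in point 7 is classic control-loop "hunting." As a rough illustration only (this is not Tesla's actual controller, which is unpublished, and the gain/deadband values are invented), a deadband is one standard way to keep a proportional lateral controller from chasing every few centimeters of drift:

```python
def steering_correction(error_m: float, deadband_m: float = 0.15,
                        gain: float = 0.5) -> float:
    """Proportional steering correction with a deadband.

    error_m is the lateral offset from lane center in meters; the
    returned value is a dimensionless steering command. All constants
    here are invented for illustration.
    """
    if abs(error_m) < deadband_m:
        return 0.0          # small drift: leave the wheel alone
    return -gain * error_m  # larger drift: correct proportionally

# A 5 cm offset produces no correction; a 40 cm offset does.
```

A true hysteresis scheme would also remember whether a correction is already in progress, but even this stateless deadband illustrates the idea: tolerate small errors instead of micro-correcting on every control cycle.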

Human Issues
1.) I used AP a lot starting with 7.0, but I found through experimentation that I lost situational awareness while driving with AP. Not having to steer meant that I wasn't as engaged with the actual driving, so my mind used it as an excuse to wander.
2.) It was easier to convince myself to text while driving, because hey, AP is on, so it's less risky. I admit that despite the fact that I'm extremely opposed to texting while driving.
3.) People are still prone to using it despite bad experiences previously. There always seems to be the urge of "Well, maybe I should give it a chance" or "I wonder how well this update will do".

Media issues
1.) People perceive AP as a Tesla thing while failing to recognize that lots of cars have adaptive cruise control plus lane steering. This means many of the same technical/human issues will be present in them. They won't get noticed, though, because no one pays attention when an MB/Audi/BMW crashes.
2.) The obsession over the name Autopilot. I don't think critics realize what percentage of Tesla owners clearly know that AP isn't a self-driving system yet.
 
  • Need to take off ramp.
  • Need to change lanes in heavy traffic, so want to control my speed to seek a gap between cars in the next lane
  • A car pulls in front of me going slower than me. When the car is close, I don’t wait to see if AP will brake, I do it manually.
  • Passing a large truck and I want to be on the left side of my lane to keep my distance. Tesla has stated it does this automatically, but many times it doesn’t.
  • I approach a construction zone.
  • My car gets too close to the edge of my lane.

This is great, thanks. I added some more options, but I believe the ones you listed are mostly already covered. Here's how I would align them; actual options are in bold:
  • Need to exit highway or turn at intersection
    Need to take off ramp.
  • Lane change (manual) <-- new one added thanks to your suggestions
    Need to change lanes in heavy traffic, so want to control my speed to seek a gap between cars in the next lane
  • Leading vehicle braking, moving slower, or stopped
    A car pulls in front of me going slower than me. When the car is close, I don’t wait to see if AP will brake, I do it manually.
  • Autopilot drifts too close to other cars
    Passing a large truck and I want to be on the left side of my lane to keep my distance. Tesla has stated it does this automatically, but many times it doesn’t.
  • Approaching construction zone <-- new one added thanks to your suggestions
    I approach a construction zone.
  • Autopilot starts drifting out of the lane
    My car gets too close to the edge of my lane.
Let me know if the above matches could be made better.
 