
Estimated Tesla Autopilot miles reaches 1.3 billion

Discussion in 'Autonomous Vehicles' started by Lex_MIT, Jul 27, 2018.

  1. Lex_MIT

    Lex_MIT Member

    Joined:
    Jun 8, 2016
    Messages:
    33
    Location:
    Cambridge, MA
    • Like x 5
    • Love x 2
    • Informative x 1
  2. strangecosmos

    strangecosmos Non-Member

    Joined:
    May 10, 2017
    Messages:
    696
    Location:
    Koopa Kingdom
    #2 strangecosmos, Jul 29, 2018
    Last edited: Jul 29, 2018
    Waymo full self-driving miles vs. Tesla Enhanced Autopilot (i.e. Hardware 2+) miles:

    [Image: chart comparing cumulative Waymo self-driving miles to Tesla Enhanced Autopilot miles]

    Obviously not an apples-to-apples comparison, but interesting nonetheless. The gap will continue to widen as 1) Model 3 production continues to ramp and 2) Enhanced Autopilot gets better and more Tesla owners use it.

    I am curious how valuable the Enhanced Autopilot data is compared to the Waymo data. If the main bottleneck, or one of the major bottlenecks, in self-driving car development is getting deep neural networks to correctly perceive stuff in diverse driving environments with superhuman accuracy and reliability, then it seems like maybe just collecting passive sensor data is enough. In that case, the 1.6 billion total miles driven in Hardware 2 Teslas is the relevant figure, and it’s way beyond anything Waymo is capable of doing.

    In terms of actually getting data on the performance of driving tasks like lane keeping, Enhanced Autopilot miles are the relevant figure. But this is quite limited in scope relative to full self-driving tests.

    Andrej Karpathy says that the Software 2.0 “coding” is done by people labelling the sensor data collected by cars. If this is true, it stands to reason that Tesla will leap far ahead of Waymo, since it can collect 200x as much driving data and that gap is growing fast. If it really is all about collecting and labelling passive sensor data, I don’t see how a company with two or three orders of magnitude more data than the runner-up doesn’t eclipse everybody else. This is a radical yet seemingly logical and straightforward conclusion. Can anyone tell me why it might not be true?
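
    As a concrete (and entirely hypothetical) illustration of that labelling-then-training loop, here is a minimal sketch in PyTorch; the frame sizes, class count and tiny model are placeholders I made up, not anything Tesla has published:

    Code:
    import torch
    import torch.nn as nn

    # Pretend batch of 8 labelled camera frames (3x64x64) with 5 object classes.
    frames = torch.randn(8, 3, 64, 64)   # sensor data collected by the fleet
    labels = torch.randint(0, 5, (8,))   # labels supplied by human annotators

    model = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(16, 5),
    )
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()

    # The "Software 2.0 coding": human labels plus gradient descent write the weights.
    for step in range(10):
        loss = loss_fn(model(frames), labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    The point being that the scarce ingredient in this loop is labelled data, which is where a large fleet would, in principle, help.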

    I guess one potential answer is that it is not true if perception is not a major bottleneck, and the main bottleneck is instead something that requires test vehicles driving around in full self-driving mode. In that case, Tesla’s production cars don’t matter and only its secretive internal test cars make a difference. Is there any compelling argument for this point of view?

    I guess here is the broader philosophical question. How much of the remaining progress on self-driving cars is in perception and how much is in action? If Waymo or Tesla discovered a magic neural network that could do perception perfectly, could they immediately commercialize Level 5 autonomy?
     
    • Like x 3
    • Helpful x 1
  3. Bladerskb

    Bladerskb Like how many times do i have to be right?

    Joined:
    Oct 24, 2016
    Messages:
    1,070
    Location:
    Michigan
    Hey Trent, I think I have addressed this a lot, but there are several things wrong here.
    One is that perception is NOT the bottleneck in SDC. Mobileye, for example, has already solved perception, per se.
    Second is that less than 1% of AP2 raw data is actually being sent to Tesla HQ. So there are no billions of miles of data.
    It's simply a MYTH. This has been confirmed dozens of times by @verygreen but no one wants to listen.

    If I take a Tesla and drive it to and from work (30 miles), NONE of the camera data is sent to Tesla. None. Zero. Nil. Zip.
    I don't know how else to explain this so people will understand. Every single Tesla hacker has confirmed this, and it also matches up with Tesla owners' data uploads.

    Continuing to proclaim billions of miles of data is simply ignoring the truth. Out of the dozens of SDC companies that submitted disengagement reports, only 4 averaged more than 45 miles between disengagements for 2017.

    Think about this: if a car has to disengage every mile, if an engineer needs to take over every mile, then going 1 billion miles is pointless, because you have a new bug to investigate/fix/retest every mile. The bugs you already have preoccupy you, so the SDC problem is not a data bottleneck. Cruise's CEO already came out and said they are in no way data starved. It doesn't make any sense. Not logical at all.

    Perception bottleneck?

    Take for example Zoox's disengagement report (1 of 14 disengagements being related to perception):

    [Image: excerpt from Zoox's 2017 disengagement report]

    Another to check out is Drive.ai (19 of 151 disengagements were perception related):

    https://www.dmv.ca.gov/portal/wcm/connect/f0eaad15-bb19-425b-a4ba-a08b095beac3/Drive.ai.pdf?MOD=AJPERES

    or Cruise (13 of 105 disengagements were perception related).

    Around 10%, give or take, is perception related. The actual bottlenecks are in planning (driving policy), hardware, and software. Mobileye aims to eliminate most of this bottleneck by keeping their camera-based software and their lidar-based software separate.
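
    Quick sanity check on those numbers (just re-computing the fractions quoted above):

    Code:
    # Perception-related share of 2017 disengagements, per the reports cited above.
    reports = {
        "Zoox":     (1, 14),
        "Drive.ai": (19, 151),
        "Cruise":   (13, 105),
    }
    for name, (perception, total) in reports.items():
        print(f"{name}: {perception}/{total} = {perception / total:.1%} perception-related")
    # Zoox ~7.1%, Drive.ai ~12.6%, Cruise ~12.4% -- i.e. roughly 10% give or take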
     
    • Disagree x 2
    • Informative x 1
  4. Phrixotrichus

    Phrixotrichus Member

    Joined:
    Jul 31, 2017
    Messages:
    532
    Location:
    Germany
    • Love x 1
  5. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    1,878
    Location:
    TN
    1%? wtf, how do you even calculate it? When I drive for 20 minutes and the result is to send 10 frames from 3 cameras plus the trip log, I guess it qualifies as under 1%, but still...
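
    Quick back-of-envelope version of that, with the per-camera frame rate being a guess on my part rather than a known figure:

    Code:
    # Rough fraction of camera frames that actually leave the car on one 20-minute drive.
    fps = 30                 # assumed frames per second per camera (my guess)
    minutes = 20
    cameras = 3
    frames_captured = fps * 60 * minutes * cameras   # ~108,000 frames processed on the car
    frames_uploaded = 10 * cameras                   # "10 frames from 3 cameras", read as 10 per camera
    print(f"{frames_uploaded / frames_captured:.3%} of frames uploaded")  # ~0.028%

    So even a generous reading lands well under 1%.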

    Of course proponents would quickly tell you this is so the car only sends "interesting" data back, but it still processes the camera input 100% of the time, therefore offloading some of the initial processing to the "front end", though I am not sure it's a particularly good argument (see below).

    This is not really true. Some camera data might get sent. In addition, non-camera data will be sent for sure.
    See the Autopilot trip log; this gets sent all the time. Camera calibrations are sent every few minutes (what for? I have no idea! but it's camera-derived).

    Anyway, returning to the camera data question. Camera data might get sent to Tesla on your trip to/from work IF all of the following hold (sketched in code below):
    - you have it enabled in privacy settings (obviously)
    - triggers were sent to your car (or you hit one of the hardcoded triggers like "I am in a crash" or "silent rob" or some others like that)
    - one of the trigger conditions matches and happens to include camera data as one of the outputs
    - Tesla would not reject the output right away
    - your car will be on wifi before the expiration date specified for the trigger.
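
    Put another way, here is the gate as I understand it; the function and parameter names are just made up for illustration, only the logic mirrors the list above:

    Code:
    def camera_data_may_upload(
        privacy_sharing_enabled: bool,   # enabled in privacy settings
        trigger_on_car: bool,            # trigger pushed to the car, or a hardcoded one (crash, "silent rob", ...)
        condition_matched: bool,         # a trigger condition actually fired
        camera_in_outputs: bool,         # the matched trigger asks for camera data in its outputs
        accepted_by_backend: bool,       # Tesla doesn't reject the output right away
        on_wifi_before_expiry: bool,     # car gets on wifi before the trigger's expiration date
    ) -> bool:
        """Camera data goes out only if every condition holds."""
        return all([
            privacy_sharing_enabled,
            trigger_on_car,
            condition_matched,
            camera_in_outputs,
            accepted_by_backend,
            on_wifi_before_expiry,
        ])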

    It should be noted that the entire internal Autopilot state is not actually sent back (you would think they could reconstruct it from the data being sent, but no, they don't always include enough data for that).
    But it's clear they use those uploads to gather some more training data for one of their new hires to process. And I am hearing they've been increasing their data gathering efforts lately.
    They also use this for debugging of sorts too, i.e. fishing for false positives in NNs and such.

    Some of the triggers might only activate when you are in autopilot mode, but many others would activate no matter what mode you are in.
     
    • Informative x 3
    • Helpful x 2
    • Like x 2
  6. boonedocks

    boonedocks Active Member

    Joined:
    May 1, 2015
    Messages:
    1,286
    Location:
    Gainesville, GA
    I am not sure what is being uploaded, but every day when I arrive home (and remember to check), my car uploads a tremendous amount of data. Somebody is gathering a LOT of data about my commute. \o/
     
    • Informative x 1
    • Funny x 1
  7. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    1,878
    Location:
    TN
    There's no guarantee any of that is camera data. There are triggers that output non-camera data and generate quite a bit of data too.
     
    • Helpful x 3
    • Informative x 1
  8. boonedocks

    boonedocks Active Member

    Joined:
    May 1, 2015
    Messages:
    1,286
    Location:
    Gainesville, GA


    I have no idea what is being uploaded. I was just responding to @Phrixotrichus' statement that "Since the continuous data upload is simply a myth this counter is bs". Something quite large is being uploaded after every drive for me.
     
    • Informative x 1
  9. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    1,878
    Location:
    TN
    I guess "continuous" is a broad enough term to be ambiguous too, which does not help.

    A bit of upload after every drive is not the same as continuous uploads all the time while you are driving.
     
    • Like x 1
  10. strangecosmos

    strangecosmos Non-Member

    Joined:
    May 10, 2017
    Messages:
    696
    Location:
    Koopa Kingdom
    I don’t think any expert in the field, including Amnon Shashua or anyone else at Mobileye, would say that perception is solved. Please correct me if I’m wrong.

    Other aspects of self-driving cars like path planning might be responsible for more disengagements than perception, but that doesn’t mean perception is solved.
     
    • Like x 2
    • Informative x 1
  11. Tezlanian

    Tezlanian Member

    Joined:
    Apr 24, 2018
    Messages:
    214
    Location:
    North America
    Perception is far from solved. The criteria for what counts as acceptable perception change depending on what needs to be acted upon in a given scenario.
     
  12. Bladerskb

    Bladerskb Like how many times do i have to be right?

    Joined:
    Oct 24, 2016
    Messages:
    1,070
    Location:
    Michigan
    #12 Bladerskb, Aug 17, 2018
    Last edited: Aug 17, 2018
    Amnon Shashua has said multiple times that it has. If you actually watched Mobileye presentations from 2014 till now, you would know they have since stopped talking about sensing. It's of no interest to them anymore. They have basically solved it for SDC. Nowadays, and for quite a while, they only give talks on their RSS (Responsibility-Sensitive Safety).

    First of all, the question of "solving vision" or "solving perception", as most people put it, is wrong. You don't need to solve vision to get SDC.
    If you were trying to create an AGI, then yeah. For an SDC you simply need a system that can reach a certain level of accuracy/verification.

    For example, Mobileye's EyeQ3 has one false positive in pedestrian detection every 400,000 miles, according to Amnon.

    In reference to sensing, aka solving vision for SDC, Amnon said:
    "When people think about sensing, they think about object detection, vehicles, pedestrians, traffic signs, lights, objects, etc. You receive an image as an input and your output is a bounding box...this is the easiest problem. this problem has been solved."

    In reference to the second level of sensing (semantic free space), where you can drive and where you can't drive:
    "this is already in production."

    In reference to the third level of sensing (drivable path), where your input is an image and the output is a story (for example, "this lane is ending in 15 meters", etc.):

    Amnon says it's an open problem in the industry that is solved by REM maps.

    So yes, vision for SDC is SOLVED.

    To see how far ahead Mobileye is with respect to sensing, check out for example the bounding box accuracy of Zoox's sensing system.
    Notice how inaccurate and jumpy the detection is?

    You can see this by simply going to Zoox's website. They have a video on their home page:

    https://zoox.com/wp-content/uploads/2018/07/Vision_Video_graded-Window-3.mp4

    Note the looser bounding boxes.


    Now compare that to the tight and accurate 3D bounding boxes of Mobileye's EyeQ4.
    The difference in accuracy is night and day. It's not even comparable.

    [Image: Mobileye EyeQ4 3D bounding box output]


     
  13. malcolm

    malcolm Active Member

    Joined:
    Nov 12, 2006
    Messages:
    2,833
    And yet.....

    Intel’s Mobileye wants to dominate driverless cars—but there’s a problem

    There are questions about sensing...

    and there are questions about RSS

     
  14. Bladerskb

    Bladerskb Like how many times do i have to be right?

    Joined:
    Oct 24, 2016
    Messages:
    1,070
    Location:
    Michigan
    What does that have to do with Mobileye having solved vision for SDC?
    Mobileye's approach has always been completely different from the rest of the industry.
    Tesla has tried to copy them every step of the way.
     
    • Disagree x 1
  15. Bladerskb

    Bladerskb Like how many times do i have to be right?

    Joined:
    Oct 24, 2016
    Messages:
    1,070
    Location:
    Michigan
    Mobileye is the only one today actually delivering SDC tech to production. I firmly believe that if Mobileye didn't exist today, the entire car industry (including Tesla) would be in trouble.

    The fact that their EyeQ4 and REM Map are fully in production today is amazing. This is SDC tech with solved vision and mapping in production TODAY.

    One of the companies that will be using EyeQ4 is BMW. Their new 2019 BMW 3 Series (G20), officially unveiling at the Paris Auto Show in October, uses EyeQ4, a tri-focal camera configuration, 5 radars (4 corner, 1 front), 1 lidar, 1 driver-facing monitoring camera (similar to Super Cruise), and a REM map.

    [Images: BMW sensor configuration slides]




    @Trent Eady
    @malcolm
    @Snuffysasa
    @paraglide
    @croman
    @S4WRXTTCS

    @paraglide and @Snuffysasa, I know you both specifically asked which cars are using EyeQ4 in 2018. All 2019 BMWs are; VW and Nissan have cars that do as well, but I haven't researched them yet.

    Trent, in your next article, when EAP is released (which I know you will write), I hope you talk about these new 2 million cars that will be REM-mapping the entire road network for Mobileye. And also acknowledge that the 1 billion miles of data isn't really 1 billion miles of raw data, but that actually less than 1% of raw data is uploaded to Tesla HQ, as confirmed by @verygreen.

    Also, verygreen, I used the data you previously released in the camera thread to come up with the accurate 1% raw data upload figure.
     
  16. Snuffysasa

    Snuffysasa Member

    Joined:
    Feb 24, 2017
    Messages:
    208
    Location:
    Minneapolis
    #16 Snuffysasa, Aug 18, 2018
    Last edited: Aug 18, 2018

    Wow, where are these photos and this info about the BMW coming from?


    You are saying every single 2019 BMW car is going to have an EyeQ4? How do you know this?
     
  17. ItsNotAboutTheMoney

    ItsNotAboutTheMoney Well-Known Member

    Joined:
    Jul 12, 2012
    Messages:
    7,272
    Location:
    Maine
    Since all he did was post a counter that is intended to give an idea of how much AutoPilot is used, and since his research is focused on human interaction with driver-assistance systems, I fail to see the relevance of the data upload to the counter.
     
  18. croman

    croman Active Member

    Joined:
    Nov 21, 2016
    Messages:
    3,303
    Location:
    Chicago, IL
    @Bladerskb -- I've seen this show before, with the Audi "amazing world's first L3" using Mobileye's latest and greatest. Wake me when it ships and actually does what it says. It's all smoke and mirrors. BMW's lane keeping right now is COMPLETE GARBAGE. EyeQ5 couldn't fix that hot mess.
     
  19. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    1,878
    Location:
    TN
    Demoware is demoware. I wonder if it's possible to get raw data out of those boxes to double-check the results.

    I mean, I agree they send less than 1% of camera data, but 1% seems super generous since it's probably even less than 0.1% (unless we add a lot of other qualifiers).
     
    • Like x 1
  20. lunitiks

    lunitiks ˭ ˭ ***** ∆ ***** ˭ ˭

    Joined:
    Nov 19, 2016
    Messages:
    2,474
    Location:
    Prawn Island, VC
    This is pretty neat, though...

     
