
Beyond LIDAR and standard cameras: Time-Of-Flight cameras

Discussion in 'Autonomous Vehicles' started by KarenRei, Sep 9, 2017.

  1. KarenRei

    KarenRei

    Joined:
    Jul 18, 2017
    Messages:
    1,387
    Location:
    Iceland
    There was an interesting article posted on Seeking Alpha today arguing that cameras will win the "LIDAR vs cameras" debate for autonomous vehicles - but not in the way they're currently used:

    Tesla: Cameras Might Win The Autonomous Race After All - Tesla Motors (NASDAQ:TSLA) | Seeking Alpha

    In the autonomous space today, you have two very different philosophies - those based primarily around LIDAR (such as Google/Waymo) and those based around cameras (like Tesla). Versus cameras, LIDAR:

    * Is much more expensive (formerly ~$75k per unit, now ~$7.5k, per Google), versus low double-digits of dollars per camera
    * Requires a bulky, awkward spinning rig mounted to the top of the vehicle
    * Produces a very detailed, low-error model of the world around it (cameras are prone to misstitching problems)
    * May still require cameras for precise identification of what it detects

    LIDAR, however, appears to be evolving into a camera-based technology: Time-Of-Flight. In this technology, light is emitted in bright pulses across a broad area, and cameras record not (just) how much light they receive, but specifically when they receive it, with sub-nanosecond precision. There's no spinning rig, vertical resolution is greater, and most importantly, the hardware can be produced with the same sort of semiconductor manufacturing technology that makes cameras so cheap. The same sensors can also double as colour-imaging cameras for identification.
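    For concreteness (my own back-of-envelope sketch, not from the article), the distance math behind ToF is just the round-trip time of a light pulse, which is why sub-nanosecond timing matters:

    ```python
    # Illustrative time-of-flight range calculation.
    C = 299_792_458.0  # speed of light, m/s

    def tof_distance(round_trip_seconds: float) -> float:
        """Distance = c * t / 2, since the pulse travels out and back."""
        return C * round_trip_seconds / 2.0

    # A 10 m range corresponds to a ~66.7 ns round trip, and each 1 ns of
    # timing error is roughly 15 cm of range error.
    print(tof_distance(66.7e-9))  # ~10 m
    print(tof_distance(1e-9))    # ~0.15 m per nanosecond
    ```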

    In short, the article argues that cameras will win the day, but not the type Tesla is using, leaving it with a large liability for underdelivering on driving capabilities vs. competitors that spring up with ToF camera-based systems.

    It's an interesting argument, although I'm not entirely convinced, for a number of reasons.

    * What you really want is a fusion of 3d geometry and identification of what you're seeing. A line on a road or text on a sign or a lit brakelight or so forth has no detectable 3d geometry. Is that thing sticking out ahead some leaves on a tree or a metal beam? Is that a paper bag on the road or a rock? Etc. If in the future Tesla has to switch to ToF cameras for building 3d models of the world around them, they don't lose any of the progress that they've made based on a system where their 3d models are built with photogrammetry; they just map their imagery to better models.

    * Tesla's liability isn't so much for the cost of FSD as it is for the cost of hardware retrofits if the current hardware proves inadequate. Swapping out cameras is almost certainly much cheaper than the cost of refunding thousands of dollars. Any vehicles not swapped out still continue to work, just with the (potentially) poorer photogrammetry-based 3d modelling.

    * Unlike its competitors, Tesla's early start gives it reams of data collected from its vehicles' sensors in real-world environments - radar, ultrasound, and imagery. Even if competitors happen to choose a better 3d-mapping technology, they remain well behind in the size of their datasets. And data is critical: if you want to test a new version of your software, you can validate it against every drive in your dataset to ensure that it performs as intended.

    * We actually don't know Tesla's plans for what sorts of cameras they plan to incorporate and into what. Liabilities for upgrading existing MSs and MXs would be vastly lower than for, say, a couple million M3s.

    Tesla took a big hit with the Mobileye divorce, and is still playing catchup with AP2. And their insistence on using technology that was affordable to put on all vehicles without serious design compromises left them with no choice but cameras; traditional LIDAR has just been too expensive and awkward. But now that a potentially useful "upgrade" may be coming into play, will Tesla switch gears?
     
    • Informative x 3
  2. Tam

    Tam Active Member

    Joined:
    Nov 25, 2012
    Messages:
    2,869
    Location:
    Visalia, CA
    I generally don't like to read Seeking Alpha because it's like a short-TSLA cult that keeps losing money, and the more money it loses, the stronger its belief that now is the best time to short!

    I have no idea about Time-Of-Flight but of course, there are many promising technologies out there that look good in theory but the question is how to make it work for the public.
     
    • Like x 1
  3. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    1,169
    Location:
    TN
    you need to switch this to future tense, and also consider that a bunch of unclassified data is useless: you need lots of manpower to classify it (or lots of manpower to write a perfect AI to classify it for you, but then you won't need the data anymore).
     
    • Informative x 1
  4. lunitiks

    lunitiks ˭ ˭ ʽʽʽʽʽʽʽʽʽ ʭ ʼʼʼʼʼʼʼʼʼ ˭ ˭

    Joined:
    Nov 19, 2016
    Messages:
    1,709
    Location:
    Prawn Island, VC
    Some copy-paste info for those too lazy to google:
    Time-of-flight camera - Wikipedia
     
    • Informative x 1
  5. lunitiks

    lunitiks ˭ ˭ ʽʽʽʽʽʽʽʽʽ ʭ ʼʼʼʼʼʼʼʼʼ ˭ ˭

    Joined:
    Nov 19, 2016
    Messages:
    1,709
    Location:
    Prawn Island, VC
    • Informative x 1
  6. lunitiks

    lunitiks ˭ ˭ ʽʽʽʽʽʽʽʽʽ ʭ ʼʼʼʼʼʼʼʼʼ ˭ ˭

    Joined:
    Nov 19, 2016
    Messages:
    1,709
    Location:
    Prawn Island, VC
    #6 lunitiks, Sep 17, 2017
    Last edited: Sep 17, 2017
    Using Time of Flight Imaging (TOFi) for applications like hand gesture sensing and room mapping is fine. However, if you take a deeper look into the physics of light and time, you'll quickly realize that this technology is pretty useless for automotive vision.

    All papers, articles and youtube videos on this subject plainly state that ToFI measures distances (depth) by using c - the Speed Of Light. Nothing, and I mean nothing, travels faster than c. As shown in this diagram:


    Now, in nineteen bows and arrows, a white bearded German realized that light travels at a finite speed. C is actually constant, no matter how fast the light source is going. This means that time slows down, as opposed to speed up.


    A particular consequence of this is that all our astronauts traveling at near light speed above the world actually age slower than people down on Earth.

    When TOFi'ing your fingers using a steadycam, the light source is not moving relative to anything else and thus causes no problems. But the Planck Second you move your light emitting camera from - or towards - your subject, a time dilation and redshift error will immediately and completely change the ADAS computer's perception of reality. Not only does the deep network neurons fire at a slower rate: Your car will in fact get heavier.


    This is due to a principle called General Relativity (which btw is far too complex for anyone to really understand).

    The effects described above could in theory be demonstrated by aiming a telescope at a star or a black hole, whereby light deflected from stars behind it will seem distorted or gravitationally lensed ("parallax").

    So in conclusion, tofi looks good on the surface but will IMO not stand the test of time.

    There, my instant gratification monkey just stole 30 seconds of your time and 5 minutes of mine.
     
    • Funny x 2
    • Helpful x 1
    • Informative x 1
    • Like x 1
  7. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    1,169
    Location:
    TN
    I think this is about where one can safely stop reading ;)
     
    • Like x 1
  8. R.S

    R.S Member

    Joined:
    Mar 8, 2015
    Messages:
    702
    Location:
    Munich, Bavaria, Germany
    While I really like your post, the only equation you posted basically disproves your whole point. Just try plugging in real speeds: even at 200 mph the error due to time dilation is almost nonexistent, 0.000015%, and at 80 mph it's just 0.000006%. That kind of error is totally irrelevant when it comes to cars. That kind of error is totally irrelevant anywhere.

    And if your assumption were true, we couldn't use radar either, since it works on exactly the same principle. And don't get me started on sonar; we can't even say exactly how fast it will travel.
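    For anyone who wants to check the orders of magnitude themselves, here's a quick sketch (my own, not from the thread; the exact percentage depends on whether you quote v/c or the relativistic factor γ − 1, but both are vanishingly small at car speeds):

    ```python
    import math

    C = 299_792_458.0  # speed of light, m/s

    def mph_to_ms(mph: float) -> float:
        """Convert miles per hour to metres per second."""
        return mph * 0.44704

    def gamma_minus_one(v: float) -> float:
        """Fractional time-dilation factor (gamma - 1) at speed v."""
        return 1.0 / math.sqrt(1.0 - (v / C) ** 2) - 1.0

    for mph in (80, 200):
        v = mph_to_ms(mph)
        # v/c is ~1e-7; gamma - 1 is ~1e-14 -- utterly negligible for a car.
        print(f"{mph} mph: v/c = {v / C:.3e}, gamma - 1 = {gamma_minus_one(v):.3e}")
    ```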
     
  9. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    1,169
    Location:
    TN
    Whoosh....
     
    • Funny x 2
  10. RDoc

    RDoc S85D

    Joined:
    Aug 24, 2012
    Messages:
    1,821
    Location:
    Boston North Shore
    I think this is going to become a major issue, if not for Tesla, at least for SpaceX, once they upgrade the BFR to Totally Plaid mode.
     
