
Seeing the world in autopilot, part deux

Discussion in 'Autopilot & Autonomous/FSD' started by verygreen, Sep 25, 2018.

  1. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    2,387
    Location:
    TN
    That's debatable. Cars have this tendency of backing out too, and then have you never seen cars moving sideways? ;)

    You might have noticed different colors at the edges of driving spaces around objects. They seem to mean different things (one color per value, but we do not know what the values mean; perhaps some mean "this is a car front/side/whatever"?)

    The 2D bounding box might just be a debug aid for a different team for all we know, and not used by the actual driving algorithm, which works off the interpreted data in the drivable space. That's why I say the 3D bounding box, while neat eye candy, might have the same data expressed in other ways.
    There's radar return to tell you things and then of course there's relative speed.

    There is a huge difference on hilly roads. HUGE. Also, I have no way of extracting data like this from EyeQ3 ;)

    Well, instability is definitely there, no argument about it.

    Like I said in the comment, more information is available; I just did not want to update the old tool. There's still speed and direction and lane and overlap information and all that.

    The relative speed tells us if it's going in the same direction or not. Are there other ways to tell if the car is turning, like whether it stays within the turning-lane bounding lines?
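    To make the relative-speed point concrete, here's a toy sketch. The function name, signature, and threshold are all mine, not anything from the firmware; it only illustrates the arithmetic being discussed.

    ```python
    def travel_direction(ego_speed_mps: float, relative_speed_mps: float,
                         stopped_threshold: float = 0.5) -> str:
        """Classify a radar target's motion relative to the ego car.

        relative_speed is the target's speed along the ego axis minus the
        ego speed (positive = pulling away), the usual radar convention.
        """
        absolute_speed = ego_speed_mps + relative_speed_mps
        if abs(absolute_speed) < stopped_threshold:
            return "stopped"   # direction unknowable without orientation
        return "same" if absolute_speed > 0 else "oncoming"
    ```

    Note the "stopped" case: a target whose absolute speed lands near zero is exactly the one where you'd still need orientation, as pointed out below.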

    Just forget about the 2D bounding box, ok? Let's assume all the really useful info is reflected via the driving space border around the obstacle.

    Nah, not really. I need to know the direction of their movement, though.

    There's little indication any such planning is actually going on in this mostly current firmware, though, so this point is kind of moot.
     
    • Informative x 4
    • Like x 2
  2. AnxietyRanger

    AnxietyRanger Well-Known Member

    Joined:
    Aug 22, 2014
    Messages:
    9,451
    Location:
    EU
    You won't know that without orientation, though, if they are stopped.
     
  3. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    2,387
    Location:
    TN
    Yeah, the port is standard on all APE units, but currently it's not used on non-dev units. I suspect they won't open the port for the V9 dashcam but rather let you grab short clips (think 10 seconds?) on an as-needed basis; at least all the current developments point in this direction.

    In the part we are looking at, there's no space for additional details like that. But it's clear some road markings have other effects, e.g. an arrow on the pavement usually triggers the left-bending lines as if for a turning lane, even if there are no visible lane markings at all.
     
    • Informative x 5
  4. run-the-joules

    run-the-joules Active Member

    Joined:
    Aug 13, 2017
    Messages:
    2,510
    Location:
    SF Bay
    @verygreen I'm trying not to let this make me think there might be something to the claims of onramp-to-offramp coming soon-ish. I'm staying skeptical… but allowing a small bit of hope.
     
  5. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    2,387
    Location:
    TN
    Yeah, it works relatively well; it even properly recognizes driveway connections as drivable as they are approached.
     
    • Informative x 1
  6. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    2,387
    Location:
    TN
    Yeah, there are indications this has been extensively researched/tried/debugged for some time now.
     
    • Like x 1
  7. run-the-joules

    run-the-joules Active Member

    Joined:
    Aug 13, 2017
    Messages:
    2,510
    Location:
    SF Bay

    I suspect, based on watching some of this, that the green is used for "irrespective of legality, I could drive here", and a purple line seems to be somehow involved in rating the level of confidence in the border of a legal operation area, beyond which it may be technically safe to pull over in an emergency, even if you can't drive there normally:
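    If that reading is right, the decoded color semantics might look something like this. To be clear, this is pure speculation from watching the overlays; every value here is a guess.

    ```python
    # Speculative mapping of drivable-space border colors to meanings,
    # based purely on watching the visualizations.
    BORDER_MEANING = {
        "green":  "physically drivable, legality not considered",
        "purple": "edge of the legal operating area; beyond it may be "
                  "safe to pull over in an emergency, but not to drive",
    }

    def describe_border(color: str) -> str:
        """Look up a guessed meaning for a border color."""
        return BORDER_MEANING.get(color, "unknown / undecoded value")
    ```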


    [Attachments: Screen Shot 2018-09-25 at 1.23.21 PM.png, Screen Shot 2018-09-25 at 1.23.06 PM.png, Screen Shot 2018-09-25 at 1.23.00 PM.png]
     
    • Like x 7
  8. kdday

    kdday Member

    Joined:
    Oct 29, 2016
    Messages:
    719
    Location:
    AZ
    @verygreen can you comment on whether this type of AP vision is constantly running in our cars (i.e. "Shadow Mode"), or does it only run while AP is actually engaged? (I realize you probably manually triggered this recording to extract your videos.)

    I'm curious as to how legitimate "shadow mode" is in reality and how Tesla is evaluating AP against actual driver inputs and using that to improve driving.
     
    • Like x 1
  9. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    2,387
    Location:
    TN
    Yeah, the detections run 100% of the time. Apparently some of this state can be matched against the "triggers" to cause snapshots to be generated and sent back to Tesla. That said, I do not think it actually compares this model to the actual driving input at this time.
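    A rough sketch of the always-on-detections-plus-triggers flow described here. Only the control flow mirrors the description; the trigger conditions, buffer size, and snapshot format are entirely invented.

    ```python
    from collections import deque

    def shadow_mode_step(frame, detections, buffer, triggers, snapshots,
                         buffer_len=100):
        """Run detections every frame; emit a snapshot only when a trigger fires."""
        buffer.append((frame, detections))      # rolling recent history
        while len(buffer) > buffer_len:
            buffer.popleft()
        for trigger in triggers:
            if trigger(detections):             # e.g. hard braking, a cut-in
                snapshots.append(list(buffer))  # ship recent context home
                buffer.clear()
                break
    ```

    The point of the sketch: perception runs regardless, and the triggers only decide what gets uploaded, which matches "detections run 100% of the time" without implying any comparison against driver input.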
     
    • Informative x 3
    • Love x 2
  10. run-the-joules

    run-the-joules Active Member

    Joined:
    Aug 13, 2017
    Messages:
    2,510
    Location:
    SF Bay
    #50 run-the-joules, Sep 25, 2018
    Last edited: Sep 25, 2018

    Purely anecdotal but I figure this is as interesting of a place to note it as anywhere else: The other day I was driving and saw a light at the upcoming intersection was yellow. Out of an abundance of caution (I didn't see it change and didn't know how long it'd been) I went max-effort braking and stopped a few feet over the line (while the light was still yellow, hilariously. Overcaution isn't always good!), then backed up to be a good boy and be where I should have been. That night, I had significant upload activity to the Hermes-snapshot host at Tesla. Coincidence? Maybe.

    Also find it interesting that they named a host after the patron god of the roads and travelers :p
     
    • Informative x 1
    • Love x 1
  11. pyraca

    pyraca Member

    Joined:
    Mar 3, 2018
    Messages:
    17
    Location:
    San Jose, CA
    I've had my S for 8 months now and seen AP2.5 improve over that timeline. In the video ... it's interesting how a car ahead is shown changing from "my-lane" to "overlap left" to "Imm-Left". The latter transition happens when the car enters the "Imm Left" lane.

    Autopilot's reaction is always to wait until the lane is clear before accelerating ... almost always doing it "late" ... I'd probably accelerate some amount of time before a car ahead moves from "overlap left" to "imm left" ... that is, once the intent is clearly known and a path to pass is clearly opening up.
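    The observed vs. preferred behaviour could be written down as a toy policy. The state names come from the overlay in the video; the decision logic is just my illustration, not Tesla's.

    ```python
    # Lane-assignment states visible in the overlay, in order of a
    # leftward lane change by the lead car.
    LANE_STATES = ["my-lane", "overlap-left", "imm-left"]

    def may_accelerate(lead_car_state: str, eager: bool = False) -> bool:
        """Decide whether the path ahead counts as clear.

        The observed behaviour waits until the lead car is fully in the
        adjacent lane; an 'eager' human driver might treat 'overlap-left'
        as good enough once the intent is clear.
        """
        if lead_car_state == "imm-left":
            return True
        if eager and lead_car_state == "overlap-left":
            return True
        return False
    ```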
     
    • Like x 1
  12. run-the-joules

    run-the-joules Active Member

    Joined:
    Aug 13, 2017
    Messages:
    2,510
    Location:
    SF Bay
    Yeah, I found the transitional state being recognized to be interesting as well. I think it'll be a while before overlap is accepted as a state where the car can accelerate, but still cool to see.
     
  13. Snuffysasa

    Snuffysasa Member

    Joined:
    Feb 24, 2017
    Messages:
    267
    Location:
    Minneapolis
    We don't know if the data that has been tapped into is the final stage of perception processing. Or do we?

    This data could be unstable because it is the raw output from the detection / NN system before it is processed by tracking / prediction algorithms. That's what it looks like to me.
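    For illustration, here's the kind of smoothing a downstream tracking stage would apply to shaky per-frame measurements. A simple exponential moving average stands in for a real tracker (e.g. a Kalman filter); the function and parameters are mine, purely to show the idea.

    ```python
    def smooth(raw_positions, alpha=0.3):
        """Blend each new measurement with the running estimate.

        Low alpha trusts the history more, suppressing frame-to-frame
        jitter at the cost of lagging behind real motion.
        """
        estimate = raw_positions[0]
        out = [estimate]
        for z in raw_positions[1:]:
            estimate = alpha * z + (1 - alpha) * estimate
            out.append(estimate)
        return out
    ```

    Fed a raw signal that jumps from 0 to 10, the output ramps up gradually instead, which is exactly the visual difference between raw NN output and tracked output.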
     
    • Helpful x 1
  14. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    2,387
    Location:
    TN
    Did you see the IC display? Similarly chaotic, so it does appear to be the final version. The data is clearly already processed by other systems; see the predicted-path thingie, which must be dependent on the earlier detected lanes.
     
    • Informative x 1
  15. Snuffysasa

    Snuffysasa Member

    Joined:
    Feb 24, 2017
    Messages:
    267
    Location:
    Minneapolis
    IC display?

    I realize the predicted path must be dependent on the lanes,

    but is it not possible that the data stream it saves includes a mix of outputs: some early/raw, some final processed, and some in between?


    It seems unlikely to me that Tesla AP does steering control based on the lanes and projected path in this video; it's way too shaky. I feel there has got to be more processing of this data downstream, even before the lower-level control software. So I'd guess the same is true for the object data.
     
  16. Icepicknz

    Icepicknz Member

    Joined:
    May 19, 2018
    Messages:
    46
    Location:
    Auckland new zealand
    @verygreen any way to see or obtain this data ourselves? Are you looking for any volunteers down under in New Zealand?
     
  17. jimmy_d

    jimmy_d Deep Learning Dork

    Joined:
    Jan 26, 2016
    Messages:
    412
    Location:
    San Francisco, CA
    • Like x 5
    • Love x 2
  18. gearchruncher

    gearchruncher Member

    Joined:
    Sep 20, 2016
    Messages:
    722
    Location:
    Seattle, WA
    It's kind of interesting to me that they have all this information about drivable areas, yet the current system won't let you change lanes into or out of an HOV lane in Washington, because we use a solid white line, and AP treats those as the edge of the road, no matter what. It's amazing that they can tell the difference between dashed and solid lines, but disappointing that they can't tell the difference between a white and a yellow line, or that an HOV lane is an acceptable place to go since it's fully bounded by two lane lines. Pre-2018.10 I could go in/out of HOV lanes but it knew the outer side of the HOV lane was the edge of the road, so this is a regression.
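    The kind of rule being asked for might look like this toy gate. Everything here is illustrative (names, parameters, and the legality assumptions); it just shows how line type, line colour, and lane boundedness could combine, rather than treating every solid line as a road edge.

    ```python
    def lane_change_allowed(line_type: str, line_color: str,
                            target_lane_bounded: bool) -> bool:
        """Toy lane-change gate combining line style and lane context."""
        if line_type == "dashed":
            return True
        # Solid white HOV boundary: legal to cross in some regions,
        # provided the target lane is a real, fully bounded lane rather
        # than the shoulder.
        if line_type == "solid" and line_color == "white":
            return target_lane_bounded
        return False  # solid yellow / anything unrecognized: stay put
    ```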
     
  19. gearchruncher

    gearchruncher Member

    Joined:
    Sep 20, 2016
    Messages:
    722
    Location:
    Seattle, WA
    At 0:13 to 0:18, is that video showing the barrier as a drive-able area?
     
  20. Snuffysasa

    Snuffysasa Member

    Joined:
    Feb 24, 2017
    Messages:
    267
    Location:
    Minneapolis
    Perhaps they are using map data to ensure that they do not allow crossing into lanes like this.

    If they are to enable automatic lane changing... they will need to implement this capability anyway, likely by using map data.
     
