
Seeing the world in autopilot, part deux

Discussion in 'Autonomous Vehicles' started by verygreen, Sep 25, 2018.

  1. Bladerskb

    Bladerskb Senior Software Engineer

    Joined:
    Oct 24, 2016
    Messages:
    1,670
    Location:
    Michigan
    #61 Bladerskb, Sep 25, 2018
    Last edited: Sep 25, 2018
    Are you talking about the red swizzle line? That's the semantic free space, showing you that there is an object at its boundary.

    The bounding box is just a representation of the actual detection data; the actual data, the vectors, are what get used in a self-driving system.
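    As a rough illustration of that distinction (the names and fields below are assumptions for the sketch, not Mobileye's or Tesla's actual data model), the free-space boundary and the per-object detections could be carried as two separate outputs of a vision stack:

    # Sketch only: "semantic free space" as a labelled drivable-area boundary,
    # versus per-object detections carrying the state vectors a planner consumes.
    from dataclasses import dataclass
    from enum import Enum, auto

    class BoundaryLabel(Enum):
        CURB = auto()
        VEHICLE = auto()
        PEDESTRIAN = auto()
        GUARDRAIL = auto()

    @dataclass
    class FreeSpacePoint:
        angle_deg: float        # bearing from the ego car
        range_m: float          # distance to where drivable space ends
        label: BoundaryLabel    # what is sitting at that boundary

    @dataclass
    class ObjectDetection:
        # The "actual data": a state vector, not the box drawn on screen.
        x_m: float
        y_m: float
        vx_mps: float
        vy_mps: float
        width_m: float
        length_m: float

    # The red boundary in the visualization would correspond to a list of
    # FreeSpacePoint, while each colored box corresponds to an ObjectDetection.
    frame = {
        "free_space": [FreeSpacePoint(-10.0, 42.0, BoundaryLabel.VEHICLE)],
        "objects": [ObjectDetection(40.0, -1.8, -3.0, 0.0, 1.9, 4.6)],
    }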

    Mobileye is able to create an accurate 3D environmental model because of its 3DVD system.

    26 mins 45 seconds


    Yeah, but you are trying to compare 2014 tech with 2018 tech.
    That in and of itself speaks volumes. Especially the fact that, based on your analysis and on performance comparisons by other Tesla owners, Tesla hasn't surpassed the four-year-old EyeQ3. That is alarming.

    But we will see when v9 drops. Hopefully you can get your hands on it.

    But what you posted is where the rest of the industry is, other than Mobileye. For example, look at the Nvidia Drive platform, which is at a similar level to Tesla (and although Nvidia's 3DVD is a poor man's, less accurate version compared to ME's, at least it has one).
    Mobileye seems to me to be light years ahead of everyone else in computer vision.
     
  2. BigD0g

    BigD0g Active Member

    Joined:
    Jan 12, 2017
    Messages:
    1,874
    Location:
    Somewhere
    #62 BigD0g, Sep 25, 2018
    Last edited: Sep 25, 2018
    It’s possible this is what KP is referring to as the “hard” solution, and it’s being trained now; these aspects are the data gathering and labeling needed to build out the NN stack and decision engines, vs. the “easy” approach that is basically if/then statements. I suspect they are currently running on the if/then statements, with the networks telling them the lanes available and GPS localization data telling them which actions are permitted (no lane changes on local roads, for example).

    The 2.0 stack KP refers to doesn’t just need to learn the drivable space; it would also need to learn all the rules of the road and where it can drive within that space...

    No, I suspect this riddle will be solved with a bit of both approaches, with HW3 interpreting more data for decisioning and feeding more accurate, better-defined if/then statements.
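    As a rough sketch of that speculated "easy" if/then approach (every name and rule below is hypothetical, not Autopilot internals), the gate might look something like this:

    # Sketch only: vision reports which lanes exist, a coarse map keyed off GPS
    # reports what is permitted, and a few hard-coded rules combine the two.
    from dataclasses import dataclass

    @dataclass
    class VisionLanes:
        left_lane_exists: bool
        right_lane_exists: bool

    @dataclass
    class MapPermissions:          # looked up from GPS position, not learned
        lane_changes_allowed: bool # e.g. False on local roads
        is_controlled_access: bool

    def lane_change_decision(direction: str, lanes: VisionLanes,
                             perms: MapPermissions, driver_confirmed: bool) -> bool:
        # Rule 1: the map (via GPS localization) must permit lane changes here.
        if not (perms.lane_changes_allowed and perms.is_controlled_access):
            return False
        # Rule 2: vision must actually see a lane on that side.
        if direction == "left" and not lanes.left_lane_exists:
            return False
        if direction == "right" and not lanes.right_lane_exists:
            return False
        # Rule 3: the driver confirms, as EAP required at the time.
        return driver_confirmed

    print(lane_change_decision("left",
                               VisionLanes(True, False),
                               MapPermissions(True, True),
                               driver_confirmed=True))   # True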
     
    • Informative x 2
    • Like x 1
  3. gearchruncher

    gearchruncher Member

    Joined:
    Sep 20, 2016
    Messages:
    722
    Location:
    Seattle, WA
    This is not map data. It always changes about 3 seconds down the road after it goes dashed. One of the roads is an express lane that changes direction in the middle of the day, and depending which way you go, the lane is dashed on your left or on your right. It always gets this, where a map would be totally lost. It also works in tunnels where GPS is dead.

    If they could get this much data so perfect about lane lines in maps, why can't they get speed limits right?

    They don't need maps to know exact lane lines to enable lane changing. All they do is get into the rightmost lane 1 mile before the exit, and then follow the exit off when it's time. You don't need a map to pick a lane either.
     
  4. S4WRXTTCS

    S4WRXTTCS Active Member

    Joined:
    May 3, 2015
    Messages:
    3,998
    Location:
    Snohomish, WA
    #64 S4WRXTTCS, Sep 25, 2018
    Last edited: Sep 25, 2018
    Yeah, my Mobileye-based 3DVD performs awesomely in my brand new car.

    Oh, wait.. Crap.. There isn't a car with it or anything close to it, and I had to get another Tesla because it's still the only game in town. Your much-loved L3 Audi A8 scampered away from the US market with its tail between its legs.

    I'm not sure why you make assumptions about where a company is at based on hackers trying to figure out how things work. Their conclusions might not be complete, and they're most definitely not working with the latest developer builds.

    I can certainly understand challenging their conclusions about how something works, and helping them figure out what's there. But why twist it around to try to attack Tesla with it?

    Can you imagine working on something and getting it to the point where it can be used to log events in a shadow mode, only for someone to then analyze it without understanding its limits and intention? When I develop something, odds are I'll have one branch that works well enough to gather data, and some other branch with my latest stuff.

    What you're comparing it with is all carefully crafted and selected by solution providers like Mobileye. From them, you're not getting the kind of unfiltered data that we see in the OP's post.

    It's pretty obvious to any owner such as myself that Tesla has a long way to go. I really wish I could help out, because I'm curious why there have been so many reports of false braking with AP 2.5. I'd love to see what the car sees when a false positive happens. What's really throwing it off?

    I'm not sure comparisons to other vendors' solutions are really right for this thread. This is just a look at something these guys figured out how to get access to, and I'm grateful for their efforts.
     
    • Like x 14
    • Love x 3
    • Helpful x 1
  5. gearchruncher

    gearchruncher Member

    Joined:
    Sep 20, 2016
    Messages:
    722
    Location:
    Seattle, WA
    If they are doing this, their logic is badly broken, because crossing this line is perfectly legal and expected in WA (it's the only way in or out of an HOV lane), yet since 2018.10 I have never once been able to use AP to get in or out of an HOV lane. As far as I can tell, they have the very simple logic of "do not cross a solid line, period," which goes against the federal guidance for lane lines.
     
  6. S4WRXTTCS

    S4WRXTTCS Active Member

    Joined:
    May 3, 2015
    Messages:
    3,998
    Location:
    Snohomish, WA
    There has been a ton of reported map-download activity recently, so there is some speculation that the new maps will be required for V9.

    But I have no clue how they're going to handle HOV lanes, especially the ones on 405. You can cross a single line, but not a double line. Is it going to know what the double line even is? Is it going to know the lane is free after 7pm?

    Now, I don't really expect it to know. It probably won't cross them, just like it currently doesn't.
     
  7. Snuffysasa

    Snuffysasa Member

    Joined:
    Feb 24, 2017
    Messages:
    267
    Location:
    Minneapolis

    I do not know if Tesla is using HD maps for this application or not.

    But I do know that the system I work on does use HD maps for this application, and it works in tunnels and in situations like the ones you describe, with lanes changing direction in the middle of the day.

    "It also works in tunnels where GPS is dead."

    GPS is dead there, but when a self-driving car is using HD maps for autonomous driving, it does not use GPS for localization.
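    As a rough sketch of how lane-level localization can work without GPS (an illustrative toy with assumed names and numbers, not any vendor's actual method): dead-reckon with odometry, and keep correcting the lateral estimate against the lane-marking offsets the cameras measure, with the HD map supplying the lane geometry:

    # Sketch only: no GPS involved; odometry plus camera-measured marking offsets.
    from dataclasses import dataclass

    @dataclass
    class Pose:
        s: float        # distance travelled along the lane (m), from odometry
        lateral: float  # offset from lane centerline (m), + means left

    def predict(pose: Pose, ds: float, heading_error: float) -> Pose:
        """Dead-reckon: advance along the lane, drifting laterally with heading error."""
        return Pose(s=pose.s + ds, lateral=pose.lateral + ds * heading_error)

    def correct(pose: Pose, d_left: float, d_right: float, gain: float = 0.3) -> Pose:
        """Pull the lateral estimate toward what the camera sees.

        d_left / d_right: measured distances to the left/right markings (m).
        On the centerline both would equal half the map's lane width.
        """
        measured_lateral = (d_right - d_left) / 2.0   # + means left of centerline
        fused = (1 - gain) * pose.lateral + gain * measured_lateral
        return Pose(s=pose.s, lateral=fused)

    # Example: in a tunnel, keep updating the estimate every frame.
    pose = Pose(s=0.0, lateral=0.0)
    for ds, heading_err, d_left, d_right in [(1.5, 0.01, 1.6, 2.0), (1.5, 0.01, 1.55, 2.05)]:
        pose = predict(pose, ds, heading_err)
        pose = correct(pose, d_left, d_right)
    print(pose)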

    "why can't they get speed limits right?"

    Typically, speed limit data does not come from HD maps in ADAS systems. Also, speed limit data in maps is neither precise nor safety critical.

    That is the natural assumption
     
  8. strangecosmos

    strangecosmos Non-Member

    Joined:
    May 10, 2017
    Messages:
    1,041
    Location:
    The Prime Material Plane
    This is awesome! Clearly a lot of work went into creating this visualization, and it’s the most visceral insight we have so far into what Autopilot currently sees. It’s really fun to watch the Tesla driving around Paris with the crazy number of bounding boxes.

    I’m really excited for Autopilot v9. I hope it lives up to the hype.
     
  9. gearchruncher

    gearchruncher Member

    Joined:
    Sep 20, 2016
    Messages:
    722
    Location:
    Seattle, WA
    Can you explain this further? In most places, exceeding the speed limit is prima facie evidence that you were being unsafe. So if a self driving car doesn't get the speed limit right, and goes too fast, it is safety critical.

    Also, are you saying that the HD maps in ADAS systems are safety critical and if they are wrong the vehicle will crash?
     
  10. verygreen

    verygreen Curious member

    Joined:
    Jan 16, 2017
    Messages:
    2,383
    Location:
    TN
    I had an AP1 loaner for a week and it was absolutely unusable on a windy, hilly road I have by my house. It works perfectly on AP2, though. AP1 has other benefits, like speed sign recognition, I guess.
    Well, we mostly have those unverifiable demos from other industry members. If you have ever worked on demos yourself, you know the difference between a tech demo video vs. a live tech demo vs. a technology demo you let others try vs. technology you just let others use at any time.
    So I don't think they are lying or anything, but there's a good chance the footage being shown is selected from many examples, is the best possible case on a lucky day, and the car was eaten by a grue 10 seconds after the footage stopped ;)
     
    • Like x 7
    • Funny x 3
    • Disagree x 1
  11. AnxietyRanger

    AnxietyRanger Well-Known Member

    Joined:
    Aug 22, 2014
    Messages:
    9,451
    Location:
    EU
    Apples and oranges, though.

    You are comparing EyeQ3 being fed a single narrow camera (AP1) to AP2 being fed several cameras (including fisheye). EyeQ3 also supports several camera inputs, but Tesla does not provide them in AP1. It would be no surprise that a single narrow-FoV camera like AP1's would lose sight of things on a hilly road, for obvious reasons - such a camera may only or mostly see sky.

    Remember that Tesla did work on a two-camera EyeQ3 system (the "AP 1.5"), but shipped only the frame for it in the AP1 Model X...
     
  12. caligula666

    caligula666 Member

    Joined:
    Mar 2, 2017
    Messages:
    72
    Location:
    Switzerland
    I understand there are multiple "steps" (I think of them as a pipeline) that go from sensing (camera, LIDAR, whatever) to acting (turning the wheel, accelerating, etc.). What I'm unclear on: when I look at the footage above (congrats, btw!), it seems that if a vehicle that has been properly identified by the camera (say, on the highway) goes behind another vehicle (like a truck), the system seems to stop "tracking" it, i.e. the colored box just disappears. Where in that pipeline does the system keep inferring that, given the laws of physics, this object most likely didn't just pop out of the ether, nor did it fly over the vehicles ahead of and behind the one it is stuck between? In other words, is there a layer in the pipeline that maintains "assumptions" about where vehicles are likely to be, given past observations? This is what we do continuously as drivers, and yet I don't see any of it in the Tesla footage (either because it doesn't exist, or because it can't be observed at the layer that has been analyzed).
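    One common answer to this question (a sketch under assumptions, not Autopilot's known pipeline) is a tracking layer between detection and planning that coasts each track on a motion-model prediction while its detection is occluded, and only drops it after a long gap:

    # Sketch only: constant-velocity track coasting through occlusion.
    from dataclasses import dataclass

    @dataclass
    class Track:
        x: float         # longitudinal position (m)
        y: float         # lateral position (m)
        vx: float        # estimated velocity (m/s)
        vy: float
        missed: int = 0  # consecutive frames without a matching detection

    def update_track(track: Track, detection, dt: float, max_missed: int = 30) -> bool:
        """Predict forward each frame; correct with a matched detection, else coast."""
        # Predict: assume the object keeps doing what it was doing.
        pred_x = track.x + track.vx * dt
        pred_y = track.y + track.vy * dt
        if detection is None:
            # Occluded: keep the predicted state so the object doesn't just vanish.
            track.x, track.y = pred_x, pred_y
            track.missed += 1
            return track.missed <= max_missed   # drop the track only after a long gap
        # Matched: blend prediction and measurement (a Kalman filter would weight
        # these by their uncertainties; a fixed gain keeps the sketch short).
        gain = 0.5
        meas_x, meas_y = detection
        track.vx = (1 - gain) * track.vx + gain * (meas_x - track.x) / dt
        track.vy = (1 - gain) * track.vy + gain * (meas_y - track.y) / dt
        track.x = (1 - gain) * pred_x + gain * meas_x
        track.y = (1 - gain) * pred_y + gain * meas_y
        track.missed = 0
        return True

    # A car tracked 30 m ahead, closing at 2 m/s, disappears behind a truck for
    # about a second: the track keeps moving toward us instead of vanishing.
    t = Track(x=30.0, y=0.0, vx=-2.0, vy=0.0)
    for frame in range(10):                 # ~1 s of occlusion at 10 Hz
        update_track(t, None, dt=0.1)
    print(round(t.x, 1))                    # ~28.0 m: still being accounted for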
     
  13. jnuyens

    jnuyens Member

    Joined:
    Feb 19, 2016
    Messages:
    115
    Location:
    Belgium
    Great post, @verygreen !
    It's amazing what you've accomplished and how you spend your spare time enhancing everybody's insight into Tesla!
    Respect and keep up the good work!
     
    • Like x 5
  14. boonedocks

    boonedocks Active Member

    Joined:
    May 1, 2015
    Messages:
    1,872
    Location:
    Gainesville, GA
    That may be by design and not an error. Our HOV lanes have particular areas where you can enter and exit the HOV/PPU lanes, and if you cross the solid line there is a HUGE $$ fine. You can only enter and exit where the lines become dashes. I would NOT like them to change this, as automatic lane change could result in heavy fines if the NAV / EAP decided to change lanes. (when available lol)
     
  15. llavalle

    llavalle Member

    Joined:
    Sep 9, 2013
    Messages:
    678
    Location:
    Somewhere around Montreal in Quebec, Canada
    Pretty sure auto lane change will be an on or off option... just turn it off...
     
  16. gearchruncher

    gearchruncher Member

    Joined:
    Sep 20, 2016
    Messages:
    722
    Location:
    Seattle, WA
    I just looked, and the GDOT page says that HOV lanes use double white lines to indicate no crossing, not just a single line. This follows the federal guidance on lane lines: double white means crossing is prohibited, single white means crossing is restricted. For Tesla to obey the laws, they need to track whether it's a single or a double line.

    I agree that auto lane change can't cross a solid white line all by itself, as it doesn't understand the restriction. I just want it to cross when I ask it to, instead of pretending that the lane doesn't exist at all.
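    Put as a hypothetical sketch (not Tesla's actual logic), the distinction being asked for is a small one: gate the crossing on the boundary type plus an explicit driver request, following the single-vs-double white convention above.

    # Sketch only: lane-change permission by line type.
    from enum import Enum, auto

    class LineType(Enum):
        DASHED = auto()
        SOLID_SINGLE = auto()   # restricted: cross only deliberately (e.g. HOV entry/exit)
        SOLID_DOUBLE = auto()   # prohibited: never cross

    def lane_change_allowed(boundary: LineType, driver_requested: bool) -> bool:
        if boundary == LineType.DASHED:
            return True
        if boundary == LineType.SOLID_SINGLE:
            # Restricted crossing: allow it only on an explicit driver request.
            return driver_requested
        return False                      # double solid: refuse regardless

    # The complaint above amounts to the system treating every solid line like
    # SOLID_DOUBLE; with the distinction kept, a deliberate crossing of a single
    # solid line stays possible:
    print(lane_change_allowed(LineType.SOLID_SINGLE, driver_requested=True))  # True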
     
    • Like x 4
  17. Snuffysasa

    Snuffysasa Member

    Joined:
    Feb 24, 2017
    Messages:
    267
    Location:
    Minneapolis
    First, we need to separate whether we are talking about self-driving cars here or ADAS systems.

    In an ADAS system, the driver is responsible for ensuring safety and the proper speed, so if the system makes mistakes it is not safety critical... for example, automatic emergency braking that only works 50% of the time is not a safety-critical issue.

    For SDCs, there is a lot more being taken into account than just posted speed limits when choosing speed, even when they are the only car on the road. They would use a different kind of map to get target speeds for various segments, rather than what ADAS systems use for posted speed limits. And even in those cases the map data is not safety critical.

    I did not mean to suggest that any map data is safety critical.
     
    • Helpful x 1
  18. Bladerskb

    Bladerskb Senior Software Engineer

    Joined:
    Oct 24, 2016
    Messages:
    1,670
    Location:
    Michigan
    Again, you are comparing a four-year-old product with a 2018 product that only today (with 9.0) is finally able to match or eclipse that four-year-old product in driving performance and features, although it still lags behind on other detection networks that EyeQ3 has.


    Mobileye isn't like the other industry members, though. It would be very disingenuous to lump them together. Every single SDC company is a research lab (including Waymo) other than Mobileye. Why? Mobileye is actually SELLING its products. No other company is selling complete SDC hardware and software today. Let that sink in.

    This isn't some demo. This is real! Tesla could just as easily be using EyeQ4 now (with fully finished EAP software ready for the AP2 launch in Oct 2017, using EyeQ4) if they didn't insist on doing everything in house.

    Mobileye products ARE actually REAL! Proof in point: AP1 and Super Cruise using the four-year-old EyeQ3.
    Think about it: Mobileye was 4 YEARS AGO where Tesla is today.
    Do you think Mobileye has just been sitting on its ass doing nothing for the last 4 years?

    Or do you think Mobileye is lying to the Tier 1 companies buying its product? Because that's the only conclusion if you think the footage Mobileye is showing could be lies. Then Nissan, VW, BMW, GM and more are being fooled.

    That's like saying you can't trust a Samsung video of their OLED screen because it might be fake, yet Apple uses the same screen in their iPhone. So has Apple been fooled? Or do you think companies keep quiet while Mobileye lies about its chips' capabilities?

    Amnon has said time and time again that, unlike other companies, they don't show demoware, they don't show research projects, and they don't show anything that doesn't have a production deal. EyeQ4 is IN production. BMW cars are getting it this year. Nissan says it will have Level 3 using EyeQ4 in early 2019.

    While other companies are toying around with HD maps, Mobileye's crowd-sourced REM maps are actually IN PRODUCTION RIGHT NOW!

    This is a huge difference between Mobileye and other companies: they actually have to sell stuff to make money. You can't sell a lie (unless you are Tesla :p), because the person you sell to has already tested your chips and knows their capabilities before they sign a contract.

    Mobileye doesn't deal in theories; it deals in production deals!
     
    • Disagree x 7
    • Like x 1
  19. mongo

    mongo Well-Known Member

    Joined:
    May 3, 2017
    Messages:
    9,168
    Location:
    Michigan
    Dropping Mobileye was not a unilateral Tesla decision.
    The Tesla / Mobileye story
     
  20. croman

    croman Active Member

    Joined:
    Nov 21, 2016
    Messages:
    3,736
    Location:
    Chicago, IL
    No one uses Mobileye products in retail cars. Stop pushing the lies. It's not Beach Week. Mobileye only has demos. The Audi L3 was as much of a joke as I thought it was. No one else uses anything, because it's not ready or because the other companies are too chicken *sugar* to have confidence in the tech.
     
    • Like x 2
