
MobilEye's approach

Discussion in 'Autonomous Vehicles' started by lunitiks, Mar 26, 2017.

  1. Bladerskb

    Bladerskb Senior Software Engineer

    Joined:
    Oct 24, 2016
    Messages:
    1,676
    Location:
    Michigan
    Jason Hughes‏ @wk057 13 Sep 2016
    @letsgoskatepool Not quite. No way to get any real stream of data from the camera to the MCU, which is why this is only a few event frames.

    First of all, Mobileye doesn't allow access to the raw camera data.
    Second, there is no other chip to actually process the camera's visuals.
    There is no secondary chip to run an additional machine-learning process.
    All Tesla has access to is the EyeQ3's outputs, which it converts into actuator commands.

    That's it. They don't have access to the chip to do whatever they want.
    This is why their fleet-miles data consists of literally just GPS locations.

    "Our chip receives the video feed from a camera and processes this video to find vehicles, to find pedestrians, to find traffic signs and speed limit signs, to find traffic lines and also to support automated driving," Shashua says.

    It's Mobileye's chip running Mobileye's software, period.
    There is no secondary Tesla chip. How is it that you don't understand this?
     
    • Like x 2
    • Funny x 1
  2. Bladerskb

    Bladerskb Senior Software Engineer

    Joined:
    Oct 24, 2016
    Messages:
    1,676
    Location:
    Michigan
    The best GPS will get you an accuracy of 3-10 meters, not 2-3 ft. That's a FACT.
    The GPS Tesla uses is the same one every other automaker uses; these parts are industry standard.
    The Reddit post you linked is pure nonsense. I can't believe you even posted it.
     
    • Like x 1
    • Funny x 1
  3. bhzmark

    bhzmark Supporting Member

    Joined:
    Jul 21, 2013
    Messages:
    2,848
    Source? Specifically, a source proving your claim that Tesla was not able to collect and use any video from AP1.
     
  4. stopcrazypp

    stopcrazypp Well-Known Member

    Joined:
    Dec 8, 2007
    Messages:
    9,454
    #24 stopcrazypp, Mar 29, 2017
    Last edited: Mar 29, 2017
    WAAS can provide accuracy better than 1 meter (~3 ft) in the continental USA.
    Wide Area Augmentation System - Wikipedia

    If you augment that with some inertial sensors, it's possible to get even better accuracy.
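    To make that concrete, here's a minimal 1-D sketch of GPS/inertial fusion with a Kalman filter. Every number in it (time step, noise levels, the 15 m/s cruise) is an illustrative assumption, not a value from any Tesla or Mobileye system:

    ```python
    import numpy as np

    # 1-D constant-velocity Kalman filter: predict from the accelerometer,
    # correct with the (noisier) GPS fix.
    rng = np.random.default_rng(0)
    dt = 0.1                                  # 10 Hz cycle (assumed)
    F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition for [pos, vel]
    B = np.array([[0.5 * dt**2], [dt]])       # how acceleration enters the state
    H = np.array([[1.0, 0.0]])                # GPS measures position only
    Q = np.diag([0.01, 0.01])                 # process noise: IMU drift (assumed)
    R = np.array([[1.0]])                     # GPS variance ~(1 m)^2, WAAS-class (assumed)

    x = np.zeros((2, 1))                      # state: position (m), velocity (m/s)
    P = np.eye(2)                             # state covariance

    def fuse(x, P, accel, gps_pos):
        x = F @ x + B * accel                 # predict from the inertial reading
        P = F @ P @ F.T + Q
        y = np.array([[gps_pos]]) - H @ x     # innovation vs. the GPS fix
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        return x + K @ y, (np.eye(2) - K @ H) @ P

    # Simulate a car cruising at a steady 15 m/s with 1 m-sigma GPS noise.
    for k in range(100):
        truth = 15.0 * k * dt
        x, P = fuse(x, P, accel=0.0, gps_pos=truth + rng.normal(0.0, 1.0))

    print(f"filtered position sigma: {P[0, 0] ** 0.5:.2f} m")
    ```

    The filtered position uncertainty settles well below the raw 1 m GPS sigma, which is the effect a WAAS-plus-inertial setup exploits.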
     
    • Like x 1
  5. stopcrazypp

    stopcrazypp Well-Known Member

    Joined:
    Dec 8, 2007
    Messages:
    9,454
    #25 stopcrazypp, Mar 29, 2017
    Last edited: Mar 29, 2017
    He's talking about the MCU, not the vehicle controllers Tesla is using (which would have access to that video feed). The MCU receives data from the vehicle side through a gateway on the CAN bus. He noted there is not enough bandwidth to send the full video:
    “It has to send these messages over the CAN bus very quickly to save them from the camera to the MCU,” Hughes explains, “so they have to be dumbed-down resolution so that they can actually make it to the MCU before anything bad happens to it in a crash.”
    Tesla Autopilot Automatically Stores Crash-Cam Footage After a Collision | Inverse
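    The numbers behind that claim are easy to check with a quick back-of-the-envelope sketch; the camera resolution and frame rate here are rough assumptions for an AP1-class monochrome imager, and the 50% payload efficiency is likewise assumed:

    ```python
    # Why a raw camera stream cannot cross the CAN bus to the MCU.
    video_bps = 1280 * 960 * 8 * 36     # assumed monochrome stream: ~354 Mbit/s
    can_bps = 1_000_000                 # classical CAN signalling ceiling: 1 Mbit/s
    can_usable = can_bps * 0.5          # ~50% left after frame overhead (assumed)

    print(f"video:   {video_bps / 1e6:7.1f} Mbit/s")
    print(f"CAN:     {can_usable / 1e6:7.1f} Mbit/s usable")
    print(f"deficit: {video_bps / can_usable:7.0f}x over capacity")
    ```

    At hundreds of times over capacity, only heavily downscaled snapshot frames can make it across, exactly as Hughes describes.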

    The MCU referenced is the CID in this diagram. The EyeQ3, Tesla's chip, and the camera are under the "vehicle controllers" part of this diagram.
    Hacking a Tesla Model S: What we found and what we learned | Lookout Blog

    But obviously whatever chip Tesla is using that sent those frames over the CAN bus has access to the video, not simply some "outputs to actuators" as you keep characterizing it.

    So you are basically saying there are no chips between the EyeQ3 and the vehicle interfaces? I'm pretty sure whatever logic Tesla uses for GPS-based lane keeping (used when no lane lines are visible), and for geocoded radar whitelisting, requires another chip to handle it. The Autopilot visualizations on the IC (instrument cluster) also require custom Tesla software.
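    As a rough illustration of what geocoded radar whitelisting could look like (in the spirit of Tesla's public description of the v8.0 radar update), here's a minimal sketch; the tile size, threshold, and every name in it are invented:

    ```python
    # Hypothetical geocoded radar whitelist: stationary radar returns that the
    # fleet has repeatedly driven past without incident stop triggering braking.
    TILE = 1e-4               # ~10 m of latitude per tile (rough assumption)
    MIN_SAFE_PASSES = 100     # fleet pass-throughs before trusting (assumed)

    passes: dict[tuple, int] = {}   # (tile, radar_signature) -> safe pass count

    def _tile(lat: float, lon: float) -> tuple:
        # Quantize the GPS fix so nearby passes share one bucket.
        return (round(lat / TILE), round(lon / TILE))

    def record_safe_pass(lat: float, lon: float, signature: str) -> None:
        key = (_tile(lat, lon), signature)
        passes[key] = passes.get(key, 0) + 1

    def should_brake(lat: float, lon: float, signature: str) -> bool:
        # Brake unless this return is a known fixture here (overhead sign,
        # bridge) that the fleet has safely passed many times.
        return passes.get((_tile(lat, lon), signature), 0) < MIN_SAFE_PASSES
    ```

    Whatever chip maintains and queries a table like this has to sit somewhere, which is the point being argued here.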

    Just looking at the Audi zFAS board, which also uses the EyeQ3, there are plenty of other chips that can do processing (the two huge Tegra X1 chips being the most obvious).

    Also, the EyeQ3 block diagram has a video out (lower right corner). What is the point of the video out if automakers are never allowed to use it?
    Exclusive: The Tesla AutoPilot - An In-Depth Look At The Technology Behind the Engineering Marvel - Page 5 of 6
     
    • Informative x 3
    • Love x 1
  6. ariaga

    ariaga Member

    Joined:
    Mar 24, 2017
    Messages:
    19
    Location:
    Madison, WI
    To play devil's advocate: why is having HD map data an absolute requirement for full self-driving? The obvious counterpoint is that humans are able to drive without it.

    I could argue that any *critical dependency* on HD map data for a self-driving system is a good indication that it will never work that well for all the corner cases. Lane markings change all the time on a road (e.g., road construction). Signs come and go. Barriers come and go. I'd hate to rely on those from a database.

    Humans are able to drive with just their eyes and a brain containing "low res" map data (i.e., rough knowledge of where roads and places are).
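    To make the "critical dependency" worry concrete, here's a minimal, hypothetical sketch of the decision a map-reliant planner faces when the live camera view contradicts a stale map entry (repainted lanes, moved barriers); every name and threshold is invented:

    ```python
    # Lanes are modeled as lists of lateral offsets (meters) for simplicity.
    def agreement(map_lane: list[float], seen_lane: list[float]) -> float:
        # Toy similarity score: 1.0 when the geometries coincide, falling
        # toward 0 as the mean lateral offset grows.
        mean_off = sum(abs(a - b) for a, b in zip(map_lane, seen_lane)) / max(len(map_lane), 1)
        return 1.0 / (1.0 + mean_off)

    def choose_lane(map_lane, seen_lane, vision_confidence):
        if map_lane is None:
            return seen_lane                     # map gap: drive on vision alone
        if seen_lane is None:
            return map_lane                      # camera blinded: coast on the map
        if agreement(map_lane, seen_lane) > 0.9:
            return map_lane                      # map confirmed; use its precision
        if vision_confidence > 0.8:
            return seen_lane                     # map looks stale: trust live view
        return None                              # neither trusted: hand back control
    ```

    A system that cannot take the vision-only branches, because the map is a hard dependency, fails precisely where the map is wrong, which is the argument above.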
     
    • Like x 3
    • Informative x 1
  7. lunitiks

    lunitiks Cool James & Black Teacher

    Joined:
    Nov 19, 2016
    Messages:
    2,708
    Location:
    Prawn Island, VC
    Finally some f'ing real discussion on this topic
     
    • Like x 2
    • Funny x 1
  8. Tam

    Tam Well-Known Member

    Joined:
    Nov 25, 2012
    Messages:
    5,863
    Location:
    Visalia, CA
    According to Tesla's statement on the divorce:

    "MobilEye had knowledge of and collaboration with Tesla on Autopilot functionality for the past 3 years.

    Tesla has been developing its own vision capability in-house for some time with the goal of accelerating performance improvements. After learning that Tesla would be deploying this product, MobilEye attempted to force Tesla to discontinue this development, pay them more, and use their products in future hardware.

    In late July when it became apparent to MobilEye that Tesla planned to use its own vision software in future Autopilot platforms, MobilEye made several demands of Tesla in exchange for continuing supply of first generation hardware, including:
    • Raising the price of their product retroactively
    • Demanding an agreement to extremely unfavorable terms of sale and
    • Demanding that Tesla not use data that was collected by its vehicles’ cameras for any purpose other than helping MobilEye develop its products
    • Requiring that Tesla collaborate on Tesla Vision and source future vision processing from them until at least level 4"

    It sounds like MobilEye knew that Tesla could build its own "vision software" and could collect data from MobilEye's camera, so it laid out terms and conditions under which Tesla would restrict that data collection and share Tesla Vision with MobilEye.

    The quote above does not suggest that Tesla was technologically incapable of obtaining raw data from MobilEye's camera; rather, MobilEye wanted to impose terms and conditions that Tesla would not accept.

    Again, it's the legality, not the capability.
     
    • Informative x 4
    • Like x 1
  9. croman

    croman Active Member

    Joined:
    Nov 21, 2016
    Messages:
    3,755
    Location:
    Chicago, IL
    And Mobileye failed to put those restrictions into its original license with Tesla, and Tesla balked at the new terms for continuing to use HW1. Hence Tesla Vision without Mobileye.
     
    • Like x 1
  10. Bladerskb

    Bladerskb Senior Software Engineer

    Joined:
    Oct 24, 2016
    Messages:
    1,676
    Location:
    Michigan
    You can't run a safety-critical system on an off-the-shelf GPU. Those systems have to be engineered specifically for safety and fault tolerance. You also can't run a safety-critical application alongside other applications: you can't use the IC GPU or the 17-inch display GPU, because when one of those crashes after you browse to the wrong website, guess what happens to your car? It crashes too.

    All the actuator-control algorithms run on the EyeQ3 SoC using Mobileye's SDK.
    The GPS logging and the radar-signature blocklist also run on Mobileye's SoC.
    It's a complete system.

    http://www.movon.co.kr/download/board.asp?board=blog&uid=822

    Lastly, Mobileye's demand that Tesla not use the camera data has nothing to do with terms of sale; it's about IP.
    It also has nothing to do with the raw camera feed, but with the data processed by the deep-learning algorithms on the SoC.

    The fact is that Tesla's fleet-miles data consists only of GPS logging and the radar/GPS blocklist.
    It's not under debate. It's fact. It's settled.
    I don't discuss speculation; I discuss facts.
     
    • Funny x 2
    • Like x 1
  11. bhzmark

    bhzmark Supporting Member

    Joined:
    Jul 21, 2013
    Messages:
    2,848
    In all those sentences you still didn't cite a source for your claim that Tesla was not collecting and using any data from the cameras.

    In fact, stopcrazypp cited information suggesting that Tesla was collecting data, and that one of the dealbreakers in the negotiations was Mobileye wanting to limit that collection when it came time to extend/renegotiate the arrangement. Try to limit your factual claims to statements for which you have some factual basis. That is not one of them.
     
    • Like x 2
  12. stopcrazypp

    stopcrazypp Well-Known Member

    Joined:
    Dec 8, 2007
    Messages:
    9,454
    #32 stopcrazypp, Mar 30, 2017
    Last edited: Mar 30, 2017
    What "GPU" are you referring to? Note I'm not referring to the Tegra 3/4s (which are not GPUs either) handling the CID and the IC when I say there is a Tesla chip between the EyeQ3 and actuators, but a chip that falls under the "vehicle controllers" part the diagram. The CAN bus does not have enough bandwidth to handle a full video stream, so any video processing has to happen before the CAN bus (the rear view camera on the other hand may be connected directly to the CID).

    I don't have a picture of whatever board Tesla is using, but judging from the Audi board, even something like the X1 is not just a "GPU". It incorporates CPU cores just like the EyeQ3 chip: 4 ARM Cortex-A57 CPU cores + 4 Cortex-A53 CPU cores, plus 256 Maxwell GPU cores.
    Tegra - Wikipedia

    The Parker SoC in the PX2, which Tesla is using right now for its Tesla Vision AI, has a similar architecture:
    4 ARM Cortex-A57 CPU cores + 2 Denver 2 CPU cores + 256 CUDA GPU cores.
    Nvidia reveals new details on its Drive PX2 platform, Parker SoC - ExtremeTech
    So obviously this is reliable enough to do the processing for semi-autonomous driving, since Tesla is using these cores to do exactly that for AP2!

    Let's compare that to Mobileye's chip, which has 4 MIPS cores and 4 VMP cores:
    Exclusive: The Tesla AutoPilot - An In-Depth Look At The Technology Behind the Engineering Marvel - Page 5 of 6

    Per the block diagram, the EyeQ3 uses the MIPS 1004K, which is based on the 34K used in the EyeQ2. Neither the 1004K nor the 34K is some kind of special high-fault-tolerance architecture; they're actually used in a lot of set-top boxes, for example this one:
    http://www.edn.com/Home/PrintView?contentItemId=4442600
    Here are the applications listed in the datasheet for the MIPS 1004K architecture:
    Key Applications
    Digital Home:
    • Enhanced set-top boxes (STBs)
    • HD digital consumer multimedia
    • Residential gateways (RGWs)
    Enterprise Communications Infrastructure
    Network Attached Storage (NAS)
    Office Automation/Multi-Function Products (MFPs)
    • Medium/large office print/fax/scan
    https://imagination-technologies-cloudfront-assets.s3.amazonaws.com/documentation/MIPS32_1004K_1211.pdf

    Seems like you are changing the subject. My original point was only about the raw camera feed and whether Tesla has access to it. I showed that the Mobileye chip has a video out. Your link is to the older EyeQ2, but even that chip has a video out; look at page 3.

    And once again you resort to the usual tactic of stating what you are saying is "fact" when you have absolutely no evidence.
     
    • Informative x 1
  13. Bladerskb

    Bladerskb Senior Software Engineer

    Joined:
    Oct 24, 2016
    Messages:
    1,676
    Location:
    Michigan
    What are you talking about?

    I have always maintained that the only facts we have are these:

    Tesla fleet learning consists only of GPS logging and radar/GPS logging.

    GPS-based maps are not suitable for anything above Level 2 autonomy.

    Those are facts. Frankly, there is no point in further discussion, because everything else is irrelevant.

    This holds whether or not Tesla has access to the raw camera feed.
    (They do not, based on the statement from Tesla themselves: it referred to the abstracted data from the cameras, not the raw camera feed.
    And they don't have a processor to process that info anyway.)

    The difference between Tesla and other carmakers is that the others load their cars with extra sensors and processing units, something Elon would never do, in order to save cost.

    In fact, the 2015 Mercedes-Benz has more sensors than AP2, for example.
    The same can be said of Audi's cars.

    Hence you can't look at their board and say "hey, they have this, so Tesla must too."
    That's a very naive way to reach a conclusion.

    And how did AP2 get into this discussion anyway?
     
    • Like x 1
  14. Bladerskb

    Bladerskb Senior Software Engineer

    Joined:
    Oct 24, 2016
    Messages:
    1,676
    Location:
    Michigan
    Tesla has no GPU to process the data.
    Regardless, we know they are not doing any further processing, because everything they have done, they have bragged about.
    We know they collect GPS logs and radar logs.
    Whether it's fleet learning or shadow mode, we know.

    When they start collecting data from cameras and mapping out every lane, traffic light, road sign, road marking, intersection, etc. in the world, we will know about it, because Elon won't hesitate to brag about it.
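    For contrast, the camera-based fleet mapping described here would, in its simplest form, look something like the sketch below: cars report (landmark type, GPS fix), and the backend confirms a landmark only once enough independent sightings agree. The tile size and threshold are invented for illustration:

    ```python
    from collections import defaultdict

    TILE = 1e-4              # ~10 m of latitude per bucket (rough assumption)
    MIN_SIGHTINGS = 5        # independent reports before a landmark is trusted

    sightings = defaultdict(int)   # (kind, lat_tile, lon_tile) -> report count

    def report(kind: str, lat: float, lon: float) -> None:
        # Quantize the fix so nearby reports of the same sign share a bucket.
        sightings[(kind, round(lat / TILE), round(lon / TILE))] += 1

    def confirmed() -> list[tuple]:
        return [key for key, n in sightings.items() if n >= MIN_SIGHTINGS]

    # Five cars pass the same stop sign; it becomes a confirmed map entry.
    for _ in range(5):
        report("stop_sign", 42.3314, -83.0458)
    print(confirmed())
    ```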
     
    • Funny x 1
  15. bhzmark

    bhzmark Supporting Member

    Joined:
    Jul 21, 2013
    Messages:
    2,848
    So in other words you are just speculating. Okay got it.
     
    • Like x 2
  16. Kanting

    Kanting Member

    Joined:
    Apr 21, 2016
    Messages:
    657
    Location:
    Pacific Coast, US
    Another methodology difference, I think (speaking from many years working with Israeli chip-design houses): Mobileye runs complete simulations in-house, but Tesla mainly relies on you and me.
     
  17. bhzmark

    bhzmark Supporting Member

    Joined:
    Jul 21, 2013
    Messages:
    2,848
    Tesla and Mobileye worked together, in unclear proportions, to develop and roll out AP1. Since the split, Tesla, working on its own, has come up with AP2 and the functionality in the most recent update.

    What's the best that Mobileye has done since then without Tesla? Something that is on the streets and can be purchased now?
     
  18. mrkisskiss

    mrkisskiss Member

    Joined:
    Jan 26, 2017
    Messages:
    184
    Location:
    London
    I'm not sure Mobileye's success can be measured by what's on the streets right now. Neither Tesla nor Mobileye have a self-driving car on the road as of today.

    However, the split was really over the choice of technology and partners for a full self-driving system: Tesla wanted to use Tesla Vision on GPU, whereas Mobileye wanted them to wait and use EyeQ4. Everything else was just a catalyst. Although, I have to say, I understand where Mobileye were coming from when they expressed concern over Tesla's "gung-ho" attitude to safety. I mean, think about it for a second: Tesla sell premium cars with a feature (labelled as beta) that could easily kill you and others around you. My AP2 car has definitely attempted it. It's not for the faint of heart!

    One thing is for certain... Prof. Shashua is a brilliant mind - perhaps the best when it comes to this area. His involvement in academia no doubt keeps him sharp and up-to-date with the latest breakthroughs... but more importantly, his communication via numerous technical talks, interviews and press conferences is extremely reassuring. I might not know the maths behind the Mobileye system, but I feel like I understand the general problems, the overall building blocks of their solutions, and where they're at with it right now. They have 1000+ people working on it, with a few industry luminaries leading them on a very focused mission. Tesla have the guy that made Swift leading some random PhD graduates.

    It's annoyingly human of me, but after listening to hours and hours of Prof Shashua's talks, I actually have trust in Mobileye, because it's clear they know what they're doing. Add to this my own experience of the general "feeling" of driving AP1 (safe, reliable, steady) vs AP2 (nerve-wracking, volatile) and, well, I'd probably buy Mobileye if I could.

    I have no idea what the Tesla approach is... and I do wish they'd tell us. It'd make me more confident in them - and the system as a whole - if I knew what was going on over there in Fremont, or at least what the general plan is (is there one?!!)
     
    • Like x 2
    • Love x 2
    • Funny x 1
  19. lunitiks

    lunitiks Cool James & Black Teacher

    Joined:
    Nov 19, 2016
    Messages:
    2,708
    Location:
    Prawn Island, VC
  20. stopcrazypp

    stopcrazypp Well-Known Member

    Joined:
    Dec 8, 2007
    Messages:
    9,454
    I believe their efforts can certainly be measured by what is on the roads today: non-Tesla Level 2 systems using Mobileye.

    As for Tesla's approach, they talked about it in this presentation last year:
    Tesla reveals new details of its Autopilot program: 780M miles of data, 100M miles driven and more
    MIT Technology Review Events Videos - Delivering on the Promise of Autonomous Vehicles
     
    • Like x 1
