Jason Hughes @wk057 13 Sep 2016 @letsgoskatepool Not quite. There's no way to get a real stream of data from the camera to the MCU, which is why this is only a few event frames. Mobileye doesn't expose raw camera data.

Second, there is no other chip to actually process the camera's visuals, and no secondary chip to run an additional machine-learning process. All Tesla has access to is converting the outputs from the EyeQ3 into actuator commands. That's it. They don't have access to the chip to do whatever they want, which is why their fleet-miles data consists of literally just GPS locations.

"Our chip receives the video feed from a camera and processes this video to find vehicles, to find pedestrians, to find traffic signs and speed limit signs, to find traffic lines and also to support automated driving," Shashua says.

It's Mobileye's chip, running Mobileye's software. Period. There are no secondary Tesla chips. How do you not understand this?
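To make the architecture being described concrete, here is a minimal sketch of what the automaker-side layer would look like under that arrangement. Every name here is hypothetical (the real EyeQ3 interface is proprietary and not public); the point is only that the car-side code consumes structured detections from the vision chip and maps them to actuator commands, and never touches pixels:

```python
from dataclasses import dataclass

# Hypothetical, simplified stand-in for what the vision chip emits:
# structured results (lane position, lead vehicle), never raw frames.
@dataclass
class EyeQ3Output:
    lane_center_offset_m: float   # lateral offset from lane center, meters
    lead_vehicle_dist_m: float    # distance to the vehicle ahead, meters

def to_actuator_commands(out: EyeQ3Output,
                         target_gap_m: float = 30.0) -> dict:
    """Map the chip's processed outputs to actuator commands.

    This is the only layer the post says the automaker controls:
    it consumes EyeQ3 results and produces steering/throttle values.
    """
    # Proportional steering correction back toward lane center.
    steering = -0.1 * out.lane_center_offset_m
    # Ease off the throttle as the lead vehicle closes inside the gap.
    throttle = max(0.0, min(1.0, out.lead_vehicle_dist_m / target_gap_m))
    return {"steering": steering, "throttle": throttle}

print(to_actuator_commands(EyeQ3Output(0.5, 15.0)))
```

Note that nothing in this layer could run a second vision model: there is no image input to feed one.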