You can't run a safety-critical system on an off-the-shelf GPU. They have to be specially engineered with safety criticality and fault tolerance in mind. You also can't run a safety-critical application alongside other applications. You can't use the IC GPU or the 17-inch display GPU, because when those GPUs crash because you went to the wrong browser, guess what happens to your car? It crashes.
All of the car's actuator algorithms run on the EyeQ3 SoC using Mobileye's SDK.
The GPS logging and the radar-signature blocklist program also run on Mobileye's SoC.
It's a complete system.
What "GPU" are you referring to? Note I'm not referring to the Tegra 3/4s (which are not GPUs either) handling the CID and the IC when I say there is a Tesla chip between the EyeQ3 and actuators, but a chip that falls under the "vehicle controllers" part the diagram. The CAN bus does not have enough bandwidth to handle a full video stream, so any video processing has to happen before the CAN bus (the rear view camera on the other hand may be connected directly to the CID).
I don't have a picture of whatever board Tesla is using, but judging from the Audi board, even something like the Tegra X1 is not a "GPU". It incorporates CPU cores just like the EyeQ3 chip: 4 ARM Cortex-A57 CPU cores + 4 Cortex-A53 CPU cores, plus 256 Maxwell GPU cores.
Tegra - Wikipedia
The Parker SoC in the PX2 that Tesla is using right now to run its Tesla Vision AI uses a similar architecture:
4 ARM Cortex-A57 CPU cores + 2 Denver 2 CPU cores + 256 CUDA GPU cores.
Nvidia reveals new details on its Drive PX2 platform, Parker SoC - ExtremeTech
So obviously this is reliable enough to do the processing for semi-autonomous driving, since Tesla is using these cores to do exactly that for AP2!
Let's compare that to Mobileye's chip, which has 4 MIPS cores and 4 VMP cores:
Exclusive: The Tesla AutoPilot - An In-Depth Look At The Technology Behind the Engineering Marvel - Page 5 of 6
From the block diagram, the EyeQ3 uses the MIPS 1004K, which is based on the older 34K used in the EyeQ2. The MIPS 1004K or 34K is not some kind of special high-fault-tolerance architecture. It's actually used in a lot of set-top boxes, for example this one:
http://www.edn.com/Home/PrintView?contentItemId=4442600
Here are the applications listed in the datasheet for the MIPS 1004K architecture:
Key Applications
Digital Home:
• Enhanced set-top boxes (STBs)
• HD digital consumer multimedia
• Residential gateways (RGWs)
Enterprise Communications Infrastructure
Network Attached Storage (NAS)
Office Automation/Multi-Function Products (MFPs)
• Medium/large office print/fax/scan
https://imagination-technologies-cl...onaws.com/documentation/MIPS32_1004K_1211.pdf
http://www.movon.co.kr/download/board.asp?board=blog&uid=822
Lastly, Mobileye's demand that Tesla not use their camera data has nothing to do with terms, but rather with IP.
It also has nothing to do with the raw camera feed, but with the data processed by the deep-learning algorithms on the SoC.
The fact is that Tesla's miles data consists only of GPS logging and the radar/GPS blocklist.
It's not under debate. It's fact. It's settled.
I don't discuss speculation, I discuss facts.
Seems like you are changing the subject. My original point was only about the raw camera feed and whether Tesla has access to it. I showed that the Mobileye chip has a video out. Your link is about the older EyeQ2, but even that chip has a video out; look at page 3.
And once again you resort to the usual tactic of stating what you are saying is "fact" when you have absolutely no evidence.