That's really impressive, because AP1 has gotten much, much better over the course of the last year. If AP2 is matching the current state of AP1, it's far better than the AP1 I had when I got the car with 7.1 on it.

In the situations where I can compare the two head to head, I see a much higher rate of success from AP2. Where AP2 still falls short is the actual driving. I guess there must be some sort of difference in whatever algorithm actually generates steering outputs because AP1 is still much smoother.
 
That's a clever approach: transfer learning from an old system to a new one. The training might take a while if you do it with the actual hardware in the loop; it's usually done algorithm-to-algorithm inside the same fast machine to save calendar time. But it would probably work, and it probably would have been a quick way to duplicate the AP1 functionality on the AP2 hardware.
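
(For the curious, a minimal sketch of what that imitation training could look like, assuming you log camera frames together with the old system's outputs. Everything here, the tiny model, the output targets, the framework choice of PyTorch, is invented for illustration.)

[CODE]
# Hypothetical sketch: train a new "AP2-style" network to imitate the outputs
# an old "AP1-style" system produced on the same camera frames.
# Model size, shapes, and targets are all invented for illustration.
import torch
import torch.nn as nn

student = nn.Sequential(             # stand-in for the new vision network
    nn.Conv2d(3, 16, 5, stride=4), nn.ReLU(),
    nn.Flatten(),
    nn.LazyLinear(3),                # e.g. lane offset, heading error, lead-car distance
)
opt = torch.optim.Adam(student.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# Fake "logged drive": random frames plus whatever the old system emitted for them.
frames = torch.rand(64, 3, 128, 256)
ap1_outputs = torch.rand(64, 3)

for step in range(100):              # imitation training loop
    pred = student(frames)
    loss = loss_fn(pred, ap1_outputs)  # penalize disagreement with the old system
    opt.zero_grad()
    loss.backward()
    opt.step()
[/CODE]

Output-level imitation like this is just classic knowledge distillation; the legal question below applies however it's implemented.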

But it's very likely that the IP agreement between ME and Tesla would have prevented this. Reverse engineering is almost always prohibited in these kinds of contracts, and you can be sure that Tesla using transfer learning to duplicate the ME functions in the face of a no-reverse-engineering clause would bring a lawsuit. On top of that, it seems likely that Tesla's ginormous self-confidence would lead them to believe they can outdo ME even without a head start.

Still, it's a great observation.

Great point about the legal agreement, didn't think about that.
 
I always had the theory that Tesla planned to run the EyeQ3 concurrently with the nVidia hardware, in an attempt to have the EyeQ3 act as training wheels. To this day, I believe the data streams output from the nVidia hardware and the EyeQ3 are of a similar (if not the same) format.

I think part of the problem arose when Mobileye refused to let their product be used to train a separate system, forcing the immediate removal of the EyeQ3 from any board that had an nVidia processor on it. I know the Audi zFAS uses both, but that's probably with a prior agreement that the EyeQ3 isn't being used to train another system.
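
(If the two streams really are near-identical in format, the "training wheels" setup could be as simple as pairing them up per frame and logging disagreements. A toy sketch; the message fields and tolerance are invented, not anything from the actual EyeQ3 or nVidia interfaces.)

[CODE]
# Toy "shadow mode": compare two perception stacks on the same frames and
# log where they disagree. All field names and thresholds are invented.
from dataclasses import dataclass

@dataclass
class LaneEstimate:
    offset_m: float       # lateral offset from lane center, meters
    curvature: float      # lane curvature, 1/m

def agree(a: LaneEstimate, b: LaneEstimate, tol_m: float = 0.2) -> bool:
    """True if the two stacks agree on lateral offset within tolerance."""
    return abs(a.offset_m - b.offset_m) <= tol_m

# Pretend paired outputs (EyeQ3, nVidia) for a few frames.
paired = [
    (LaneEstimate(0.10, 0.001), LaneEstimate(0.12, 0.001)),
    (LaneEstimate(0.05, 0.002), LaneEstimate(0.61, 0.002)),  # disagreement
]

disagreements = [i for i, (a, b) in enumerate(paired) if not agree(a, b)]
print(disagreements)  # frames worth flagging as training / debugging data
[/CODE]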
 
Mobileye is brilliant, there is no doubt. However, they must be spending 20%? 40%? of their development effort supporting multiple OEM customers with SDKs, documentation, etc.

This should/could/might create an opening for Tesla, as its own single-source supplier with a captive customer, to develop features (i.e., L2/L3/L4) faster than Mobileye. OTOH, this kind of engineering *is* what Mobileye does best, without the distraction of actually designing and building vehicles.

We as consumers get the fruits of this through choosing the best vehicle offerings as they become available.

Very exciting.
 
I'd like to inject a bit of insight into this matter from the research I've been doing reverse engineering the Subaru EyeSight camera.

Hitachi Automotive, who makes the EyeSight unit, uses custom silicon in conjunction with a general-purpose CPU to create a unit with very good safety and lane-keeping abilities. Mobileye does essentially the same thing, combining their custom silicon with several ARM cores. The reasoning is complex, but it boils down to power envelope versus compute power. Custom silicon is the way forward because no matter what nVidia can churn out (PX2, etc.), they will always be limited by power draw versus compute capability.

The advantage nVidia holds is that, in the interim, as self-driving research and programming get better, cars with their processors can be updated. Silicon cannot be updated over the air unless you happen to be using an FPGA (enter Intel/Altera). With field-programmable gate arrays you can strike a better balance on the power-draw-versus-compute issue.

The Intel move to buy Mobileye, fusing Mobileye's custom IP into silicon on an advanced process node (14/10/7 nm) with some FPGA fabric, is the prize. They will have something that meets the power envelope, the compute envelope, and the long life cycle the automotive industry demands of its component suppliers.
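
(To put rough numbers on the power-versus-compute point: using commonly quoted, approximate public figures for one Mobileye ASIC and one nVidia board, the efficiency gap looks something like this. Treat the numbers as ballpark, not official specs.)

[CODE]
# Back-of-envelope perf-per-watt, using commonly quoted ballpark figures.
chips = {
    "EyeQ4 (custom ASIC)":   {"tops": 2.5,  "watts": 3.0},
    "Drive PX2 (GPU-based)": {"tops": 24.0, "watts": 250.0},
}
for name, c in chips.items():
    print(f"{name}: {c['tops'] / c['watts']:.2f} TOPS/W")
# -> the ASIC comes out roughly 8x more efficient per watt;
#    the GPU's advantage is that it stays reprogrammable.
[/CODE]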

Musk went with Mobileye because it was state-of-the-art at that point in time. Hotz demonstrated that you can go quite far with a high-end mobile SoC leveraging the GPU, rolling out updates quickly, more or less unbounded by the limitations of fixed silicon. Musk has a team working on their own silicon. Tesla needs to own their IP to control their future Uber-clone platform. Autonomous intelligence is the differentiator among all these automotive companies.

 
That's a clever approach: transfer learning from an old system to a new one. The training might take a while if you do it with the actual hardware in the loop; it's usually done algorithm-to-algorithm inside the same fast machine to save calendar time. But it would probably work, and it probably would have been a quick way to duplicate the AP1 functionality on the AP2 hardware.

We discussed this in... the AP2.0 camera thread, I think, a while back.
I think the resultant model would be inferior if you do not have all the same source material that produced the first model. And if you do (and it's labeled accordingly), you do not need the extra step.

All the corner cases that are carefully coded into the Mobileye chip would be lost if you did not happen to capture them in your data stream.
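
(A quick back-of-envelope on why that matters: if a corner case shows up in, say, one frame in ten million, even big logged datasets can easily miss it entirely. The rates below are illustrative only.)

[CODE]
# Probability of capturing a rare corner case at least once in N logged frames.
# The per-frame rate is illustrative, not a real statistic.
p = 1e-7  # corner case appears in ~1 of every 10 million frames
for n in (1e6, 1e8, 1e10):
    p_seen = 1 - (1 - p) ** n
    print(f"{n:.0e} frames -> P(captured at least once) = {p_seen:.4f}")
[/CODE]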
 
@lunitiks I can't divulge how I know what I know. But here's a clue for you since you seem very interested.

The Mobileye chip has a raw data stream that is closed off to Tesla. Mobileye takes the raw data stream from their chip and then applies algorithms to output a much more simplified data stream from which actions can be taken. However, Mobileye's system is far from perfect... or rather, I should say Mobileye's algorithms tend to ignore some important data (consider the Joshua Brown accident). Tesla only has access to the simplified data stream... basically a data stream that's been stripped of important raw data. But with access to the raw data stream, Tesla could do so much more and improve the safety of the system (which, given Tesla's mission, is essential). Mobileye saw it a different way... very different.
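
(To make "simplified data stream" concrete: driver-assist chips typically emit compact, fixed-layout messages (object lists, lane polynomials) rather than pixels. Here's a purely hypothetical decoder for such a message; this is NOT Mobileye's actual wire format, just an illustration.)

[CODE]
# Purely hypothetical decoder for a compact "detected object" message.
# This is NOT Mobileye's real format; the layout is invented to illustrate
# how condensed a processed stream is compared to raw video.
import struct

# invented layout: id (uint8), class (uint8), distance cm (uint16),
# lateral offset cm (int16), relative speed cm/s (int16) -> 8 bytes total
OBJ_FMT = "<BBHhh"

def decode_object(payload: bytes) -> dict:
    obj_id, obj_class, dist_cm, lat_cm, rel_cms = struct.unpack(OBJ_FMT, payload)
    return {
        "id": obj_id,
        "class": obj_class,            # e.g. 0=car, 1=truck (invented enum)
        "distance_m": dist_cm / 100,
        "lateral_m": lat_cm / 100,
        "rel_speed_mps": rel_cms / 100,
    }

sample = struct.pack(OBJ_FMT, 7, 0, 4210, -35, -120)
print(decode_object(sample))
[/CODE]

A few bytes per object versus megabytes per frame of raw video: whatever the upstream algorithms decided to drop is simply gone by the time Tesla sees it.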

Mobileye under Intel seems to be very different, with a few levels of [DIY] APIs.

Open EyeQ5 API discussed starting here (CES 2018 presentation):
http://youtu.be/GQ15HWCw_Ic?t=2807

 
Fantastic stuff. Truly amazing stuff. Completely autonomous cars and trucks. Darting here and there.

So it is 2024. The Southern Wall is up, and all the truck drivers and taxi drivers stateside have lost their jobs and cannot benefit from the current zero-tax regime and hyper Wall Street growth. Conversely, there is a lot of coal dust and fake news in the air, and there is a little bit of irritation and agitation with the hordes that continue to pour in from the Northern border, eager to see autonomous cars dart here and there. How much will it take to build a Northern Wall and water barriers?
At least all the cars can see you and hear you, trapped behind them walls. I wonder what it will cost to hail one of them to take us home or to the nearest underpass.

Is technology and AI moving faster than our social systems and current crop of leaders can cope with? Just musing...
 
Why don't you watch their CES presentation, where they name them and describe what they will do (it's actually fairly impressive):

I watched the first 10 minutes. They didn't name any specific car models on sale now that use Mobileye for their lane-keeping driver assistance. The closest was a slide showing GM Super Cruise and Audi something. But it's not clear whether those are available on cars now, and if so, which ones.

So do the Cadillac Super Cruise and Audi lane-keeping products available now use Mobileye? Any other cars?

How do those compare to Tesla AP?
 
I watched the first 10 minutes. They didn't name any specific car models on sale now that use Mobileye for their lane-keeping driver assistance. The closest was a slide showing GM Super Cruise and Audi something. But it's not clear whether those are available on cars now, and if so, which ones.

So do the Cadillac Super Cruise and Audi lane-keeping products available now use Mobileye? Any other cars?

How do those compare to Tesla AP?
Cadillac Super Cruise is better than AP2 right now, but it doesn't work everywhere. It's more consistent about not trying to kill the driver. There was an article comparing them.

We tested out Tesla Autopilot and Cadillac's Super Cruise — here's which one we liked better
 
I agree with Anxiety Ranger's take. It is also what Mobileye said publicly.
Sometimes, corporations tell us the truth?
Probably.

Timeline:
April 17, 2016 - Musk retweets Josh Brown's video
May 7, 2016 - Joshua Brown accident
July 14, 2016 - Consumer Reports to Tesla: Disable automatic steering and quit calling it Autopilot
July 26, 2016 - Mobileye ends partnership with Tesla


Yesterday, GM announced a self-driving car with no steering wheel, scheduled for 2019.
"If they are granted permission from regulators and they can solve hazardous driving scenarios, the floodgates for advanced autonomous vehicles will open," said Kelley Blue Book analyst Akshay Anand.

No steering wheel.

The most likely roadblock to FSD? Government regulations, law, insurance, liability.
These are not solved, not clear, and they need to be sorted out now.

The dramatic presentation of the steering-wheel-less car is, I believe, GM/Intel shocking the sleepy public and legislators into crafting a proper legal framework. Hey, this is here now, deal with it.

Elon's space cowboy shtick does not play at all with this deadly serious, hard core reality of billions in liabilities.
In fact, it puts all of it at risk.
I am still surprised this public beta test doesn't have someone saying, "There oughta be a law".
Thankful for that, and to you owners who have avoided further incidents. Well done.
It would not take much to turn the public against it, and stop the whole show.
 
Kinda on topic I hope.

What is the difference between AP2.0/2.5/FSD hardware?
Do all cars with AP2.5 have the hardware capability for FSD at some point?
If the tech challenges of FSD require a new set of hardware, will it be installed free for everyone who bought FSD?

I have a car with AP2.5, and IF Tesla can figure out/implement FSD, I would gladly pay the $4,000 after-purchase price.
 
Cadillac Super Cruise is better than AP2 right now, but it doesn't work everywhere. It's more consistent about not trying to kill the driver. There was an article comparing them.

We tested out Tesla Autopilot and Cadillac's Super Cruise — here's which one we liked better

He tested Cadillac Super Cruise driving from NY to DC, which means on the ever-crowded divided highway I-95. And he tested the Tesla on a drive to rural Maryland. He didn't test Super Cruise on a drive to rural Maryland because he couldn't; it only works on divided, limited-access highways, an environment where Tesla AP also does best.

Apples to oranges.

And he gets facts wrong. He says that when you ignore Tesla's warnings it disables AP until you charge, when really you only need to put the car in park to reset access to AP.

Not a reliable comparison.
 
He tested Cadillac Super Cruise driving from NY to DC, which means on the ever-crowded divided highway I-95. And he tested the Tesla on a drive to rural Maryland. He didn't test Super Cruise on a drive to rural Maryland because he couldn't; it only works on divided, limited-access highways, an environment where Tesla AP also does best.

Apples to oranges.

And he gets facts wrong. He says that when you ignore Tesla's warnings it disables AP until you charge, when really you only need to put the car in park to reset access to AP.

Not a reliable comparison.

If you manage to ignore Tesla AP warnings, you're probably drunk or asleep.