1. So they operate completely independently of each other. Different dev teams too?
2. So AP2 was just a stopgap? Is this the reason its execution capacity has yet to be reached, and will it ever be, or is the goal to move to FSD ASAP? If FSD replaces AP2, does AP2 become unsupported? What if I'm not interested in FSD but simply enjoy using APx?
3. Both my AP1 and AP2 cars appear to be reading speed limits, fwy & route numbers, etc. For example, I've had my car pick up the 110 fwy number as a 110 speed limit, and it slows where temporary construction zones have reduced speeds posted. So if the collective APs are not doing this, what is?
4. Are they actually enhancing AP1, so our AP1 cars can expect future improvements (within HW limitations) while also piggybacking on nVidia? Or are they enhancing the AP1 code temporarily for AP2 (to be replaced by FSD), or solely for FSD?
I'd also like your opinion on the lifecycle of AP1 and AP2: in other words, what the intended max capabilities might be before Tesla fully focuses on the next best thing, be it FSD or Mars. And lastly, do you view competitors as having tech advantages or disadvantages compared to the circuitous dev process at Tesla? Thanks.
1. This is my assumption, based on the fact that Tesla was forced to develop AP2 in a hurry and that the people working on autonomous driving would be more specialized AI/machine-learning people. There is a lot of overlap in the image-processing vision systems, but almost no other overlap, as AP does not identify things like street signs and read them to know exactly what they mean. AP only sees the lane markings/curb and the car in front of it. As most will note, everything is "a car in front of you," so they are not wasting time trying to figure out exactly what class of vehicle is ahead.
2. Yes, sadly. Elon stated clearly that they wanted to run EyeQ3 development in conjunction with HW2. Then came the crash and the falling-out between Mobileye and Tesla, which in part had nothing to do with the crash but with the fact that Tesla was developing a competing solution. All of this forced them to hack together what we lovingly call AP2 on HW2, when really it's AP0.5, as some have noted.
3. Neither reads signs; they rely on speed limits from the maps for that area.
4. HW1 doesn't have anything to do with nVidia. What nVidia is supplying is raw processing power plus some machine-learning and vision tech to accomplish AP2.0 features. I don't believe FSD will utilize much of what nVidia supplied; it will use Tesla's own machine-learning algorithms, which have been in the works for some time. Elon didn't just wake up in early 2016 and say, "Wow, we can just slap nVidia's GPUs in our cars and make them fully autonomous with nVidia's software." They have been working on this for a long time, since AP1 work started. Remember, they have billions of AP1 miles stored that can be used for training the machine. Now they also have millions of HW2 miles that include the different camera views.
Not quite, because all the raw image processing was done by Mobileye on their chip.
There are over 2 million cars with that hardware in them, and none has the abilities of the 150,000 or so HW1 cars. I don't really know where the line is drawn between what Mobileye has done and what Tesla has done as it relates to AP1, but wherever that line is, it's significant. As someone posted: just list the cars in production today that have AP1 or AP2 capabilities. I'll give you a hint: there are none.
This is actually false. While HD maps can be created with lidar AND cameras, they cannot be updated by radar. Radar has a very difficult time reading stationary objects and simply ignores them. Radar is essentially 1D: it gives you speed and direction, and you can't create a point cloud (like lidar's) with it. When Tesla starts developing their HD maps, they will leverage the front-facing cameras.
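To make the lidar-vs-radar distinction above concrete, here is a toy sketch (all structures are hypothetical, not any vendor's actual format): a lidar return localizes a surface point in 3D, so accumulating returns yields a point cloud suitable for map building, while a classic automotive radar return is essentially range plus Doppler speed, with no precise elevation, which is why a stationary overhead sign is hard for it to interpret.

```python
from dataclasses import dataclass
import math

@dataclass
class LidarReturn:
    """One lidar measurement: a precise direction plus range."""
    azimuth_deg: float
    elevation_deg: float
    range_m: float

    def to_xyz(self):
        # A known beam direction and range pin down a 3D surface point,
        # so many returns accumulate into a point cloud.
        az = math.radians(self.azimuth_deg)
        el = math.radians(self.elevation_deg)
        x = self.range_m * math.cos(el) * math.cos(az)
        y = self.range_m * math.cos(el) * math.sin(az)
        z = self.range_m * math.sin(el)
        return (x, y, z)

@dataclass
class RadarReturn:
    """One radar measurement: range and closing speed (Doppler),
    with no precise elevation, so no 3D geometry can be rebuilt."""
    range_m: float
    radial_speed_mps: float

# A lidar sweep produces geometry; radar returns cannot.
point_cloud = [LidarReturn(10.0 * i, 2.0, 50.0).to_xyz() for i in range(36)]
```

The contrast is the point: `LidarReturn.to_xyz` exists, while there is no meaningful `RadarReturn.to_xyz`, which is the "you can't create a point cloud with it" claim in code form.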
I agree: lidar is used to create the HD maps and they cannot be updated with radar, but radar can detect landmarks that can then be shared. An example is how HW2 would suddenly decelerate when it saw an overhead road sign on a freeway. Those landmarks were seen by the radar but not fully baked into the HD maps, probably in places where HD mapping had not been completed. Enough cars see the same sign and all of a sudden it becomes whitelisted. This could be a person who reviews these situations and manually whitelists the location, or it could be learned over time, or a combination.
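The fleet-whitelisting idea described above can be sketched in a few lines. This is purely a hypothetical illustration of the mechanism (not Tesla's implementation); the class name, threshold, and keying by rounded location are all assumptions.

```python
from collections import defaultdict

# Assumed threshold: distinct cars that must report a spot before it is
# auto-whitelisted as a benign landmark (e.g. an overhead freeway sign).
WHITELIST_THRESHOLD = 50

class LandmarkWhitelist:
    """Fleet-sourced suppression of phantom braking at known-benign spots."""

    def __init__(self, threshold=WHITELIST_THRESHOLD):
        self.threshold = threshold
        self.reports = defaultdict(set)  # location -> set of reporting car IDs
        self.whitelisted = set()

    def report(self, location, car_id):
        """A car reports a radar return at `location` that proved harmless
        (the driver did not brake and nothing was hit)."""
        self.reports[location].add(car_id)
        # Count distinct cars, not raw reports, so one car's repeated
        # passes over the same sign can't whitelist it alone.
        if len(self.reports[location]) >= self.threshold:
            self.whitelisted.add(location)

    def should_brake(self, location):
        """Brake for stationary radar returns except at whitelisted spots."""
        return location not in self.whitelisted
```

A human reviewer, as the post suggests, would simply be another path that adds entries to `whitelisted` directly.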
Nothing Reciprocity said suggests they have any insider information. Instead, it's a solid post that summarizes how someone who has clearly read up a lot on self-driving cars interprets it all.
Which is great because it allows others who have done the same to nitpick it.
True, no inside info, just a lot of reading/listening. Everything I have noted is speculation, but it's not without a lot of reading behind it. I purchased the Model X HW2 because I believed Elon when he said he considers autonomous driving basically solved. I think what he means is that it's now just a matter of getting enough miles to teach the machine how to drive, and enough data to prove the system is safer than a human. They should be able to process the millions/billions of miles of video data to show that the car would make better decisions than humans, yielding fewer accidents. They did something similar when they showed a graph of where drivers were in the lane vs. AP1, and AP1 was something like 4x more likely to stay in the center of the lane than the human driver. I honestly don't think that stat is all that valuable in terms of safety, because people often give more room to larger vehicles and cement walls. But it could be valuable if you analyzed millions of miles and validated that the autonomous car would have made the same decisions as people in most cases, and would have avoided accidents in other situations.
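The kind of lane-position comparison mentioned above reduces to a simple statistic. This sketch is illustrative only: the 0.2 m "centered" band and both offset samples are made-up numbers, not Tesla data.

```python
def fraction_centered(offsets_m, band_m=0.2):
    """Fraction of lateral-offset samples (meters from lane center)
    that fall within +/- band_m of center."""
    inside = sum(1 for o in offsets_m if abs(o) <= band_m)
    return inside / len(offsets_m)

# Hypothetical sampled offsets from lane center while driving.
human = [0.05, -0.31, 0.42, -0.12, 0.55, -0.48, 0.09, 0.33]
ap1   = [0.02, -0.04, 0.10, -0.07, 0.05, -0.11, 0.03, -0.02]
```

Comparing `fraction_centered(ap1)` against `fraction_centered(human)` is the shape of the "4x more likely to stay centered" claim; the post's caveat still applies, since staying dead-center is not the same as choosing the safest lateral position.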
I think his timeline is aggressive, but we will see a coast-to-coast drive that is fully autonomous by the end of the year. My guess is that Elon is actually driving a car with basic FSD features today, so he is seeing firsthand how far along the software is at this point, something similar to the video they put out in Oct. 2016.