gearchruncher
Well-Known Member
But let's trust the other numbers he gives, right? Clearly he's not updating this statistic.
Is Xpeng using production fleet learning with these vehicles or not? And if so, in what form exactly? What sources substantiate this?
There are lots of vehicles equipped with cameras, radar, and ultrasonics that don't upload sensor data and don't do firmware updates.
I'm reading Glassdoor reviews for "Xmotors.ai", Xpeng's R&D subsidiary in Mountain View. The reviews are dismal.
This is no different than attacking the person instead of the idea.
Elon is often considered a pretty awful person: pedo tweets, firing people he doesn't like, poor husband, lying about ex-employees, abusive to employees, takes credit for things he didn't do, anti-union, COVID denier, etc. There are whole podcasts about this. But we don't discuss that when we are discussing what Tesla as a company has actually done.
Read the reviews!
I can find every single category of complaint you found on the Tesla Glassdoor page also. So we're in agreement that Tesla can't be collecting a lot of data?
There is little visibility into Xpeng's software. I'm unsure how much of Xpeng's ADAS software they actually own, control, or developed in-house.
Xpeng's ADAS software is called Xpilot. The latest production release of Xpilot is Xpilot 3.0.
Xpilot 3.0 uses a system called IPU-03 (third-gen Intelligent Processing Unit). IPU-03 is made by a company called Desay SV Automotive. Desay seems like the equivalent of Mobileye and the IPU series seems like the equivalent of Mobileye's EyeQ series. IPU-03 isn't just chip hardware; in fact, the chip itself comes from Nvidia. As with Mobileye's EyeQ, IPU-03 appears to be an integrated hardware-software package.
No it isn't. It's a multi-domain controller: it replaces the hundreds of ECUs found in cars with a single computing unit and handles communication between the main chip (Nvidia or whatever) and the car's actuators. You either design and build your own hardware or work with Tier 1s to create one based on the chip of your choice. With Xpeng, they used Nvidia's Xavier.
IPU-03, in turn, runs Blackberry's QNX OS.
QNX is the most popular OS for providing functional safety, security, and real-time operation, and it is a standard in the auto industry. It's in over 150 million cars. SDC companies either use QNX or Linux. But to disparage Xpeng because they use the QNX secure OS is like disparaging a company because they run their software on Linux or Windows.
Then, Nvidia says:
"Development of the P7 began in Xpeng’s data center, with NVIDIA’s AI infrastructure for training and testing self-driving deep neural networks.
With high-performance data center GPUs and advanced AI learning tools, this scalable infrastructure allows developers to manage massive amounts of data and train autonomous driving DNNs.
Xpeng is also using NVIDIA DRIVE OS software, in addition to the DRIVE AGX Xavier in-vehicle compute, to run the XPilot 3.0 system. The open and flexible operating system enables the automaker to run its proprietary software while also delivering OTA updates for new driving features."
So, how much of Xpilot 3.0 did Xpeng actually develop, versus buy from suppliers, namely Desay and Nvidia?
How much of the backend infrastructure for Xpilot did Xpeng actually develop vs. purchase?
A lot of SDC companies use Nvidia Drive OS, including Zoox.
Ok, so let's discuss data collection. You claim Tesla has a wealth of data, more than anyone else, and this creates a lead in autonomy.
But what about the quality of that data? Every time I see this, I hear Tesla has X million/billion miles of data. Except all they generally have is metadata. They clearly are not uploading videos of every car driving all the time, or even just on AP. We can all tell that, since they don't upload gigabytes after every drive. Machine learning / vision systems need this raw video to do anything as the "training signal."
So why does it matter at all that Tesla has a bunch of cars out in the world? Aren't 1,000 vehicles logging data all the time with professional drivers giving feedback more useful than 1,000,000 cars just sending back disengagement metadata and a video clip now and then, with the trigger to send those video clips created by the very same network that is being trained? (A sketch of that trigger loop follows below.)
99% of my disengagements with AP are just because AP is totally incapable of doing the thing I need it to (like slowing down for a red light or moving over for stopped traffic a quarter mile ahead). How do my disengagement stats even help to show where AP needs more training, or even form a useful metric on whether changes are leading to a more useful system?
Tesla's own fracturing of Autopilot among AP/EAP/FSD with different feature sets makes this even worse. My car doesn't stop for stop signs/red lights, but other cars do! So I have way more disengagements than an FSD car, just because I didn't want to pay $5K for that feature. Now my data is also worth less to Tesla. If this data were so darn valuable to them, creating immense future value, it seems they would want as many people using as many features as much as possible, not maximizing their revenue this quarter.
I think neural nets benefit from lots of "dirty" data. Training on clean professional data isn't of much value.
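To make the trigger concern above concrete: Tesla has never published how its clip-upload triggers actually work, so the following is a deliberately simplified, hypothetical sketch (every name, field, and threshold is invented). It only illustrates why a fleet that mostly phones home counters, plus clips selected by the very network being trained, can systematically miss the failures you most want to see.

```python
# Hypothetical sketch (invented names and thresholds; Tesla's real pipeline is
# not public) of the concern above: most of what comes back is metadata, and
# the clip trigger is driven by the same network the clips are meant to improve.

from dataclasses import dataclass

@dataclass
class FrameResult:
    stop_sign_score: float    # confidence from the current production network
    driver_disengaged: bool   # metadata: did the driver take over here?

def should_upload_clip(frame: FrameResult, threshold: float = 0.5) -> bool:
    """Upload a short clip only when the current network flags something."""
    # Selection bias: if the network never fires on an unusual stop sign, this
    # trigger never fires either, so that failure mode is never uploaded and
    # never enters the training set.
    return frame.stop_sign_score > threshold

def summarize_drive(frames: list) -> dict:
    """Roughly what comes back from most drives: counters, not raw video."""
    return {
        "disengagements": sum(f.driver_disengaged for f in frames),
        "clips_uploaded": sum(should_upload_clip(f) for f in frames),
    }

# A drive where the network misses an odd stop sign and the driver takes over:
frames = [FrameResult(0.1, False)] * 99 + [FrameResult(0.1, True)]
print(summarize_drive(frames))  # {'disengagements': 1, 'clips_uploaded': 0}
```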
If I were an AV company, I would want to use an OS that is either a) my own proprietary OS or b) a free and open source OS. Or a mix of (a) and (b). I would feel uneasy about using a proprietary, closed source, licensed OS owned and controlled by another company.
Then you can't fly on aircraft or use medical devices. Huge numbers of safety critical systems rely on 3rd party operating systems that have been proven to support safety critical needs, and come pre-certified and fully supported.
Look up Wind River.
Nobody writes their own OS. It's insane. It's a total waste of developer time. Everyone needs an OS; it's one of the things you buy because it's a commodity, not a differentiator.
As for people without FSD, the FSD algorithm can still run in the background making predictions, and pull data only when it is grossly different from user action.
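What the quote above describes is usually called "shadow mode": run the planner in the background and flag moments where its output diverges from what the human actually did. Here is a minimal sketch of that comparison, assuming (and it is only an assumption; Tesla has not documented this) that divergence is scored against the driver's steering and speed.

```python
# Minimal "shadow mode" sketch. Assumption, not a documented mechanism: the car
# scores the gap between a background plan and what the driver actually did,
# and flags the moment only when that gap is large.

import math

def shadow_divergence(predicted, actual):
    """predicted/actual are (steering, speed_mps) tuples from the same moment."""
    d_steer = predicted[0] - actual[0]
    d_speed = (predicted[1] - actual[1]) / 10.0  # crude scaling to mix units
    return math.hypot(d_steer, d_speed)

def should_flag(predicted, actual, threshold=0.5):
    # The weakness raised in the reply below: a large divergence could mean the
    # network was wrong, or just that the driver left the route to get a donut.
    # Without knowing intent, there is no label saying which one was "correct".
    return shadow_divergence(predicted, actual) > threshold

# Driver turns and slows where the shadow plan would have continued straight:
print(should_flag(predicted=(0.0, 20.0), actual=(0.6, 12.0)))  # True -> flagged
```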
Only for a very narrow set of actions. Without FSD active, there is no planned route, so FSD has no idea if the user turned at that intersection because they meant to or because there was construction. If you watched a video of someone driving in the city and had no idea of their destination, how would you determine when they did something unexpected? Even if they have NAV on, how often does it get ignored because of traffic, missed turns, or new information?
For instance, if the algorithm is just using the accelerometer, but the phone also has a barometric pressure sensor, then I can use the latter to make a ground truth (or something closer to it).
Yes, everything you said makes sense up to here. The issue here is ground truth. There is no ground truth when a Tesla customer car is out driving. You have no idea if the customer disconnected because of danger or because they just wanted to go get a donut, and you have no idea if you missed that speed limit sign. Tesla drivers are not training or even testing the system every time they drive, because the feedback the system can get is minimal, and all learning systems need a feedback signal.
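The accelerometer/barometer analogy above is the standard trick of validating one sensor's inferences against a second, independent sensor that serves as a proxy ground truth. A toy version with synthetic numbers is below (nothing here is vehicle-specific); the objection in the reply is that a customer car on Autopilot has no equivalent independent reference telling it what the correct driving decision would have been.

```python
# Toy version of the quoted analogy: check an accelerometer-based "did we climb
# a floor?" guess against the barometer, which acts as the proxy ground truth.
# All numbers are synthetic; the point is only that the label comes from an
# independent sensor, which is exactly what a customer car lacks while driving.

import random

def altitude_gain_from_barometer(p_start_hpa, p_end_hpa):
    """Rough altitude change in metres; pressure falls ~0.12 hPa per metre climbed."""
    return (p_start_hpa - p_end_hpa) / 0.12

def accel_model_says_climbed(mean_vertical_accel):
    """Stand-in for the model being checked: infers a climb from accelerometer data."""
    return mean_vertical_accel > 0.05

random.seed(0)
agreements = 0
trials = 1000
for _ in range(trials):
    climbed = random.random() < 0.5
    # Two independent, noisy views of the same event:
    p_end = 1000.0 - (0.4 if climbed else 0.0) + random.gauss(0, 0.02)
    accel = (0.1 if climbed else 0.0) + random.gauss(0, 0.05)
    proxy_label = altitude_gain_from_barometer(1000.0, p_end) > 1.0  # barometer label
    agreements += (accel_model_says_climbed(accel) == proxy_label)

print(f"accelerometer model agrees with barometer label {agreements / trials:.0%} of the time")
```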
People also underestimate Tesla's increasing ability to identify and collect edge cases as the car's NN predictions improve. Once Tesla Vision can accurately position / predict all road objects, Tesla can ask the fleet for more odd situations, allowing them to tackle the trailing 9s quicker. You can imagine all sorts of odd situations:
You can imagine them, but Karpathy specifically says he hand-codes for specific detections to collect data. They teach it a few stop signs, then hope it can pick up more and more stop sign variations. They are nowhere near just having the system learn by itself and simply record and upload "odd" situations.
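For reference, the hand-coded collection Karpathy has described publicly amounts to engineer-written trigger campaigns run against the fleet. The example below is hypothetical (the fields and thresholds are invented), but it shows the shape of such a trigger: every condition is authored for one specific detection, which is quite different from the system spotting "odd" situations on its own.

```python
# Hypothetical hand-written fleet trigger, in the spirit of the trigger
# campaigns Karpathy has described publicly. Every condition here is authored
# by an engineer for one specific detection; nothing is discovered autonomously.

from dataclasses import dataclass

@dataclass
class Detections:
    stop_sign_score: float        # current network's confidence
    map_expects_stop_sign: bool   # prior from map data
    driver_braked_hard: bool      # signal from the vehicle bus

def stop_sign_campaign_trigger(d: Detections) -> bool:
    """Request a clip when the network and other signals disagree about a stop sign."""
    network_sees_sign = d.stop_sign_score > 0.7
    # Case 1: map says there should be a sign, but the network does not see one.
    missed = d.map_expects_stop_sign and not network_sees_sign
    # Case 2: network sees a sign where the map has none (false positive, or a
    # new/occluded/hand-held sign worth collecting).
    surprise = network_sees_sign and not d.map_expects_stop_sign
    # Case 3: driver braked hard but the network saw nothing.
    behaviour_mismatch = d.driver_braked_hard and not network_sees_sign
    return missed or surprise or behaviour_mismatch

print(stop_sign_campaign_trigger(Detections(0.2, True, False)))   # True: likely missed sign
print(stop_sign_campaign_trigger(Detections(0.9, True, False)))   # False: nothing odd
```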