strangecosmos
Non-Member
all of that could be requested
Awesome! This is really important info! This would seem to indicate that Tesla collected (in some unknown number of snapshots) the requisite data to do imitation learning.
Yes, they have a resource they can tap for training FSD when they're ready to do that
if you want to avoid biasing your model, you need representative data -- this means you need data from when nothing "interesting" is happening
And when they do tap it, it will be expensive.
And you may have noticed that they don't have a lot of cash right now.
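A toy sketch of the bias point above, with invented numbers: if you only upload frames where a trigger fired (say, braking events), the collected dataset no longer reflects the real distribution of driving states, which is exactly the bias a representative sample avoids.

```python
import random

random.seed(0)

# Hypothetical driving log: each record is (speed_mph, brake_applied).
# In ordinary driving, braking is rare (~5% of frames here, by construction).
log = [(random.uniform(20, 70), random.random() < 0.05) for _ in range(10_000)]

# Trigger-based collection: only upload "interesting" frames (braking events).
triggered = [rec for rec in log if rec[1]]

full_brake_rate = sum(b for _, b in log) / len(log)
trig_brake_rate = sum(b for _, b in triggered) / len(triggered)

print(f"brake rate in representative data: {full_brake_rate:.1%}")  # ~5%
print(f"brake rate in trigger-only data:   {trig_brake_rate:.1%}")  # 100.0% by construction
```

A model trained only on the triggered subset would conclude that drivers brake constantly; that is the sense in which non-representative collection biases the learned policy.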
Tesla has been hiring for this position since at least November 2017, then. That puts work on a full self-driving simulator at roughly a year before Amir's article, at least.
(Source: BMW technology: How the carmaker is developing its autonomous driving system)
Sure, BMW says it's doing imitation learning on an engineering fleet of 80 cars, but not on hundreds of thousands of production cars. Similarly, Waymo is doing imitation learning on its hundreds of engineering cars, but not on any production cars.
Tesla also collects some raw sensor data from production cars, as Karpathy has discussed at length and as verygreen's hacking has confirmed. The true scale of collection is hard to know.
We don't have insight into the scale of Tesla's data collecting ...
I think navigation and driving are different problems. I believe the navigation and route is handled by a traditional GPS navigation system like Google Maps or TomTom or whatever. I believe supervised imitation learning only comes in at the level of discrete driving tasks, like taking a right turn at an intersection, or taking an exit off a highway, etc.
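A hypothetical sketch of that split of responsibilities (all names and numbers here are invented for illustration): a conventional router emits a discrete maneuver command, and a policy trained by supervised imitation learning maps (observation, command) to control.

```python
from typing import NamedTuple

class Control(NamedTuple):
    steering: float   # -1.0 (full left) .. 1.0 (full right)
    throttle: float   # 0.0 .. 1.0

def route_planner(position: str, destination: str) -> str:
    """Stand-in for a traditional GPS navigator: returns the next maneuver."""
    table = {("main_st", "mall"): "turn_right", ("highway", "mall"): "take_exit"}
    return table.get((position, destination), "follow_lane")

def learned_policy(observation: dict, command: str) -> Control:
    """Stand-in for a network trained by supervised imitation learning,
    with one head (or set of weights) per discrete driving task."""
    heads = {
        "turn_right": Control(steering=0.4, throttle=0.2),
        "take_exit": Control(steering=0.1, throttle=0.3),
        "follow_lane": Control(steering=0.0, throttle=0.5),
    }
    return heads[command]

cmd = route_planner("main_st", "mall")
ctrl = learned_policy({"camera": "..."}, cmd)
print(cmd, ctrl)  # turn_right Control(steering=0.4, throttle=0.2)
```

The point of the structure is that the learned component never has to solve routing; it only has to execute whichever discrete task the router hands it.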
Handling other humans. These systems work great when people follow the rules, but one idiot doing something illegal, or making a completely unexplained decision, requires human intervention.
We do, we already estimated based on @verygreen's research that only about ~0.1% of data driven are actually uploaded.
This is an oversimplification, but 0.1% is also a great exaggeration.
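To put a figure like ~0.1% in perspective, here is a back-of-the-envelope calculation; the fleet size and per-car mileage are illustrative guesses, not reported numbers.

```python
fleet_size = 400_000         # illustrative guess, not a reported number
miles_per_car_per_day = 30   # illustrative guess
upload_fraction = 0.001      # the ~0.1% estimate under discussion

fleet_miles_per_day = fleet_size * miles_per_car_per_day
uploaded_miles_per_day = fleet_miles_per_day * upload_fraction

print(f"{fleet_miles_per_day:,.0f} fleet miles/day driven")    # 12,000,000
print(f"{uploaded_miles_per_day:,.0f} miles/day represented in uploads")  # 12,000
```

Even at 0.1%, a large fleet yields a nontrivial daily sample; the dispute above is over whether that sample is the *right* 0.1%.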
It's probably not super sensible to collect regular driving data from customer cars -- you can just pay a dedicated driver some money and get a better product (with additional inline annotations).
Well, the outliers are still valuable, if your perception system is good enough to recognize them and you then collect all of them (something that does not happen).
The problem with this is, if this is true, then @strangecosmos's entire thesis falls apart.
The lack of redundancy in various systems likely means this will never actually pass the liability hurdles and so Tesla will never allow it, but that's a separate issue.
hw2.5 adds a bunch of redundancy, of course. And additionally, hw3 does on the internal level.
Well, the outliers are still valuable, if your perception system is good enough to recognize them and you then collect all of them (something that does not happen).
What good is your 0.01% probability trigger for something that happens once a year to a single car in your fleet? So the argument would just shift to "in the future it's still a valuable capability", I suspect.
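A quick sketch of why a low-probability trigger struggles with very rare events: with invented numbers, even a tiny false-positive rate swamps a once-a-year event, so almost everything uploaded by the trigger is noise.

```python
# All numbers are illustrative, not measurements from any real fleet.
frames_per_year = 10**9          # hypothetical fleet-wide frame count
true_events = 1                  # "happens once a year to a single car"
false_positive_rate = 0.0001     # trigger also fires on 0.01% of ordinary frames

false_positives = frames_per_year * false_positive_rate   # 100,000 frames
precision = true_events / (true_events + false_positives)
print(f"uploads that are the real event: {precision:.6%}")  # ~0.001%
```

One genuine sample buried in ~100,000 false alarms is not much of a training resource, which is the force of the objection above.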
There is very little in terms of sensor redundancy as well as known blindspots for the cameras at low speeds.
You still have only one forward-facing radar, and many key areas are only within view of a single camera. I don't know how much redundancy they have in power delivery, transmission, and actuation either, nor how much fault tolerance in those systems. If the compute system goes down or haywire, do the actuators enter a fail-safe state, or do they go haywire? Is there any kind of monitoring system to ensure correct operation of every component? These are the things you need to consider if you're going to take liability away from the human. Do they truly have enough compute power available to bring the car to a safe stop if the HW3 chip stops working? Do they even have any way to know that it's not working anymore? What kind of "internal redundancy" does the HW3 chip itself have?
There is neither sensor redundancy nor sensor diversity. Basically, if any single sensor fails, it is game over -- and this includes the sensor working nominally (sending camera frames) but the software failing to detect objects in the frames due to a flaw in the ML model (a problem which could be fixed by sensor diversity -- i.e., multiple sensing modalities, which they only have in front of the vehicle via radar and camera).
In almost all cases, the car can move to the side of the road with a sensor missing. If redundancy is required, then that is what it would need to do anyway (redundancy lost).
There is very little in terms of sensor redundancy as well as known blindspots for the cameras at low speeds.
What blindspots? (Outside of directly in front of the car.)
Many key areas are only within view of a single camera.
That's OK: when a camera failure is detected, the car stops and needs to be fixed before the functionality is allowed again. They already do it.
Do they truly have enough compute power available to bring the car to a safe stop if the HW3 chip stops working?
On hw2.5, the steering actuator has two paths. hw3 consists of two compute nodes, one monitoring the other. If the primary fails, the secondary is the same and is able to take over.
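A toy sketch of the "one node monitors the other" failover pattern described here; the class names, timings, and interfaces are invented for illustration, not Tesla's actual design.

```python
import time

class Node:
    """A compute node that periodically publishes a heartbeat."""
    def __init__(self, name):
        self.name = name
        self.last_heartbeat = time.monotonic()
        self.alive = True

    def heartbeat(self):
        if self.alive:
            self.last_heartbeat = time.monotonic()

HEARTBEAT_TIMEOUT = 0.05  # seconds; illustrative

def active_node(primary, secondary, now=None):
    """The secondary watches the primary's heartbeat and takes over on timeout."""
    now = time.monotonic() if now is None else now
    if now - primary.last_heartbeat > HEARTBEAT_TIMEOUT:
        return secondary
    return primary

primary, secondary = Node("A"), Node("B")
print(active_node(primary, secondary).name)   # A: primary healthy

primary.alive = False                          # simulate primary failure
time.sleep(HEARTBEAT_TIMEOUT * 2)              # heartbeat goes stale
print(active_node(primary, secondary).name)   # B: secondary has taken over
```

Whether this kind of monitor-and-takeover scheme satisfies the liability concerns raised earlier is exactly what the thread is debating.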
What kind of "internal redundancy" does the HW3 chip itself have?
What do you mean by "HW3 chip"? They have four TRIP chips, two per compute node.
The thesis is that autonomous driving is enabled by training NNs on millions or billions of miles of state-action pairs that only Tesla is currently positioned to collect — so the thesis goes.
The bolded part (I bolded it) is not wrong. Tesla is positioned very well to collect this data. Not so sure about the actual usefulness of it, but collection could be done just fine.
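For concreteness, one "state-action pair" in the sense used here might look like the following; the field names and values are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class StateActionPair:
    camera_frames: list        # state: a snapshot of sensor data
    speed_mps: float           # state: vehicle speed at that instant
    steering_angle_deg: float  # action: what the human driver did
    accel_pedal_pct: float     # action: pedal position

sample = StateActionPair(
    camera_frames=["<jpeg bytes>"],
    speed_mps=13.4,
    steering_angle_deg=-2.0,
    accel_pedal_pct=12.0,
)
print(sample.steering_angle_deg)  # -2.0
```

Imitation learning then amounts to fitting a model that predicts the action fields from the state fields across millions of such records.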
Directly in front of the car is important, particularly for driveway/parking lot navigation. Imagine toddlers or pets close to the car. I bet there are lots of blind spots, and ultrasonic is pretty unreliable and imprecise.
Well, not so important on, say, the highway, though.