> What would you characterize errors in analysis & design as? Not bugs?

Yes, not bugs.
You're right. We have a different concept of bugs.
> Yeah, I find that FSD Beta jerks the wheel a lot during turns, like it is kind of feeling its way through the turn. Also, sometimes, when coming to a complete stop at a red light where the car will need to make a turn, it turns the wheel before stopping instead of keeping the wheel straight and only turning when it starts the turn.

It *is* feeling its way through the turn (at least, that is my guess based on observations). The car starts the turn based on the path prediction, which is in turn based on the BEV generated by the cameras/NN. As the car turns, the new view causes updates to the BEV and path, and the car applies corrective steering to match the new path. What seems to happen (again, my take based on observations) is that the car tries to correct to the new path too aggressively (short-distance), rather than plotting a path that gradually moves from the old path to the new one. Since this whole process is very dependent on lighting and atmospheric conditions, it goes a long way toward explaining the variability in the car when making the same turns (again, speculation on my part, but I'm probably not far wrong).
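To illustrate the "correct too aggressively vs. blend gradually" idea speculated about above, here is a toy sketch. Everything in it (the update rule, `alpha`, the numbers) is my own illustrative assumption, not anything from Tesla:

```python
def blended_target(offset, new_offset, alpha=0.2):
    # Move only a fraction of the way toward the new path each update,
    # instead of snapping all the way at once (which would jerk the wheel).
    return offset + alpha * (new_offset - offset)

offset = 0.0      # lateral offset (m) the controller is currently tracking
new_path = 1.0    # updated BEV places the path 1 m to the side
trajectory = []
for _ in range(10):
    offset = blended_target(offset, new_path)
    trajectory.append(round(offset, 3))
print(trajectory)  # creeps toward 1.0 smoothly instead of jumping there
```

With `alpha=1.0` this degenerates to the "snap" behavior; smaller values trade convergence speed for smoothness.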
> Yes, not bugs.

In a certain sense I tend to agree. A "bug" is generally defined as a software defect that causes the software to behave in an unintended manner. But what is the "intended" manner? Ultimately, that must be some design document(s) that specify the intent of the software. So, in a certain formal sense, you cannot have "bugs" in design documents, though you can of course have flaws/errors/omissions in the design.
> Ultimately, that must be some design document(s) that specify the intent of the software.

In the old days, the Functional Design Specification (FDS) defined the software. Anything not adhering to that was a bug.
> BTW, in this case I don't think any "design" would have specified anything about the jittery, twitchy behavior of the steering wheel. But probably one of the design goals was to minimize "discomfort" (from AI Day), and that is what the jittery behavior is breaking.

That would be a pretty detailed design document, to specify all the possible behaviors of every event down to that level. A more concise and appropriate early design would be to determine a path through the drivable area and follow it. There are an infinite number of paths, so some constraints must also be given, such as minimizing the length and the curvature (this would be part of comfort), among others. Each of these is given a weight, and if the number of constraints is more than the degrees of freedom, not all of them can be satisfied simultaneously. The larger weights are given to what the developer considers most important at the current stage of development.
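The weighted-constraint idea above can be sketched in a few lines. All of the cost terms, weights, and candidate numbers here are illustrative guesswork, not the actual planner:

```python
# Toy weighted-cost path selection: score each candidate path on several
# competing criteria; the weights decide which criteria dominate.

def path_cost(length, curvature, clearance,
              w_length=1.0, w_comfort=0.5, w_safety=10.0):
    """Lower is better. 'clearance' is distance (m) to the nearest obstacle."""
    comfort_penalty = curvature ** 2             # sharp turns are uncomfortable
    safety_penalty = 1.0 / max(clearance, 0.1)   # small clearance costs a lot
    return w_length * length + w_comfort * comfort_penalty + w_safety * safety_penalty

# Candidate paths: (length in m, peak curvature in 1/m, min clearance in m)
candidates = {
    "tight":  (50.0, 0.30, 0.5),
    "smooth": (55.0, 0.10, 2.0),
}
best = min(candidates, key=lambda name: path_cost(*candidates[name]))
print(best)  # "smooth": the safer, gentler path wins despite being longer
```

Re-running with a much smaller `w_safety` flips the choice, which is the tradeoff being described: changing one weight changes which criteria get sacrificed.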
> That would be a pretty detailed design document, to specify all the possible behaviors of every event down to that level. A more concise and appropriate early design would be to determine a path through the drivable area and follow it. There are an infinite number of paths, so some constraints must also be given, such as minimizing the length and the curvature (this would be part of comfort), among others. Each of these is given a weight, and if the number of constraints is more than the degrees of freedom, not all of them can be satisfied simultaneously. The larger weights are given to what the developer considers most important at the current stage of development.

Probably (obviously) the highest weight is given to safety. But equally likely, the car is VERY cautious when it comes to safety, so it over-reacts to things that it thinks are an issue and jerks the wheel. This is mostly a sign of the maturing NN making safety predictions about other vehicles (I suspect). We currently have first-order rules like "don't stray into the predicted path of another car", but have not yet got enough second-order rules that handle the special cases where it's OK to do this. As that second-order rule set grows, it's likely we will see less jerkiness, but it will take a long time for it to get to 99%.
So clearly some weight has been given to comfort, for the path could certainly be much more tortuous. Therefore the design isn't broken; it's what they have currently spec'd. You, personally, might have assigned a higher weight to "comfort", but the assignment of weights is a tradeoff and highly interrelated. Giving comfort more weight means other criteria are not met as accurately.
I'm sure the design of the Conestoga wagon didn't specify that the trip across the country should take 9 months. Certainly everyone would have wanted it to take less time, so by your criterion the design was broken. There were faster modes of travel at the time.
> This is mostly a sign of the maturing NN making safety predictions about other vehicles (I suspect).

The car will jerk the wheel around with no other vehicles in sight.
> The car will jerk the wheel around with no other vehicles in sight.

Elon said the sensitivity for the voxel map still needs some tweaking. Perhaps FSD is detecting phantom unclassified objects because of this.
> Elon said the sensitivity for the voxel map still needs some tweaking. Perhaps FSD is detecting phantom unclassified objects because of this.

Do we have some evidence from somewhere that the voxel map is actually being used? I know it was operating in the background for some time without being used, and there were a bunch of Twitter threads about how it wasn't being used for driving at that time. But that was a while back. Do we have some direct evidence that they're actually using it in the released FSD Beta builds now (note: turned on and using it are two distinctly different things)? I did some brief poking around the other day, but in the end I didn't find anything conclusive.
> The car will jerk the wheel around with no other vehicles in sight.

Indeed... and it is just another example of the immature rule set (and other issues with path prediction I have already discussed).
> Do we have some evidence from somewhere that the voxel map is actually being used? I know it was operating in the background for some time without being used, and there were a bunch of Twitter threads about how it wasn't being used for driving at that time. But that was a while back. Do we have some direct evidence that they're actually using it in the released FSD Beta builds now (note: turned on and using it are two distinctly different things)? I did some brief poking around the other day, but in the end I didn't find anything conclusive.

I'm pretty sure they would have to be, now that they no longer use radar. Speculating, of course.
> Probably (obviously) the highest weight is given to safety. But equally likely, the car is VERY cautious when it comes to safety, so it over-reacts to things that it thinks are an issue and jerks the wheel. This is mostly a sign of the maturing NN making safety predictions about other vehicles (I suspect). We currently have first-order rules like "don't stray into the predicted path of another car", but have not yet got enough second-order rules that handle the special cases where it's OK to do this. As that second-order rule set grows, it's likely we will see less jerkiness, but it will take a long time for it to get to 99%.

But from AI Day, the planner is not rule-based; it's cost-optimization based.
> I'm pretty sure they would have to be, now that they no longer use radar. Speculating, of course.

From what I understand, this is not true, because Tesla Vision has been around for months and it was indicated (by rice_fry on Twitter, as I recall) that the voxels might exist but weren't being used for guidance at that time. There was some tweet about the pillars.
> Do we have some evidence from somewhere that the voxel map is actually being used? I know it was operating in the background for some time without being used, and there were a bunch of Twitter threads about how it wasn't being used for driving at that time. But that was a while back. Do we have some direct evidence that they're actually using it in the released FSD Beta builds now (note: turned on and using it are two distinctly different things)? I did some brief poking around the other day, but in the end I didn't find anything conclusive.

That was the impression I got from his latest (that I'm aware of) tweet on the subject. Also, there was a video of FSD avoiding overhanging branches. But I don't know for sure.
For example, this tweet does not say they are using it in FSD 10:
> How come it doesn't detect these when taking the same kind of right turn without stopping because of a stop sign?

Are you trolling now? I've already answered this question twice.
That is why I ask: what exact cost is being minimized when jerking the wheel back and forth without moving?
> Are you trolling now? I've already answered this question twice.

Doesn't make sense. What does "distance from the current path" even mean?
Distance from the current path. It's unable to effect this, though, because the car is not moving.
> The suppression of the jerking isn't happening because there is a relatively low weight to do so. They don't care at this point.

Nope. Even if something has a low weight, the cost optimizer won't ask for jerks if it's not helping with something else. What would jerking help?
> Doesn't make sense. What does "distance from the current path" even mean?

FSD calculates a path, and the car is not on that path. Let's say the path is 1 ft to the left of where your car is; then the distance to the path is 1 foot.
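For what it's worth, "distance from the current path" has a standard formalization in path tracking: cross-track error, the perpendicular distance from the car to the nearest segment of the planned path. A minimal sketch (hypothetical helper, not FSD code):

```python
import math

def cross_track_error(car, path):
    """car: (x, y); path: list of (x, y) waypoints. Returns min distance (m)."""
    best = float("inf")
    for (x1, y1), (x2, y2) in zip(path, path[1:]):
        dx, dy = x2 - x1, y2 - y1
        seg_len2 = dx * dx + dy * dy
        if seg_len2 == 0.0:
            continue  # skip degenerate (zero-length) segments
        # Project the car onto the segment, clamped to its endpoints.
        t = max(0.0, min(1.0, ((car[0] - x1) * dx + (car[1] - y1) * dy) / seg_len2))
        px, py = x1 + t * dx, y1 + t * dy
        best = min(best, math.hypot(car[0] - px, car[1] - py))
    return best

# Car sitting ~1 ft (0.3 m) to the side of a straight path along the x-axis:
print(cross_track_error((5.0, 0.3), [(0, 0), (10, 0)]))  # 0.3
```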
> Nope. Even if something has a low weight, the cost optimizer won't ask for jerks if it's not helping with something else. What would jerking help?

The optimizer isn't asking for the jerks. The jerks are a natural consequence of noise and slow speed. If you don't do something to suppress them, they will occur.
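One common way to suppress noise-driven commands at a standstill is a deadband plus a low-pass filter on the steering target. A toy sketch of that idea; the thresholds, filter, and noise model are all my own assumptions, purely illustrative:

```python
import random

random.seed(0)

def steering_command(error, smoothed, deadband=0.15, alpha=0.3):
    """Ignore errors inside the deadband; low-pass filter the rest."""
    target = 0.0 if abs(error) < deadband else error
    return smoothed + alpha * (target - smoothed)

raw, filtered, smoothed = [], [], 0.0
for _ in range(20):
    noise = random.uniform(-0.1, 0.1)  # noisy perceived offset, car stopped
    raw.append(noise)
    smoothed = steering_command(noise, smoothed)
    filtered.append(smoothed)

print(max(abs(r) for r in raw) > 0)     # True: noise jerks the raw signal around
print(all(f == 0.0 for f in filtered))  # True: deadband suppresses it entirely
```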
> You have a current fixed position. The car needs to go to another location, and needs to turn for that. The target location is a very short distance away and very easily seen. There is simply no reason why there would be a lot of uncertainty in this scenario compared to when the car is already moving.

It's not a difference in uncertainty.
> But from AI Day, the planner is not rule-based; it's cost-optimization based.

Yes, but the cost is calculated based on rules (almost by definition).
> From what I understand, this is not true, because Tesla Vision has been around for months and it was indicated (by rice_fry on Twitter, as I recall) that the voxels might exist but weren't being used for guidance at that time. There was some tweet about the pillars.

That's quite possible; as I noted, I'm speculating, as we all are to an extent.
I believe their distance estimation, which is what you'd need for Tesla Vision, doesn't have to use the voxel map. Could be wrong though. I'm just looking for the evidence from a reasonably reliable source; I'm not making claims.