I solve this issue by not driving myself at all.

You are correct... but missing out on "safe" driving because you are in FSD/AP mode can hurt your score by magnifying the "unsafe" aspects of your driving. If you miss out on a bunch of safe driving because you are on AP/FSD most of the time, and then make one or two "bad turns" when you don't have FSD/AP engaged, your score will be worse than it would have been if you had not used AP/FSD at all.
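To make the "magnifying" effect concrete, here is a toy sketch. This is NOT Tesla's actual Safety Score formula; it just assumes a simplified model where only manually driven miles are scored, so the same two incidents weigh far more heavily on a mostly-AP/FSD day.

```python
# Toy illustration (not Tesla's real formula): penalize incidents
# per 100 manually driven miles, starting from a score of 100.

def toy_score(manual_miles: float, events: int) -> float:
    """Hypothetical score: 100 minus a penalty proportional to
    incidents per 100 manually driven miles."""
    if manual_miles == 0:
        return 100.0
    events_per_100mi = events / manual_miles * 100
    return max(0.0, 100.0 - 10.0 * events_per_100mi)

# Same 2 "bad turns", different amounts of manual driving:
print(toy_score(manual_miles=100, events=2))  # all-manual day -> 80.0
print(toy_score(manual_miles=10, events=2))   # mostly AP/FSD day -> 0.0
```

Under this toy model, fewer manual miles means each incident dominates the denominator, which is the effect being described.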
If you are a cautious driver, using FSDb actually hurts you. When I was attempting to qualify for FSD, all my deductions occurred when FSDb did something I was not comfortable with; I disengaged, and the car immediately complained about the incident. There should be a waiting period after disengaging before determining an infraction occurred, to allow the driver to correct the error made by FSDb. Many deductions were from FSDb allowing a merging vehicle to enter the highway very close ahead. When I disengaged to back off, I received a following-too-close warning.

Correct me if I'm wrong, but miles driven on FSDb don't count negatively toward your Safety Score. (In other words, they can only help your score, they can't hurt it.) I drive 90% of my miles on AP/FSD... on the days I stick to 99% of miles on AP/FSD, I almost always score a 100. On the days I disengage a lot, my score suffers.
(For those confused, this is for those FSDb users who also have Tesla Insurance.)
I find it weird you’d mention Flat Earthers, but one thing they don’t handle well is uncomfortable truths that challenge their dearest-held beliefs. Let’s see how you handle some uncomfortable facts that challenge your beliefs…

The book of life. Anyone who is paying any attention to tech, rather than just living in the Tesla bubble, would know that Google created the first NPU AI training chip in 2015, called the TPU, and is on v4 right now. Elon likes claiming "first" in everything, which his fanatics like @MrTemple blindly regurgitate. Which is like being a Flat Earther.
While improving the use of regen is a valid goal, I'd probably rank it close to the bottom of items that need improvement.
Same feels...I want mine waaaaaaaaaahhhhhhhh!
In your professional opinion, would it be possible for Tesla to repurpose Dojo to help solve the stopping problem? Or is that what it is for?

It’s fairly well understood by those who understand what Dojo is. And honestly it takes a bit more than a bullet point to explain to lay people.
While I agree with you in general, for my daily commute I know exactly where FSD is bad and where it's fine. I can proactively disengage in the few instances where I'm approaching the bad spots so as to not get dinged. Using this method, I have a consistent 99-100 daily Safety Score on days featuring only my daily commute.

If you are a cautious driver, using FSDb actually hurts you. When I was attempting to qualify for FSD, all my deductions occurred when FSDb did something I was not comfortable with; I disengaged, and the car immediately complained about the incident. There should be a waiting period after disengaging before determining an infraction occurred, to allow the driver to correct the error made by FSDb. Many deductions were from FSDb allowing a merging vehicle to enter the highway very close ahead. When I disengaged to back off, I received a following-too-close warning.
There is a short grace period, I believe, after disengaging FSDb where what you do doesn't ding you. I can't find a description of it anywhere right now, though.

If you are a cautious driver, using FSDb actually hurts you. When I was attempting to qualify for FSD, all my deductions occurred when FSDb did something I was not comfortable with; I disengaged, and the car immediately complained about the incident. There should be a waiting period after disengaging before determining an infraction occurred, to allow the driver to correct the error made by FSDb. Many deductions were from FSDb allowing a merging vehicle to enter the highway very close ahead. When I disengaged to back off, I received a following-too-close warning.
Back when I had a SS to worry about, I had no clue how you would ever trigger the aggressive turning. It didn’t matter how aggressively I took a turn, it wouldn’t register in my SS.

Bleh, I used Full Self-Driving for most of my drive today... since it worked a lot better and I didn't have to disengage it a lot in my neighborhood, I missed out on taking quite a few low-speed turns this morning. As a result, my Safety Score suffered... a score of 94 this morning as opposed to my typical 99 or 100 for my morning drive.
In your professional opinion, would it be possible for Tesla to repurpose Dojo to help solve the stopping problem? Or is that what it is for?
Just spitballing here.
If this works, it would immediately justify the Dojo investment.
So you think the chances are high that they can train the neural net to measure distances with low jitter, and have the results consistently decrease monotonically with time, under a wide range of conditions?

So, short answer: Dojo will offer MUCH faster AI and NN development, which will help the “stopping problem”.
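For readers unsure what "low jitter, monotonically decreasing" means here, this is a small sketch of the property being asked about. The sample distance values are made up; a real check would run over the network's actual depth estimates while closing on a stopped object.

```python
# Sketch: distance-to-object estimates, frame by frame, while
# approaching a stop. "Good" estimates shrink smoothly; "jittery"
# ones sometimes jump back up between frames.

def is_monotone_decreasing(d, tol=0.0):
    """True if each estimate is no larger than the previous (within tol)."""
    return all(b <= a + tol for a, b in zip(d, d[1:]))

def max_jitter(d):
    """Largest frame-to-frame *increase* while closing distance."""
    return max((b - a for a, b in zip(d, d[1:])), default=0.0)

smooth = [40.0, 35.2, 30.1, 25.0, 20.3]   # what you want
jittery = [40.0, 36.5, 38.2, 27.0, 29.1]  # estimate jumps around

print(is_monotone_decreasing(smooth))         # True
print(is_monotone_decreasing(jittery))        # False
print(round(max_jitter(jittery), 1))          # 2.1 (meters of backward jump)
```

Training a net so that its estimates satisfy a check like this across a wide range of conditions is the hard part the question is pointing at.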
10.69 should have better lane selection, drive smoother, understand unprotected turns better (visualizing a blue creep limit/wall), including the ability to wait in a median crossover region (visualizing a blue area), and avoid arbitrary objects (visualizing gray blobs).

Can someone summarize what to look for on this new update?
Indeed, as with many things, timing might have just lined up for the video occupancy network to be introduced with 10.69 even though there had been ongoing exploration for years. As you suggest, there could have been an increase in labeling/training compute capacity, along with a reduction in the cost of computing ground truth from a growing data-collection fleet, combined with experience in supporting "4D" neural network improvements for existing networks, e.g., predicting velocity with Tesla Vision.

I'm sure the AI team was constantly looking for ways to solve general objects in "4D".
That car was not part of the early rollout. I went through the Safety Score process back in late September and got my first beta around November. I'm getting it for a different reason, not because of a misconfiguration. Just because it is on my old MX doesn't mean it rolls out to other ones. I guess that is part of your point.

To be clear, and not to confuse others: your vehicle is part of the "early" rollout, including others who received FSD Beta without needing to be added via Safety Score. This relatively fixed group gets FSD Beta updates differently from those who get later iterations as part of what seems to be a random rollout, e.g., a 10% or 25% random selection.
But what you might be getting at is that there doesn't seem to be a technical restriction preventing legacy S/X from being randomly selected, since your vehicle runs 10.69 fine. So it seems like there might be a rollout (mis-?)configuration issue that hasn't allowed these vehicles to get 10.69.1.1.
said by someone that actually got it.

I suggest that 10.69.1.1 should not be released more widely. Today it missed two turns while on navigation, and it attempted to run two red lights. I tried rebooting after the missed turns, but the red-light issue is a major safety hazard. The car does make most turns correctly and it stops for red lights, so there is definitely a bug affecting my car. On previous FSD Beta versions it never tried to proceed through a stoplight that had been visibly red for an extended time.
So far ALL, not most. [cry me a river rant] I remember getting 1 or maybe 2 early updates quickly after the 10.3 debacle, but that was in November or early December. Since then I have been at the back of the pack on every update.

While this is a fine and testable theory, a far more likely theory is that, because 90% of FSD Beta users are the "last" to get the update, you're just in that 90% most of the time.
(moderator edit) TPU is a training chip, not inference. The Edge TPU (Coral) and the Pixel Neural Core are the inference chips. (moderator edit)

I find it weird you’d mention Flat Earthers, but one thing they don’t handle well is uncomfortable truths that challenge their dearest-held beliefs. Let’s see how you handle some uncomfortable facts that challenge your beliefs…
Google’s TPU is optimized to RUN neural net Tensor operations.
Tesla’s Dojo is made to train neural nets.
I explained this in the parts of my post you didn’t cherry-pick.
Apologies, I didn’t make the distinction explicit in every sentence I wrote describing it. It’s fairly well understood by those who understand what Dojo is. And honestly it takes a bit more than a bullet point to explain to lay people.
The difference between Dojo and TPU is ENORMOUS.
Training of the nets the way Dojo does requires an absolutely GOBSMACKING amount of dataflow. Many orders of magnitude more than required by the TPU processing silicon found embedded in SoCs from Google, Apple, et al., which they use to EXECUTE the neural nets to, say, recognize objects in an image or do speech-to-text, etc.
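The "orders of magnitude" claim can be sketched with back-of-the-envelope arithmetic. The constants below are the common rough heuristics (~2 FLOPs per parameter per sample for a forward pass, ~6 including backward pass and weight update); the parameter count, dataset size, and epoch count are illustrative assumptions, not Dojo or TPU specs.

```python
# Rough sketch of why a training run dwarfs a single inference.
# All numbers are illustrative, not real hardware or fleet figures.

def inference_flops(params: float, samples: float) -> float:
    # Forward pass only: ~2 FLOPs per parameter per sample.
    return 2 * params * samples

def training_flops(params: float, samples: float, epochs: int) -> float:
    # Forward + backward + weight update: ~6 FLOPs per parameter
    # per sample, repeated each epoch over the whole dataset.
    return 6 * params * samples * epochs

p = 1e9        # a hypothetical 1B-parameter net
one_frame = 1  # inference: score a single camera frame
dataset = 1e8  # training: a hypothetical 100M labeled clips
epochs = 10

ratio = training_flops(p, dataset, epochs) / inference_flops(p, one_frame)
print(f"{ratio:.0e}")  # ~3e9x more compute than one inference
```

The point of the sketch: embedded inference silicon handles one forward pass at a time, while a training cluster must stream the whole dataset through forward and backward passes repeatedly, which is where the enormous dataflow requirement comes from.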
Note: This is why I pointed out that Tesla will be uniquely positioned to offer NN training as a service (similar to Amazon’s varied cloud service offerings.)
Basically, Google’s TPU (and Apple’s ML enclave in its A and M series chips, etc.) is like the FSD computer in Teslas.
Of course we’ve seen a lot of ML processing silicon the past decade.
Dojo is a COMPLETELY different beast than we’ve yet seen in production (key distinction there).
So when I put a bullet point that Dojo is a key advantage for Tesla’s future, where exactly do you see a chip in production that competes?
Or were you just being pedantic about the level of explicitness in the phrasing of my point while missing (or ignoring) that point entirely?