Kablooie (Member):
> So has V12 been disabled for all who have it?

No. It was just a short-term glitch in my particular car. I drove a few miles manually, parked for about half an hour, and FSD was back when I started up again.
> Is there any news on the investigation of the accident that's delaying the V12 deployment?

FSD beta update information has never been so quiet... What's going on with this investigation, and is there any news or updates?
> FSD beta update information has never been so quiet... What's going on with this investigation, and is there any news or updates?

Even if it was the driver's fault, they still need to figure out why the car allowed it to happen, given their end-game target. The last thing they want to do is release this wide and have hundreds of "may be the driver's fault" incidents to investigate.
My hope is that it was the driver's fault and FSD was not engaged... AND that Tesla is working on training V12 for a broader deployment before April.
Fingers crossed!
> FSD beta update information has never been so quiet... What's going on with this investigation, and is there any news or updates?

My guess as to why it's taking so long to release a fix for the FSD accident (based on my rudimentary layman's knowledge): First they have to look at the data and logs and run some simulations to determine why the computer made the error. Then they have to find hundreds or thousands of similar situations in their video database and retrain the system with them. This is time-consuming, but it will be faster in the future as more compute power is brought online. Then they probably test it in the simulator, and if it looks good, it goes out to internal testers. Once that's working, it is released to the select proletariats and someday to the great society.
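The speculated retrain-and-release loop could be sketched roughly as follows. This is a purely hypothetical illustration; every function name here is invented and does not reflect Tesla's actual tooling.

```python
# Hypothetical sketch of the speculated fix pipeline:
# mine similar clips -> retrain -> simulate -> staged release.
# All names and data structures are illustrative assumptions.

def mine_similar_clips(database, incident_signature, limit=5000):
    """Find clips in a fleet video database resembling the incident."""
    return [c for c in database if c["signature"] == incident_signature][:limit]

def retrain(model, clips):
    """Stand-in for a training run on the mined examples."""
    return {**model, "trained_on": model.get("trained_on", 0) + len(clips)}

def passes_simulation(model):
    """Stand-in for closed-loop simulator checks."""
    return model["trained_on"] > 0

def release_pipeline(model, database, incident_signature):
    clips = mine_similar_clips(database, incident_signature)
    model = retrain(model, clips)
    if not passes_simulation(model):
        return "back to data mining"
    return "internal testers -> limited rollout -> wide release"

database = [{"signature": "unprotected-left"}, {"signature": "median-split"}]
print(release_pipeline({}, database, "median-split"))
```

The point of the sketch is the ordering: data mining and retraining happen before any simulator gating, which is why a single incident can stall a release for weeks.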
> Hasn't been disabled for me. Also attached is a screenshot showing that double-pull autopilot activation is disabled for v12 FSD.

So TACC (and Double Pull) is still in the v12 Autopilot menu and ISN'T grayed out?? What happens if you select it?
v12.2.1 is not reliable in parking lots.
Yesterday my car attempted to switch into an outgoing lane painted with a big, long white arrow pointing in the opposite direction while it was entering the parking lot.
Many times v12.2.1 moved into any open lane in the parking lot, regardless of whether it was for entering or exiting.
> What happens if a human makes a mistake and submits a bad behavior into the training system as a good behavior?

Tesla has had quite a few human labelers who previously needed to annotate vehicles, objects, etc., and who also wanted high quality for perception. This has transitioned to auto-labeling, but there were likely checks to ensure correctness, such as multiple people labeling the same clip, as well as the automation that became auto-labeling. The training data is also curated over time, so during training, the examples the network has trouble learning could indicate which ones are problematic, either because the network is unable to learn that pattern or because the example was actually labeled incorrectly. Presumably each label is associated with its labeler, to allow for improvements to the labeling system, whether it was human- or computer-generated.
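The curation idea described above, flagging examples the network persistently fails to learn as possible bad labels, can be shown with a toy sketch. The loss values and threshold here are invented for illustration.

```python
# Toy sketch of loss-based label curation: examples whose training loss
# stays high across epochs are either genuinely hard or mislabeled, and
# get routed back for human review. All numbers are invented.

def flag_suspect_labels(loss_history, threshold=0.8, min_epochs=3):
    """Return ids of examples whose loss stayed above `threshold`
    for the last `min_epochs` epochs."""
    suspects = []
    for example_id, losses in loss_history.items():
        recent = losses[-min_epochs:]
        if len(recent) == min_epochs and all(l > threshold for l in recent):
            suspects.append(example_id)
    return suspects

loss_history = {
    "clip_001": [2.1, 1.0, 0.3, 0.1],   # learned fine
    "clip_002": [2.3, 1.9, 1.7, 1.6],   # never converges: suspect label
    "clip_003": [1.5, 0.9, 0.9, 0.85],  # borderline but still above threshold
}
print(flag_suspect_labels(loss_history))  # ['clip_002', 'clip_003']
```

A mislabeled "good behavior" submitted by a human would tend to show up exactly this way: the network keeps disagreeing with the label, so the loss never comes down.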
> The problem with disengagements is, by definition, that something went wrong causing the disengagement, so they would be ruled out for training data.

Maybe I'm using FSD Beta differently from you, but I was paying extra attention to my 11.x disengagements yesterday, and quite a few of them were preemptive: either anticipating that 11.x would have trouble, such as completing multiple lane changes smoothly, or preventing 11.x from making an unnecessary lane change by keeping my hands on the wheel so it couldn't turn. Video clips sent back covering a few seconds around a disengagement would capture the good example of what FSD Beta should have done. Looking at the last 30 days of uploads to Tesla, it's almost 1TB, so maybe it is from continued 11.x usage / disengagements / voice drive-notes, to continue collecting examples for 12.x training?
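The clip-around-a-disengagement idea above can be sketched as a rolling buffer: keep the last few frames at all times, and when the driver disengages, save a window spanning before and after the event so the human's correction is captured. This is an invented illustration, not Tesla's actual recorder.

```python
from collections import deque

# Hypothetical sketch: a rolling pre-buffer plus post-event capture,
# so each disengagement yields a clip of what led up to it and what
# the human driver did instead. Frame counts are illustrative.

class DisengagementRecorder:
    def __init__(self, pre_frames=3, post_frames=3):
        self.pre = deque(maxlen=pre_frames)  # rolling history
        self.post_frames = post_frames
        self.snapshot = []    # frames frozen at the event
        self.active = None    # frames collected after the event
        self.clips = []       # finished clips, candidates for upload

    def on_frame(self, frame):
        if self.active is not None:
            self.active.append(frame)
            if len(self.active) == self.post_frames:
                self.clips.append(self.snapshot + self.active)
                self.active = None
        self.pre.append(frame)

    def on_disengagement(self):
        self.snapshot = list(self.pre)
        self.active = []

rec = DisengagementRecorder()
for t in range(10):          # frames are just timestamps here
    if t == 5:
        rec.on_disengagement()
    rec.on_frame(t)
print(rec.clips)  # [[2, 3, 4, 5, 6, 7]]
```

Note the clip includes frames from before the event, which is what would let a preemptive disengagement serve as a positive example rather than being discarded.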
Uninformed interpretation: the car may have thought there was a crossbar, with two potential positions:
[attachments 1025519 and 1025521]
> It looks like medians are causing some trouble. We typically drive on the right side of medians. So without proper context, why not turn right there?

I see what you're saying, but wasn't the vehicle making a left, so the median wasn't in its path?
> Another instance of median-induced confusion?

Beautiful example of decision wobble here; I've seen this a couple of times. The left path seems like a toss-up between being one-way and being two cars wide.
[attachment 1025527]
> Beautiful example of decision wobble here, I've seen this a couple times:

Looks to me like the car went to the right to give way to the car coming from the opposite direction.
> v12.2.1 is not reliable in parking lots. [...]

Honestly, I was never expecting parking lots from FSD anyway. It'd be nice, but parking lots are probably some of the most difficult driving you can do, not to mention all the other issues (no mapping, knowing where the door is, deciding which door to park by, etc.).
> My guess as to why it's taking so long to release a fix for the FSD accident [...] This is more time consuming than writing some new code or adjusting some variables as was done in the past.

Something else I'd like to know is how the object-perception network is coded and integrated into the system as a whole. If object detection is not part of the end-to-end AI, but is instead performed separately and then fed to the AI code, the accident may well have been an issue with 'Tesla Vision.' Garbage in, garbage out, right?
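The modular-versus-end-to-end concern raised above can be illustrated with a toy sketch: if a separate perception stage drops an object, even a flawless downstream planner inherits the error. The scene, object names, and both functions are invented for illustration.

```python
# "Garbage in, garbage out" in a hypothetical modular stack: the planner
# trusts perception's output completely, so a perception miss propagates.

def perception(raw_scene):
    """Stand-in for a separate object-detection stage. Here it
    (incorrectly) drops the pedestrian -- the 'garbage'."""
    return [obj for obj in raw_scene if obj != "pedestrian"]

def planner(detected_objects):
    """A perfect planner still fails if perception dropped an object."""
    return "brake" if "pedestrian" in detected_objects else "proceed"

raw_scene = ["lane_line", "pedestrian", "parked_car"]
print(planner(perception(raw_scene)))  # 'proceed' -- unsafe, caused upstream
```

In a fully end-to-end system there is no such hand-off boundary to inspect, which is exactly why knowing where that boundary sits (if it exists) matters for diagnosing the accident.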