willow_hiller
Well-Known Member
Looks like we might actually be able to collect automated disengagement data for drivers with Tesla Insurance:
> Is this from the old 80s TV movie turned mini series called "V"?

That's nothing.
Elon can enlist Starship Super Heavy to achieve DOJO overwatch!
View attachment 900158
Good eye!

> Is this from the old 80s TV movie turned mini series called "V"?
My car also stops short of neighborhood stop signs. Sometimes it creeps slowly up to the sign; at other times it proceeds from the short-stop position. Fortunately, in my case, there are no occlusions at these intersections. However, your comment about the car entering the intersection without visibility of the oncoming traffic lanes makes me wonder how a Tesla resolves the difference between not seeing cross traffic and seeing that there is no cross traffic. These are two very different things!

> I'm noticing a Jekyll/Hyde side of 25.2 when exiting my neighborhood. Sometimes FSDb slowly creeps to and through the stop sign, and other times it launches blind from well short of the stop sign. In the latter case the right-side B pillar has no chance of seeing oncoming traffic given a cement wall's placement, so FSDb throws the dice, applies excess acceleration, and blasts through the stop sign from a starting position well short of normal. I'll have to look at the latter case more closely to see if the normal creep wall is displayed, but either way it doesn't seem to come into play.
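This is nothing from Tesla's actual stack, just a minimal Python sketch of the distinction being drawn above: a planner has to treat "lane observed and empty" differently from "lane not observable", and only the former should permit entering the intersection. All names here (CrossTraffic, may_proceed) are made up for illustration.

```python
from enum import Enum


class CrossTraffic(Enum):
    CLEAR = "clear"        # lane observed, no vehicles present
    OCCUPIED = "occupied"  # lane observed, vehicle approaching
    OCCLUDED = "occluded"  # lane not observable (e.g. a wall blocks the B-pillar camera)


def may_proceed(observation: CrossTraffic) -> bool:
    # Only an affirmative "observed and empty" permits entering the intersection.
    # "Couldn't see" must be treated like "occupied": keep creeping, don't go.
    return observation is CrossTraffic.CLEAR


# An occluded view is not evidence of an empty lane:
assert may_proceed(CrossTraffic.CLEAR)
assert not may_proceed(CrossTraffic.OCCLUDED)
```

The point of the tri-state is that a boolean "vehicle detected: yes/no" collapses OCCLUDED into CLEAR, which is exactly the "launches blind" behavior described in the quoted post.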
I encourage you to recalibrate the cameras and to confirm that GPS is pin-point accurate. Zoom in on the map and make sure the car is exactly where it should be on the map.

> Oofa. Just installed 25.2 and it's way worse than the previous version. Just tried it on a local 2-lane, 35 mph road, and it hugs the double yellow line so much that it doesn't feel safe. Then it was halting and accelerating. I suppose I can reboot, maybe recalibrate the cameras? But this is... not usable for me on local roads.
Haven't been on the highway yet; I've always thought Autopilot was more reliable. Hope some of the weird changes don't migrate.
Stopping back in to say that it is definitely better after recalibrating and trying again. It was a different road, though; for whatever reason, FSD always had difficulty on that particular road. It has some hills and spots where it's half paved (right down the middle), definitely a challenge for any vision-based system.

> I encourage you to recalibrate the cameras and to confirm that GPS is pin-point accurate. Zoom in on the map and make sure the car is exactly where it should be on the map.
It also is not programmed to slow in school zones, and it does not recognize railroad signals. Tesla really needs to warn people before someone who really believes this thing is Full Self Driving kills themselves.

> I'm on 10.69.25.2 and the software didn't want to stop for a train with lights and crossing arms on both sides. Anyone having the same problem? The attached picture was taken after I applied the brakes.
Nobody believes the car is fully autonomous more than 30 seconds after the first time they engage FSDb.

> It also is not programmed to slow in school zones, and it does not recognize railroad signals. Tesla really needs to warn people before someone who really believes this thing is Full Self Driving kills themselves.
I think signing an agreement before use, acknowledging that this is not true full self-driving and that it may do any old crazy thing imaginable at any time, should be a bit of a warning, as it were. Of course, if we feel the need to inform people to open the pizza box before eating the pizza, there might be a problem. Those pizza people are out there driving. Probably fewer of them driving Teslas, I bet.

> It also is not programmed to slow in school zones, and it does not recognize railroad signals. Tesla really needs to warn people before someone who really believes this thing is Full Self Driving kills themselves.
I agree, but how do you explain GBills expecting it to stop at a railroad crossing? I'll bet he read the warning but still expected it to stop at a train crossing, and he/she probably would expect it to stop for a school bus with red flashing lights as well. Tesla needs to put out a list of what FSD will and will not do, and when it will do what it should do.

> Nobody believes the car is fully autonomous more than 30 seconds after the first time they engage FSDb.
Besides, Tesla puts up a large warning panel when you first enable FSDb telling you how it might do the worst thing at the worst time, or words to that effect.
They did. And that list says: who the hell knows, we're working on it. Point is, don't ever expect it to do anything you can't fix.

> Tesla needs to put out a list of what FSD will and will not do, and when it will do what it should do.
On one hand, it would be nice if they had a list of specific shortcomings (won't stop for trains, won't stop for school buses or slow in school zones, etc.); on the other hand, creating such a list runs the risk of people assuming it can deal with anything not on the list.

> They did. And that list says: who the hell knows, we're working on it. Point is, don't ever expect it to do anything you can't fix.
Maybe I was not clear enough about what I was attempting to say, so let me rephrase. Tesla knows what FSD has been programmed to do and what it has not been programmed to do. If GBills had known that the car would not stop at a railroad crossing, he would not have been surprised when it did not do what it was not programmed to do. The same holds true for a stopped school bus: he/she may expect it to stop, but it's not programmed to. It is possible to make this experiment safer and less stressful if people are better informed.

> They did. And that list says: who the hell knows, we're working on it. Point is, don't ever expect it to do anything you can't fix.
I tell people that with each release, FSD Beta comes up with new ways to kill me.

> Why is the default posture for people not to assume the car will try to kill them? I don't know why one would assume differently. No informing necessary, really; it says right there that it will do the wrong thing at the worst time. What else could that mean?