I know where v11 is going to fail me and the situations where I need to pay close attention. I wonder whether v12 is improved enough to create a false sense of security, so that when it does something unpredictable the driver is unprepared to react. Sounds like when v12 gets to me, I'll be on high alert constantly.

This was posted on FB; I don't see it anymore on X. Just that there was an incident.
I bet it tried to run down a pedestrian during sunset because the sky was orange.
With E2E totally possible. Some homicidal person could have hit a pedestrian at sunset and made it into the training data.
> Until V12 can reverse, it's not really going to be able to park… If this makes headlines, Tesla might check the logs

Perhaps the failed perpendicular parking attempt / crash (24th) did happen with active 12.2.1 (19th release)? I wonder if it was actually related to end-to-end wanting to reverse to adjust; post-processing heuristics might currently restrict 12.x to forward-only actions, so it might have gotten confused. If so, it would be pretty surprising that 12.x would even decide to creep into a parked car, which would be worthy of extra investigation.
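To illustrate the kind of restriction being speculated about here, a minimal Python sketch of a post-processing guard that vetoes reverse actions from a learned planner. This is purely illustrative of the hypothesis, not Tesla's actual code; every name (PlannerOutput, restrict_to_forward, the numbers) is invented.

```python
from dataclasses import dataclass

# Hypothetical sketch: an end-to-end planner proposes a maneuver, and a
# post-processing heuristic restricts it to forward-only actions.
# All names and values here are made up for illustration.

@dataclass
class PlannerOutput:
    gear: str          # "forward" or "reverse", as proposed by the network
    speed_mps: float   # requested speed along the planned path
    steer_rad: float   # requested steering angle

def restrict_to_forward(action: PlannerOutput) -> PlannerOutput:
    """Post-processing guard: if the network wants to reverse, veto it.

    The planner's intent (back up to adjust the parking angle) gets
    silently replaced with a slow forward creep, which is one way a car
    could end up inching into an obstacle it "meant" to back away from.
    """
    if action.gear == "reverse":
        return PlannerOutput(gear="forward", speed_mps=0.5,
                             steer_rad=action.steer_rad)
    return action

# The network asks to reverse out of a bad parking angle...
proposed = PlannerOutput(gear="reverse", speed_mps=1.0, steer_rad=0.3)
executed = restrict_to_forward(proposed)
print(executed)  # ...but the heuristic turns it into a forward creep
```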
> With E2E totally possible. Some homicidal person could have hit a pedestrian at sunset and made it into the training data.

Technically possible, but you say this like it's a plausible way V12 could pick up bad driving behaviors. That seems pretty unlikely; a lot of things would have to come together in an unlikely way.
> Teslascope implied that someone was misusing FSD in some manner, leading to a crash. The parking lot incident would not seem to have been a misuse incident - though certainly a failure to adequately monitor the car.

I'm going to have to retract this. Teslascope is now stating that the incident was the only one recorded. That would certainly point toward the parking lot crash.
Hopefully, we will get some additional details at some point. If it's an FSD weakness, it would be good to know about anything that needs extra care.
> I bet it tried to run down a pedestrian during sunset because the sky was orange.

Additional sensors would mitigate issues like that (lidar, radar, etc.). But I doubt they'll be adding a new sensor suite anytime in the next 20 years.
> Additional sensors would mitigate issues like that (lidar, radar, etc.). But I doubt they'll be adding a new sensor suite anytime in the next 20 years.

I doubt it with E2E. It's not a perception issue at all. For example, when it tries to run into pedestrians, you can see in the visualization that the perception engine in the car can clearly see the pedestrian (and identify them as one). It's the decision-making part that still decides to run into them.
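As a toy illustration of that perception-versus-planning split (not Tesla's stack; every name and number below is invented), a detector can correctly report the pedestrian while the downstream policy still emits a trajectory through them:

```python
from dataclasses import dataclass

# Toy illustration of a planning failure despite correct perception.
# All names, shapes, and numbers are invented for the example.

@dataclass
class Detection:
    label: str
    x_m: float  # position ahead of the car, metres
    y_m: float  # lateral offset, metres

def perception(frame) -> list[Detection]:
    # Stand-in for the perception stack: it DOES see the pedestrian.
    return [Detection(label="pedestrian", x_m=12.0, y_m=0.2)]

def planner(detections: list[Detection]) -> list[tuple[float, float]]:
    # Stand-in for a learned policy that ignores the detection and
    # plans straight ahead anyway: a decision-making failure.
    return [(float(x), 0.0) for x in range(0, 20, 2)]

def trajectory_hits(traj, det, radius_m=1.0) -> bool:
    return any(abs(x - det.x_m) < radius_m and abs(y - det.y_m) < radius_m
               for x, y in traj)

dets = perception(frame=None)
traj = planner(dets)
for det in dets:
    if trajectory_hits(traj, det):
        # Perception saw it; the plan still goes through it.
        print(f"Planned path intersects detected {det.label}!")
```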
> I doubt it with E2E. It's not a perception issue at all. For example, when it tries to run into pedestrians, you can see in the visualization that the perception engine in the car can clearly see the pedestrian (and identify them as one). It's the decision-making part that still decides to run into them.

I guess more training is needed then. I've never been a fan of the vision-only approach, but if they pull it off, props to them.
It's the same deal with L4 cars (which are loaded with lidar and radar sensors). They still run into buses and trucks even though the sensors can clearly detect them. That's an issue with the software, not the sensors.
> I guess more training is needed then. I've never been a fan of the vision-only approach, but if they pull it off, props to them.

The E2E approach is the issue here. E2E does not equal vision-only. For example, Wayve have introduced radar into their E2E solution.
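For what it's worth, "end-to-end" only describes training from raw sensor inputs to driving outputs; nothing stops the input from being multi-modal. A minimal sketch of an E2E policy consuming both camera and radar; the shapes and layer sizes are invented, and this is not Wayve's (or anyone's) actual architecture:

```python
import torch
from torch import nn

# Minimal sketch of a multi-modal end-to-end policy: camera and radar
# are both network inputs, so "end-to-end" does not imply "vision-only".
# Shapes and sizes are invented; this is not any vendor's architecture.

class MultiModalE2EPolicy(nn.Module):
    def __init__(self):
        super().__init__()
        self.cam_enc = nn.Sequential(
            nn.Flatten(), nn.Linear(3 * 32 * 32, 64), nn.ReLU())
        self.radar_enc = nn.Sequential(nn.Linear(16, 32), nn.ReLU())
        # Fused features map straight to controls: [steer, accel]
        self.head = nn.Linear(64 + 32, 2)

    def forward(self, camera: torch.Tensor,
                radar: torch.Tensor) -> torch.Tensor:
        fused = torch.cat(
            [self.cam_enc(camera), self.radar_enc(radar)], dim=-1)
        return self.head(fused)

policy = MultiModalE2EPolicy()
controls = policy(torch.rand(1, 3, 32, 32), torch.rand(1, 16))
print(controls.shape)  # torch.Size([1, 2]) -> one (steer, accel) pair
```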
The driver doesn't have a very good record either. As we know, it requires incredible inattention or negligence to get a strike, and he apparently had at least two!