

Ok everyone. Update your account address to Vegas.
 
This was posted on FB; I don't see it anymore on X. Just that there was an incident.
I know where v11 is going to fail me and the situations where I need to pay close attention. I wonder if v12 is improved enough to create a false sense of security, so that when it does something unpredictable, drivers are unprepared to react. Sounds like when v12 gets to me, I'll be on high alert constantly.
 
Until V12 can reverse, it's not really going to be able to park… If this makes headlines, Tesla might check the logs
Perhaps the failed perpendicular parking attempt / crash (24th) did happen with active 12.2.1 (19th release)? I wonder if it was actually related to end-to-end wanting to reverse to adjust, while post-processing heuristics might currently restrict 12.x to forward actions, so it got confused? If so, it's pretty surprising that 12.x would even decide to creep into a parked car; that seems worthy of extra investigation.
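
Purely as an illustration of that speculation (nothing here reflects Tesla's actual software; every name is invented), a forward-only restriction layered on top of an end-to-end planner's output could look something like this:

```python
# Hypothetical sketch only: a post-processing rule that clamps an
# end-to-end planner's proposed action to forward-only motion.
# None of these names reflect Tesla's real code.
from dataclasses import dataclass

@dataclass
class PlannedAction:
    speed_mps: float      # negative would mean reversing
    steering_rad: float

def restrict_to_forward(action: PlannedAction) -> PlannedAction:
    """If the network asks to reverse, clamp speed to zero instead."""
    if action.speed_mps < 0.0:
        # The planner "wanted" to back up to adjust, but the heuristic
        # silently discards that intent -- the car just sits or creeps.
        return PlannedAction(speed_mps=0.0, steering_rad=action.steering_rad)
    return action

# e.g. the network proposes backing out at 0.5 m/s to realign:
print(restrict_to_forward(PlannedAction(speed_mps=-0.5, steering_rad=0.1)))
# -> PlannedAction(speed_mps=0.0, steering_rad=0.1)
```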
 
Teslascope implied that someone was misusing FSD in some manner, leading to a crash. The parking lot incident would not seem to have been a misuse incident - though certainly a failure to adequately monitor the car.

Hopefully, we will get some additional details at some point. If it's an FSD weakness, it would be good to know of something that needs extra care.
 
With E2E, totally possible. Some homicidal person could have hit a pedestrian at sunset and had the clip make it into the training data.
Technically possible, but you say this like it's a plausible way V12 could pick up bad driving behaviors. That seems pretty unlikely, and a lot of things would have to come together in an unlikely way:

- Tesla would need to be sourcing training clips from the public for complex driving scenarios without review (as opposed to sourcing V12 training material purely from employees and synthetic data). Or an employee introduced that clip.
- Tesla's curation of training data missed an extremely obvious poor behavior (missed both as part of curation on Tesla's end and when the collision clip was uploaded from the car)
- The clip of a pedestrian collision would need to have an outsized impact on system behavior (it's training on millions of clips! It has ample training material demonstrating avoiding pedestrians. I would expect a single clip of bad behavior to essentially be overridden by the rest of its training; see the toy sketch after this list.)
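
As a back-of-the-envelope illustration of that last point (the dataset size is an assumption, not a known figure), here is how little one anomalous clip contributes to the average training signal under standard imitation-style training:

```python
# Toy illustration (not Tesla's pipeline): one anomalous clip contributes
# roughly 1/N of the average loss gradient per epoch, so the millions of
# "good" clips dominate the learned behavior.
n_clips = 5_000_000          # assumed size of a curated training set
bad_clips = 1                # the hypothetical collision clip
weight_of_bad_data = bad_clips / n_clips
print(f"Share of the average training signal: {weight_of_bad_data:.7%}")
# -> 0.0000200% -- effectively drowned out unless it is heavily oversampled
```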
 
Teslascope implied that someone was misusing FSD in some manner, leading to a crash. The parking lot incident would not seem to have been a misuse incident - though certainly a failure to adequately monitor the car.

Hopefully, we will get some additional details at some point. If it's an FSD weakness, it would be good to know of something that needs extra care.
I'm going to have to retract this. Teslascope is now stating that the incident was the only one recorded. That would certainly point toward the parking lot crash.

 
Never mind. Just found it. Also read an interesting comment on the video.
Of ALL the versions to release first to the small OG tester group (chuck, dirty Tesla, AI driver, etc.), this was the version to do so. This is by far the most significant change in the build, and they (seemingly randomly, but there must have been some reason) decided to release it to a random group of people. Why? They had a system that worked well. This is without a doubt an idiotic decision that has not paid off well. A classic case of “if it ain’t broke, don’t fix it.” Whoever was driving that car had PLENTY of time to stop it, but they were probably not trained enough with the beta to know what to do. 100% preventable if they had released it to the OG crew. Really a strange decision overall.
 
Additional sensors would mitigate issues like that (lidar, radar, etc). But I doubt they'll be adding a new sensor suite anytime in the next 20 years.
I doubt it with E2E. It's not a perception issue at all. For example, when it tries to run into pedestrians, you can see in the visualization that the perception engine in the car clearly sees the pedestrian (and identifies them as one). It's the decision-making part that still decides to run into them.

It's the same deal with L4 cars (which are loaded with lidar and radar sensors). They still run into buses and trucks even though the sensors can clearly detect them. That's an issue with the software, not the sensors.
 
I doubt it with E2E. It's not a perception issue at all. For example, when it tries to run into pedestrians, you can see in the visualization that the perception engine in the car clearly sees the pedestrian (and identifies them as one). It's the decision-making part that still decides to run into them.

It's the same deal with L4 cars (which are loaded with lidar and radar sensors). They still run into buses and trucks even though the sensors can clearly detect them. That's an issue with the software, not the sensors.
I guess more training is needed then. I've never been a fan of the vision-only approach but if they pull it off props to them.
 
I guess more training is needed then. I've never been a fan of the vision-only approach but if they pull it off props to them.
The E2E approach is the issue here. E2E does not equal vision-only. For example, Wayve have introduced radar into their E2E solution.
Introducing radar: Wayve's sensor stack explained

However, the problem with E2E is that it's hard to know why the software is doing something. The advantage is that you no longer have hard-coded functions, so potentially it can improve with training (instead of relying on engineers to write a new function to handle specific cases). But that is also a problem, because you no longer have clear "rails" in terms of specifically excluding certain behavior.
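
A minimal sketch of that "rails" point, with invented names and not any vendor's real code: a modular stack can bolt an explicit rule onto whatever the learned planner proposes, while an end-to-end network leaves no separate stage where such a rule can be enforced, so undesired behavior can only be discouraged indirectly through training data.

```python
# Hypothetical contrast between a modular planner with a hard-coded rail
# and an end-to-end policy with no equivalent hook.

def modular_plan(perception, learned_planner):
    action = learned_planner(perception)
    # Hard-coded rail: veto a forbidden behavior no matter what the
    # learned component proposed.
    if perception.get("pedestrian_ahead") and action.get("accelerate"):
        action = {"accelerate": False, "brake": True}
    return action

def end_to_end_plan(camera_frames, policy_network):
    # Whatever the network emits is the plan; there is no separate stage
    # where a rule like the one above can be written in.
    return policy_network(camera_frames)

# Stub components just to show the rail firing:
stub_planner = lambda p: {"accelerate": True, "brake": False}
print(modular_plan({"pedestrian_ahead": True}, stub_planner))
# -> {'accelerate': False, 'brake': True}
```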
 
Driver doesn’t have a very good record either. As we know it requires incredible inattention or negligence to get a strike, and he apparently had at least two!

I hope you're being sarcastic there. If you have no interior camera, it's incredibly easy to get a strike. Just drive through about two miles of inactive construction zones with both hands on the wheel, paying attention to the road like you're supposed to instead of watching the flashing on your dashboard, so you don't notice it until it starts beeping at you. Then do that two or three more times close enough together before leaving the construction zone. Boom. Instant strike.

That's not to say that this particular person wasn't incredibly inattentive or negligent, though. 😁