diplomat33
Average guy who loves autonomous vehicles
And now Tesla has this “edge case” to program into the neural net so it should never happen again.
In the strictest sense, yes, it is an edge case that FSD will need to be able to handle. But I would never be so callous. A person's death is never just another "edge case" to solve.
I do suspect that one big reason why Tesla is so committed to reaching L5 autonomy as quickly as possible is that FSD is the only way to truly solve these edge cases on the current hardware. The fact is that the current AP, with its limited use of cameras and limited NN, is never going to be able to handle these sorts of cases. The only way to guarantee that you have solved a scenario like this is with all cameras active, including the front side cameras, and the full NN that can detect and predict all vehicles and objects. Doing so requires that Tesla finish its work on FSD. The sooner Tesla finishes FSD, the sooner it can make the cars much safer and prevent these accidents from ever happening again.
I had the "Full Self Driving" trial for the last few weeks, and I don't trust it above 25 mph or outside of stop-and-go traffic. On a straight two-lane undivided highway I had it engaged at ~55 mph when we came up on a tractor that was half on the road and half on the shoulder. It didn't see the tractor, and I had to take over at the last moment, putting two wheels slightly over the yellow line. That's when it finally freaked out about the oncoming car (which was accommodating me by moving over) and auto-braked. What it should have done is slow down behind the tractor. I was never in any danger, because I was ready to take over and wanted to see how it handled the situation, but the answer is that it failed. Stop calling it FSD; it's adaptive cruise control with lane keep and some gimmicks that work less well than just doing it yourself.
Autopilot is not designed to handle that situation yet. Once Tesla releases the full FSD and declares Autopilot to be L5, then yes, it will be able to handle that situation.