LiveLong&Profit
Member
I have seen a lot of Chuck's videos trying to cross the divided(?) highway - he even had one with a drone watching.

As a follow-up to this: when Chuck tweeted that, he was talking specifically about a special case of his unprotected left turns, where the computer gets confused and ends up turning right instead of left. More FSD Beta 9.2 videos have come out from others that do show some successful unprotected left-hand turns. There are some mistakes here and there, but I believe that Beta 9.2 is overall a good improvement over 9.1. Here are some FSD Beta videos from today:
(I haven't seen the latest videos yet - sry)
The earlier ones were part disappointment and part ... relief!
Some of the attempts at crossing look quite dangerous - people are driving by *fast* in both directions.
I am not sure the self-driving software is confused - likely just cautious.
I had a weird sensation that the car was reacting like this: "You ordered me to cross, but my primary concern is to keep you safe - so I am going to ignore your self-destructive command and just go another way." The First Law of Robotics, basically.
It may turn out that the current progress in self-driving will have an underestimated and slightly weird side effect: making the roads safer for everyone - even when no autonomous cars are around - simply by forcing a re-evaluation of some sketchy traffic and road designs!
We know that people are not logical and are too often reckless. And we also know that public works are not always done with caution and careful planning.
It may turn out that self-driving cars become a silent but very effective critique of some ill-conceived traffic solutions, simply by refusing to risk the lives of Tesla passengers.
Of course, I will be very impressed if/when the vision-only system is good enough to do the crossing under a lot of traffic conditions. But I think there is a possibility it only 'dares' to do it in very light traffic. Let's see what happens.
Another way to handle this is with a hyper-parameter - a 'risk' setting - which the driver could, to some extent, control. Maybe this is what the focus on acceleration was always (also) for: the ability of a self-driving car to merge into traffic really fast.
A risk setting would have to be very carefully thought out and implemented extremely well to guard against overly reckless behaviour (and the subsequent PR and political backlash) - but it could perhaps solve the problem of being 'too nice' in fierce traffic.
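Just to make the idea concrete, here is a minimal sketch of what such a driver-adjustable risk setting might look like. None of these names, numbers, or thresholds come from Tesla - they are purely illustrative assumptions about how one hyper-parameter could trade required gap size for assertiveness when merging:

```python
# Hypothetical sketch of a driver-adjustable "risk" setting for merging.
# All names and thresholds here are assumptions for illustration only,
# not anything from Tesla's actual FSD software.

from dataclasses import dataclass

@dataclass
class MergePolicy:
    risk: float  # 0.0 = maximally cautious, 1.0 = maximally assertive

    def required_gap_s(self) -> float:
        """Minimum time gap (seconds) to oncoming traffic before merging.

        A cautious setting demands a large gap; a higher risk setting
        shrinks it, relying on harder acceleration to complete the merge.
        """
        max_gap, min_gap = 8.0, 3.0  # assumed bounds, in seconds
        return max_gap - self.risk * (max_gap - min_gap)

    def should_merge(self, gap_to_next_car_s: float) -> bool:
        return gap_to_next_car_s >= self.required_gap_s()


cautious = MergePolicy(risk=0.0)
assertive = MergePolicy(risk=1.0)

# With a 5-second gap, the cautious policy waits; the assertive one goes.
print(cautious.should_merge(5.0))   # False
print(assertive.should_merge(5.0))  # True
```

The hard part, of course, is not this lookup but validating that the most assertive setting is still safe enough to ship - which is exactly the PR/political problem mentioned above.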