I think we should all be required to do one of these before our cars will start. You know, to lend the neural net programmers/trainers a hand. That's pretty much how I imagine the 101 accident happening.
That video is crazy because I would think AP would be programmed to avoid chevrons and not just continue driving.
You're not wrong, of course. But this should probably be one of the next things the AP programmers take on. It's a known limitation of the system, but one that should be possible to overcome with the current hardware, at least in situations like the one above, where the car drives over chevrons, which is never an acceptable lane. What I see:
Autopilot working as designed by following the most clearly marked white line on the highway. IT IS KNOWN that Autopilot as of right now does not read road markings other than lanes — no stop lines, no median markings, no differentiation between outside lines, dashed lines, solid lines, double-solid lines, or anything like that. It is also known that Autopilot as of right now has difficulty detecting stationary obstacles.
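To make the failure mode concrete, here is a minimal sketch, entirely hypothetical and in no way Tesla's actual code, of a naive lane-keeping heuristic that tracks whichever painted line has the strongest contrast while ignoring the line's *type*. All function and variable names here are invented for illustration.

```python
# Hypothetical lane-keeping sketch. Each detection is a tuple:
# (lateral_offset_m, contrast_score, kind), where 'kind' is the
# marking type ("dashed", "solid", "gore", ...).

def pick_tracked_line(detections):
    """Return the detection with the highest contrast score.

    The bug-by-design this thread describes: 'kind' is never consulted,
    so fresh paint at a gore point can outrank a faded lane line.
    """
    return max(detections, key=lambda d: d[1])

def steering_offset(detections, lane_width_m=3.7):
    """Steer toward a point half a lane width inside the tracked line."""
    offset, _, _ = pick_tracked_line(detections)
    return offset - lane_width_m / 2

# A worn lane marking next to freshly painted gore-point lines:
detections = [
    (-1.8, 0.35, "dashed"),   # faded lane line
    (0.4, 0.90, "gore"),      # bright paint at the gore point
]
print(pick_tracked_line(detections)[2])  # → gore
```

The point of the sketch is only that a system ranking lines by visibility, not by meaning, will happily follow the brightest line straight into a gore area.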
This is what happens when you don't pay attention. Obviously, this was done for demonstration purposes (with a terrible clickbait video title), but all the responsibility in this video rests with the driver — the car had steered into the median zone for at least three full seconds before the driver took control.
Is this what happened in the tragic accident in California? Quite possibly, and it only further underscores that Autopilot is still not a fully autonomous system. You're advised to not take your hands off the wheel, and though I know most, if not all, of us do, you still have to maintain awareness of the road around you. You are responsible for letting Autopilot almost kill you.
Suppose we change the semantics. Suppose we call it Lane Keeping Assist (LKAS) rather than Autopilot (setting aside the Speed Assist aspect for now). With that change, the name takes on a completely different and more realistic meaning, and it changes the expectation. Tesla is not the only automaker with LKAS, and any car with LKAS will drive into a gore point barricade just as easily, because we know LKAS is not self-driving. It could happen with any LKAS-equipped manufacturer. So why is Tesla put on the defensive here, and is it natural to assume that Autopilot is self-steering? It is not; it's really just LKAS, a good one. The Nissan Leaf will have ProPilot LKAS. BMW/VW/MB have LKAS. Chevy trucks will get LKAS this year.
Attached is a satellite photo of the 101 fatal crash site. In the lower right, just above the "B" in Bayshore, you can see where the vehicle collided with the barrier at the edge of a wall. Just above it is the lane for the "flyover" ramp. Below it is the main highway left lane. You can see how the pair of solid lines marking a pavement-level "virtual island" could be mistaken for a traffic lane.
View attachment 291110
Also plenty of warnings where you should have taken control.
Except the screen flash/pulse warnings are just about having your hands on the wheel, right?
I.e., it's not the car warning that it doesn't know where the lane is and that the driver needs to take over, correct?
So I don't see how these warnings are relevant to the topic. They wouldn't change anything for a distracted driver who doesn't notice the car veering off course.