Discussion in 'Model X' started by ChicagolandMX, Apr 1, 2018.
Yikes, but let’s change the title.
Also plenty of warnings where you should have taken control.
That's pretty much how I imagine the 101 accident happening.
That video is crazy because I would think AP would be programmed to avoid chevrons and not just continue driving.
Informative video. I would agree with Gator. That is a likely scenario.
But how was that video made? Was the person intentionally ignoring the warnings for the video, or was he distracted and it was caught on the dash cam?
My fear is they will increase the nanny frequency.
Didn’t the keep-your-hands-on-the-wheel warning come after the Joshua Brown truck incident?
This post is SO misinformed and such BS frankly. I can make a post about how easy it is to die in a Bentley, or Honda, or Chevy, and then allow the car to drive off a cliff with my hands off the wheel.
AutoPilot is NOT full autonomous driving. It is designed to have the driver REMAIN ATTENTIVE and keep their hands on the wheel at ALL times ready to take over in a split second if the car is headed somewhere wrong.
In this case, the (moron) driver was using his hands to hold the camera to make this stupid video, rather than pay attention to the road. He ignored multiple visual and audio alerts to put hands on the wheel and had to drop the camera when a crash almost happened.
This is an example of what NOT to do. It’s foolish and immature and threatens to hurt the entire move toward (eventual/safe) autonomous driving.
AP is safely used hundreds of thousands of times a day by people who read the manual and follow the warnings. All it takes, however, is for one daredevil like this who wants to score “likes” on YouTube to ruin it for all the rest of us!
STOP DOING THIS! If you’re going to make AutoPilot “show off” videos, do it with a Dash Cam or a dash-mounted GoPro or cell phone, NOT with a handheld camera. You’re inviting trouble!
I think we should all be required to do one of these before our cars will start. You know, to lend the neural net programmers/trainers a hand.
What I see:
Autopilot working as designed by following the most clearly marked white line on the highway. IT IS KNOWN that Autopilot as of right now does not read road markings other than lanes — no stop lines, no median markings, no differentiation between outside lanes, dashed lanes, solid lanes, double-solid lanes, or anything like that. It is also known that Autopilot as of right now has difficulty detecting stationary obstacles.
This is what happens when you don't pay attention. Obviously, this was done for demonstration purposes (with a terrible clickbait video title), but all the responsibility in this video rests with the driver — the car had been steering into the median zone for at least three full seconds before the driver took control.
Is this what happened in the tragic accident in California? Quite possibly, and it only further underscores that Autopilot is still not a fully autonomous system. You're advised to not take your hands off the wheel, and though I know most, if not all, of us do, you still have to maintain awareness of the road around you. You are responsible for letting Autopilot almost kill you.
That's a horrible set of lane markings. And yes, you can see in the IC that it's following the only lane line it had - which was intended as the outside line of the lane next to it. This is why you pay attention, especially near exit ramps.
The title really should be "How to Not Use AutoPilot".
You're not wrong of course. But this should probably be one of the next things that AP programmers take on. It is a known limitation of the system, but one which should be able to be overcome with the current hardware. At least in situations like the one above, where the car is driving over chevrons — which is never an acceptable lane.
I think the reason we're seeing such interest in this accident is that there is a difference between AP failing to prevent an accident — such as not seeing a fire truck after the vehicle ahead moved out of the way, or hitting a semi — versus AP contributing to the accident by putting the driver in a life-threatening situation.
Attached is a satellite photo of the 101 fatal crash site. In the lower right, just above the "B" in Bayshore, you can see where the collision occurred into the barrier at the edge of a wall. Just above it is the lane for the "flyover" ramp. Below it is the main highway left lane. You can see how the pair of solid lines marking a pavement-level "virtual island" could be mistaken for a traffic lane.
Suppose we change the semantics. Suppose we call it Lane Keeping Assist (LKAS) rather than Autopilot (let's set aside the Speed Assist aspect for now). With this semantic change it takes on a completely different and more realistic meaning. It also changes the expectation. Tesla is not the only automaker with LKAS, and any car with LKAS will drive into a gore-point barricade just as easily, because we know LKAS is not self-driving. It could happen with any LKAS-equipped manufacturer. Why is Tesla put on the defensive here, and is it natural to assume that Autopilot is self-steering? It is not; it's really just LKAS — a good one. The Nissan Leaf will have ProPilot LKAS. BMW/VW/MB have LKAS. Chevy trucks will have LKAS this year.
Cars come with Autopilot off by default, and it literally says "beta" right next to it; not sure why people assume it's a perfect system. Ultimately the driver needs to be awake and well aware, ready to take over in these situations.
I think it comes down to marketing. Most other car companies don't market their lane keep assist as a way to help with driving. Tesla started with that, including the video below with Elon. I know this is not how the system behaves in real life, but the initial marketing pitch of Autopilot really started this misperception of Autopilot's capabilities.
BTW, note the discussion at the end on how the car stopped for a stationary car. That's another marketing feature that has not become reality.
Or he can just look up for a split sec....
Except the screen flash/pulse warnings are just about having your hands on the wheel, right?
I.e., it's not the car warning that it doesn't know where the lane is and that the driver needs to take over, correct?
So I don't see how these warnings are relevant to the topic. It wouldn't change anything for a distracted driver who doesn't notice the car veering off course.
Whoever designed and painted that road should be arrested.
Every Autopilot dies.... not every Autopilot truly lives.
True but moot point. Currently in AP, the driver should not be distracted.