
Prediction: Tesla Pulls Autosteer Function Soon

I agree. That's why I don't think it should be engaged in all situations, and Tesla knows best which situations those are.

We're chasing our tails here. I have the same textbook response to "Tesla knows better": then limit the speed on the car. Then install a breathalyzer. Then don't let the car drive without the seat belts being buckled. Then limit the acceleration. Then etc...

Big brother may know better, big brother may be watching, but if big brother starts to limit the ability to use the car, that'd lead to a lot of pissed-off customers.
 
People relinquish their driving to AutoPilot - it's Tesla that programs AP's behavior, and Tesla should restrict it where it does not perform well.
I would describe it differently.
Drivers voluntarily choose to use Autopilot, and when they turn the feature on in the center display controls, a clear message about its limitations is displayed, which they have to acknowledge, including the fact that they are responsible for driving the car and that their hands should be on the wheel.
Then they have to engage it (once AP determines it has enough sensor information to operate) and be ready to take over if it alerts them. They do not "relinquish" control of the car. The human driver is responsible!
How many times does all this have to be pointed out...
I predict this thread is going to look really silly in about a month or two :)
Autopilot will only get better. The Tesla Autopilot system is already better than the systems that have been in other cars for months or years without issues.
Well stated. I agree.
 
It's a 1.0 release. How many 1.0 releases of anything are mature, fully functional, error-free products? Let's have realistic expectations here.

This isn't the latest version of Angry Birds we're talking about. We're talking people's lives here. This is software that directly modifies the behavior of an automobile out on public thoroughfares.

If autopilot isn't safe, even in 1.0, it shouldn't be on the market until it is.
 
This isn't the latest version of Angry Birds we're talking about. We're talking people's lives here. This is software that directly modifies the behavior of an automobile out on public thoroughfares.

If autopilot isn't safe, even in 1.0, it shouldn't be on the market until it is.

When used properly, do you have any evidence that AP isn't safe?
 
Don't understand why there's so much complaining... I am completely satisfied with AP. Use it for what it's designed to do for now, on highways only, and it performs perfectly. I have used it for a few days with no issues whatsoever: from after the on-ramp to before the off-ramp, all good.

For people who complain about exit lanes on the right: just keep your hands on the wheel at exits, then... This is not meant to be autonomous driving yet, and that's CLEARLY communicated. It's really meant to help ease long-distance highway travel, and I think it does that perfectly. I would STRONGLY OPPOSE taking away this function, which fits my needs completely and without issue, just because some people refuse to follow clearly stated instructions.
 
When used properly, do you have any evidence that AP isn't safe?


So we agree then, you have no evidence.



To properly use the system, the following must be true (and it's obviously not true in the video):
1. Hands must be on the wheel.
2. It's for highway driving only.
3. The road must be divided.
4. The driver must pay attention.
5. I'm sure I missed a few key points, but the above four are good enough; a quick checklist sketch follows.
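
As code, that checklist boils down to a single AND. To be clear, this is a toy sketch: the class and field names are invented for illustration, not anything the car actually exposes.

```python
# Toy sketch of the checklist above. The class and field names are
# invented for illustration; they are not anything the car exposes.
from dataclasses import dataclass

@dataclass
class DrivingContext:
    hands_on_wheel: bool
    on_highway: bool
    road_divided: bool
    driver_attentive: bool

def proper_autosteer_use(ctx: DrivingContext) -> bool:
    # All four conditions must hold at the same time.
    return (ctx.hands_on_wheel
            and ctx.on_highway
            and ctx.road_divided
            and ctx.driver_attentive)

# The driver in the video: one hand on a camera, on an undivided road.
print(proper_autosteer_use(DrivingContext(
    hands_on_wheel=False, on_highway=False,
    road_divided=False, driver_attentive=False)))  # -> False
```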
 
I'd say that video shows something very different than a lot of other people think it shows.

It shows a driver who is operating a camera with one hand while operating a car with AutoPilot engaged.

It shows a driver operating AutoPilot on a road that is not of the type that Tesla has specified as what AutoPilot should be operated on.

It shows a driver who ignores the fact that the car veered into the other lane once just before this, meaning the car was clearly having trouble making out where to drive, a situation that calls for additional caution.

It shows the car encountering a situation where it cannot make out the lines and immediately handing control back to the user.

It shows the driver not immediately taking control as soon as the beeping and the notice to take control came up. In fact, they don't seem to react until the car starts veering, which is a large part of the reason this video looks so bad.

The car does veer towards the other lane right about where it is alongside the oncoming car. The driver doesn't manage to take control until just slightly past this, so the video actually shows how quickly someone can react. Again, this driver was distracted by operating a camera, which also left him with only one hand to control the vehicle.

Despite the poor choices of the driver, the car was still able to give the user sufficient notice to take control before it lost track of where the road was. In fact, the beeps start around one to two car lengths before it catches up with the oncoming car where it veered into the other lane.

My guess from watching this video many times is that as the car comes around the curve there is glare that obstructs the camera's ability to see the lines. Since it had just come around the curve, it hadn't had time to see the lines beforehand, so it has no idea where the road is.

The only thing Tesla can mitigate is the fact that the user was operating AutoPilot on a road they shouldn't have been. But I'm not even convinced of how easily they could do that. Geofencing the feature to specific road types would be very hard: it's very common for roads to run parallel and near to each other, and navigation systems that routinely try to match your position to a road sometimes get it wrong when roads are parallel. Creating situations where the feature disengages because it mistakenly believes the car is not on a highway is probably just as dangerous as the situation in that video.
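
To make the parallel-road problem concrete, here is a toy map-matching sketch. This is in no way Tesla's actual implementation; the road layout, the nearest-segment matching, and the GPS noise figure are all invented for illustration.

```python
# Toy illustration of why geofencing by road type is hard when a highway
# and a local road run parallel. Nothing here resembles Tesla's actual
# implementation; the roads, matching logic and noise figure are made up.
import math
import random

# Hypothetical road database: (name, road_type, endpoint A, endpoint B),
# with coordinates in meters. The frontage road is 15 m from the highway.
ROADS = [
    ("I-5",         "highway", (0.0, 0.0),  (1000.0, 0.0)),
    ("Frontage Rd", "local",   (0.0, 15.0), (1000.0, 15.0)),
]

def dist_to_segment(p, a, b):
    """Distance from point p to the line segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def match_road(gps_fix):
    """Naive map matching: snap the fix to the nearest road segment."""
    return min(ROADS, key=lambda r: dist_to_segment(gps_fix, r[2], r[3]))

random.seed(42)
true_position = (500.0, 0.0)  # the car is actually on the highway
wrong = 0
for _ in range(1000):
    # Simulate roughly 8 m of GPS noise per axis (an assumed figure).
    fix = (true_position[0] + random.gauss(0, 8),
           true_position[1] + random.gauss(0, 8))
    _, road_type, _, _ = match_road(fix)
    if road_type != "highway":
        wrong += 1  # a road-type geofence would wrongly disengage here
print(f"{wrong}/1000 fixes matched to the frontage road instead of the highway")
```

With the roads only 15 meters apart and ordinary GPS noise, a meaningful fraction of fixes snap to the wrong road, and a geofence keyed to that match would randomly disengage on a perfectly valid highway.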

If those situations occur just as often, all such a solution would do is shift the dangerous situations from ones like the video, where the user could have anticipated the problem due to the visibility issue, to ones the user can't really predict, because it's hard to tell if another road is close enough and the determination can sometimes flip from one road to another.

It's much easier to sit there and say what Tesla should do, based on some anecdotal videos with no data to back up that certain actions would improve the safety of the system, than it is to actually make the system safer.
 
I agree. That's why I don't think it should be engaged in all situations, and Tesla knows best which situations those are.

And Tesla does limit the situations it can engage. You just don't agree that they've correctly identified and limited those situations.

Only time will tell whether they're stringent enough on a global level, but there's no point beating the same points into the ground.
 
3. The road must be divided.

I see people keep saying this. There is nothing in what Tesla has put up that says the road must be divided. Many highways are not divided. What Tesla actually says is:

Autosteer works well on highways where there are clear lane markings or a car directly ahead to follow. It does not function reliably when a road has very sharp turns or when lane markings are absent, faded or ambiguous. Autosteer performance will deteriorate in rainy, snowy or foggy conditions.
 
I'd say that video shows something very different than a lot of other people think it shows. [...]

Doh!

You must spread some Reputation around before giving it to breser again.
 
I see people keep saying this. There is nothing in what Tesla has put up that says the road must be divided. Many highways are not divided. What Tesla actually says is:

It does say it. When you enable Autosteer under Driver Assistance, this window pops up:
[Attached screenshot: the Autosteer activation warning dialog]