Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autopilot Options

A car manufacturer restricting a feature by location isn't new but it is evil (there are a few cars that unlock performance when GPS detects it's on a known race track). I would not like the idea of Tesla controlling when or where I can do something as that restricts freedom.

Just answer one simple question... Which is safer: dumb cruise control or autopilot (a more advanced cruise control)?

Should all auto manufacturers make their cars unable to do regular cruise control on streets with cross traffic? What about in neighborhoods? Should the auto manufacturers make your car unable to go above 25 mph in a school zone? Should we live in a police state?

When automakers start controlling what you can or can't do then where does it stop?
I think the idea is simply to disallow use of features in circumstances or environments where those features are known to fail, or are limited to such an extent as to be unreliable.
 
I think the idea is simply to disallow use of features in circumstances or environments where those features are known to fail, or are limited to such an extent as to be unreliable.

Exactly, so you are saying all auto manufacturers should disable normal cruise control for all roads because:
  • It doesn't steer for you
  • It won't slow down automatically for slower traffic
  • It won't keep you in your lane
  • It won't brake automatically in case of emergency
Your idea is to disallow the use of that feature because it's limited and unreliable?!

(I understand you are talking about Tesla Autopilot, but we already have such "limited" or "unreliable" features now and nobody is in a mob with pitchforks calling for manufacturers to disallow it)
 
Exactly, so you are saying all auto manufacturers should disable normal cruise control for all roads because:
  • It doesn't steer for you
  • It won't slow down automatically for slower traffic
  • It won't keep you in your lane
  • It won't brake automatically in case of emergency
Your idea is to disallow the use of that feature because it's limited and unreliable?!

(I understand you are talking about Tesla Autopilot, but we already have such "limited" or "unreliable" features now and nobody is in a mob with pitchforks calling for manufacturers to disallow it)

Normal cruise control typically doesn't work below 25 mph, so you already have an instance of a manufacturer limiting a feature.
 
Normal cruise control typically doesn't work below 25 mph, so you already have an instance of a manufacturer limiting a feature.
I'd estimate that the majority of Americans don't drive on many public roads where the speed limit is less than 25 mph, so that limit really only applies to parking lots.

Point being, of course, regular cruise control is not currently limited on public roads nor should it be. The same is true for Tesla autopilot with the exception that it has limited functionality on roads without lane markings. People simply need to use proper judgement on when and where to use these technologies. Allow people to decide for themselves.
 
I'd estimate that the majority of Americans don't drive on many public roads where the speed limit is less than 25 mph, so that limit really only applies to parking lots.

Point being, of course, regular cruise control is not currently limited on public roads nor should it be. The same is true for Tesla autopilot with the exception that it has limited functionality on roads without lane markings. People simply need to use proper judgement on when and where to use these technologies. Allow people to decide for themselves.

Most neighborhoods and residential areas around me are 15-25 mph. Can't say any are 30...
 
This article discusses the problems with AP very well. I understand most people here hate SA, but please don't let that bias your opinion off the bat. My opinion is summarized very nicely in the following paragraph of the conclusion:

The problem put into evidence in this event is not a Tesla problem. Instead, it's a generic problem affecting cars deploying lane keeping and TACC together. The apparent lack of need to keep the attention on the road for long stretches of time will lead many drivers to zone out. They will be lulled into false safety. However, the rate at which these systems fail is several orders of magnitude greater than the rate at which humans fail, so each failure of these systems if met by an inattentive driver will be a high risk situation. It's likely that this human-machine interaction will lead to cars which are materially less safe than cars having TACC alone. Lane keeping (as presently structured) is thus probably a dangerous feature, be it in a Tesla or in any other vehicle.

http://seekingalpha.com/article/3986304-detailed-view-tesla-autopilot-fatality

And if you're up for more reading, 538 does a good job discussing the problem as well.

No Technology — Not Even Tesla’s Autopilot — Can Be Completely Safe
 
Mobileye, which supplies the hardware Tesla uses for AP, released this statement about the Florida accident. It says a new system that can detect crossing traffic won't be released until 2018. What does that timing mean for the M3, then?

Tesla Autopilot partner Mobileye comments on fatal crash, says tech isn’t meant to avoid this type of accident [Updated]

“We have read the account of what happened in this case. Today’s collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.”

Tesla says they can do some of that now.

Update: Tesla sent us the following statement in response to Mobileye’s statement:

“Tesla’s autopilot system was designed in-house and uses a fusion of dozens of internally- and externally-developed component technologies to determine the proper course of action in a given scenario. Since January 2016, Autopilot activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature. In the case of this accident, the high, white side of the box truck, combined with a radar signature that would have looked very similar to an overhead sign, caused automatic braking not to fire.”
 
...most people hate SA...

"However, the rate at which these systems fail is several orders of magnitude greater than the rate at which humans fail, so each failure of these systems if met by an inattentive driver will be a high risk situation."

I call BS on that article's statement that the rate at which these systems fail is "several orders of magnitude greater than the rate at which humans fail". I don't have Tesla autopilot on my Prius. Driving home today I probably inadvertently crossed or touched the white line five or six times minimum. Guess how many times that would have happened with autopilot? I've personally seen humans fail and crash into the side of each other several times. I also see accidents (typically rear end collisions) on the side of the road nearly every other day. I can guarantee autopilot fails less than humans at the tasks it's actually spec'd to do.
 
Mobileye, which supplies the hardware Tesla uses for AP, released this statement about the Florida accident. It says a new system that can detect crossing traffic won't be released until 2018. What does that timing mean for the M3, then?

Tesla Autopilot partner Mobileye comments on fatal crash, says tech isn’t meant to avoid this type of accident [Updated]

“We have read the account of what happened in this case. Today’s collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.”
We won't really know until at least reveal 2 and maybe even later than that. But it's certainly possible that from day one all M3's will have the hardware necessary for LTAP detection/avoidance but the software to make it functional might not be released until 2018.
 
We won't really know until at least reveal 2 and maybe even later than that. But it's certainly possible that from day one all M3's will have the hardware necessary for LTAP detection/avoidance but the software to make it functional might not be released until 2018.
This is actually something that scares me with the preorders. I'm afraid we will get obsolete hardware that gets upgraded 6 months later.

I'd love it if it came with all the necessary hardware for full autonomy, even if it's not enabled on day one. At minimum, it'd be nice if there were an easy upgrade path should better sensors become available.
 
Exactly, so you are saying all auto manufacturers should disable normal cruise control for all roads because:
  • It doesn't steer for you
  • It won't slow down automatically for slower traffic
  • It won't keep you in your lane
  • It won't brake automatically in case of emergency
Your idea is to disallow the use of that feature because it's limited and unreliable?!

(I understand you are talking about Tesla Autopilot, but we already have such "limited" or "unreliable" features now and nobody is in a mob with pitchforks calling for manufacturers to disallow it)
No, I wouldn't put it that way, your analogy is faulty, and you're creating a strawman.
 
I call BS on that article's statement that the rate at which these systems fail is "several orders of magnitude greater than the rate at which humans fail". I don't have Tesla autopilot on my Prius. Driving home today I probably inadvertently crossed or touched the white line five or six times minimum. Guess how many times that would have happened with autopilot? I've personally seen humans fail and crash into the side of each other several times. I also see accidents (typically rear end collisions) on the side of the road nearly every other day. I can guarantee autopilot fails less than humans at the tasks it's actually spec'd to do.


Several orders of magnitude?

LOL, that author is an idiot.
 
No, I wouldn't put it that way, your analogy is faulty, and you're creating a strawman.
It is only sort of a strawman argument but the analogy is perfectly sound.

Let's look at some references:
from wikipedia
Modern cruise control (also known as a speedostat or tempomat) was invented in 1948 by the inventor and mechanical engineer Ralph Teetor....
The first car with Teetor's system was the 1958 Imperial (called "Auto-pilot") using a speed dial on the dashboard.

Here's a WIRED article from 2011 documenting dangers of cruise control and adaptive cruise control
Too Much Safety Could Make Drivers Less Safe

Here's a Wall Street Journal post from 2013 on the same topic
http://blogs.wsj.com/corporate-intelligence/2013/08/21/cruise/

My point is, that all this is old news. Other manufacturers have not limited these features on certain roads vs others. I don't believe Tesla should limit it either. I believe we need to educate owners and trust in their own personal responsibility.

[Image: 1958 Chrysler Auto-Pilot brochure]
 
Apples and oranges. The car already automatically disengages AP when it detects limitations. Why on earth would you be upset by that?
You might have missed some of the previous posts. People are calling for Tesla to have the cars automatically disengage when not on an approved freeway, such as a divided highway or anything with cross traffic, pedestrians, rural roads, etc.
I say keep it as it is, so you are agreeing with me.
 
and if they did, but missed a particular school zone where someone was speeding through, you can already hear the "but Tesla missed restricting the speed at this school, it's not my fault" driver statements
Exactly! I dislike the mentality of the general public where we should blame the manufacturer when often it's the fault of the user. This applies to more than just cars.

Barring a defect, I think people should refrain from blaming manufacturers until there's full autonomy. It'll take a while before it's proven much better than a human in the majority of cases (obviously it'll never be 100% perfect).
 
and if they did, but missed a particular school zone where someone was speeding through, you can already hear the "but Tesla missed restricting the speed at this school, it's not my fault" driver statements

Well, in reality the "code" would be written for places where the pilot can be driven, not where it can't. I.e., whitelisting all roads tagged as interstate highways would be a good place to start, expanding from there.
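
The whitelist idea above can be sketched in a few lines. This is purely illustrative and not Tesla's actual logic: the road-class names are borrowed from OpenStreetMap's `highway` tags, and the function name and inputs are hypothetical.

```python
# Hypothetical sketch of the whitelist approach: Autopilot is allowed
# only on road classes known to be within spec (divided highways with
# no cross traffic), rather than trying to blacklist every road where
# it shouldn't run. Road-class strings mirror OpenStreetMap highway tags.

ALLOWED_ROAD_CLASSES = {"motorway", "motorway_link"}  # interstate-style roads

def autopilot_permitted(road_class: str, has_lane_markings: bool) -> bool:
    """Return True only when the current road is on the whitelist
    and lane markings are detected by the camera."""
    return road_class in ALLOWED_ROAD_CLASSES and has_lane_markings

print(autopilot_permitted("motorway", True))     # True: whitelisted, markings present
print(autopilot_permitted("residential", True))  # False: not whitelisted
print(autopilot_permitted("motorway", False))    # False: no lane markings
```

The design choice here is the one the post describes: a whitelist fails closed (unknown roads are disallowed by default), whereas a blacklist like "block school zones" fails open whenever the map data misses a zone.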