They are NOT messing around when they say AutoPilot is Highway only. (Be CAREFUL!)

No. Absolutely not. No way. No. Did I mention no?

I didn't buy a nanny-state car. If I want to drive it in the city, I should be able to drive it in the city, knowing its limitations.

NO!!!!

I'm not sure I understand what you mean, but I think I do. You think you 'should' be able to use AP in the city. If so, why? You made an agreement not to use it in the city when you clicked through the notice choosing to join the beta. Therefore, you 'should not' be able to use it in the city. If you use it that way anyway, you do so at your own risk, but unfortunately at risk to others too.

This sounds self-righteous, but that's not intended. I used it against the rules too for the first day, and now I realize it was a mistake and will be more disciplined about it.
 
What confuses me about this whole issue, and about TACC which preceded it, is that they are using Mobileye technology, which can indeed recognize stop signs and lights, and we are told the car is fitted with their latest chip. Why has the feature NOT been activated? It does not require Tesla to do the research and coding from scratch, and it is a significant safety issue. All the warnings and caveats in the world are just legal fluff. When the rubber hits the road, practicalities are what matter.
 
Musk suggested it's coming. I believe it's because Tesla is writing its own software on top of the Mobileye system. So, Mobileye may have done a lot of the work, but Tesla still needs to integrate it into their "autopilot" solution. For example, none of the other vendors require feedback to be gathered and sent to the cloud for analysis. Tesla would need to code specific scenarios where the traffic sign and light recognition data would be incorporated into their fleetwide learning system. Remember, that's Tesla's killer app. Many vendors will have Mobileye hardware and software, but Tesla has OTA updates and 3G in every car. Logging the crucial data within bandwidth constraints, then designing algorithms that turn this wealth of data into reliable autopilot improvements, could be playing a big part in the slower roll-out of features.
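
To make the bandwidth point concrete, here's a rough sketch of what prioritized, budget-capped event logging might look like. Every name and number in it (the event types, the scoring, the 5 MB/day budget) is made up for illustration; nothing here is Tesla's actual pipeline:

```python
import gzip
import json
from dataclasses import dataclass, asdict

# Hypothetical daily upload budget for a 3G uplink (illustrative only).
DAILY_BUDGET_BYTES = 5 * 1024 * 1024  # ~5 MB/day

# Higher score = more valuable to a fleet-learning backend
# (my guess at a sensible ordering, not a documented one).
EVENT_PRIORITY = {
    "manual_takeover": 3,        # driver overrode autosteer
    "hard_brake": 2,
    "lane_confidence_drop": 1,
    "routine_telemetry": 0,
}

@dataclass
class DriveEvent:
    kind: str
    lat: float
    lon: float
    speed_mph: float
    payload: dict  # sensor snapshot around the event

def select_for_upload(events, budget=DAILY_BUDGET_BYTES):
    """Greedily pick the highest-priority events that fit the budget."""
    ranked = sorted(events, key=lambda e: EVENT_PRIORITY.get(e.kind, 0),
                    reverse=True)
    chosen, used = [], 0
    for ev in ranked:
        blob = gzip.compress(json.dumps(asdict(ev)).encode())
        if used + len(blob) > budget:
            continue  # skip anything that would blow the daily budget
        chosen.append(blob)
        used += len(blob)
    return chosen

# Example: a takeover event outranks routine telemetry for the uplink.
events = [
    DriveEvent("routine_telemetry", 37.4, -122.1, 65.0, {"lane_conf": 0.98}),
    DriveEvent("manual_takeover", 37.4, -122.1, 63.0, {"lane_conf": 0.41}),
]
print(len(select_for_upload(events)), "event(s) queued for upload")
```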
 
For myself, I think I am capable of using the system in situations other than the highway with the knowledge that it might fail. I'll have to be more cautious and keep my hands on the wheel, ready to react immediately. Over time, I'm confident I will be able to recognize when I want to use it on surface streets and when its shortcomings make it not worth it to me. I'm 100% good with the idea that I am completely responsible for what my car does. I would be very disappointed if they just shut off operation on all streets where you could do something stupid because some people might be less responsible (although I wouldn't actually be surprised).

Do you seriously imagine that it is just a matter of turning the function on in Mobileye? Obviously, if it were easy they would have already done it.
 
Because Tesla is prudently activating these features one at a time. That allows them to ensure things are working well before they add the next piece. Imagine a software update that included all of the possible EyeQ3 functionality at once. Not only does it take a lot of time and testing to get that ready, it also makes the scope for resolving issues after release that much larger.

Tesla has enough on their hands already with improving TACC and autosteer. Handling signage is something that can wait for later. After all, their primary intent is to get autopilot working well with highway driving first. There are obviously still areas for improvement there. Surface streets will come later (possibly with hardware not on current cars).

And not recognizing stop signs and lights is not a safety issue, because they explicitly state that autopilot is for highway use and needs continuous monitoring. Autonomous driving will come, but Tesla makes it very clear that this is not yet it.
 
I'm getting tired of these silly arguments. You made an agreement not to exceed the speed limit when you got your driver's license; have you ever exceeded the speed limit by 1 mph?

I want to be able to use AP wherever I want to. I am doing it at my own risk. I typically don't use AP or TACC in the city (almost never, actually, though I have tested both there), because autosteer does poorly and TACC isn't that useful in bumper-to-bumper city traffic for my driving style. I use both on the highway during road trips. But if I WANT to use either in the city, I should be able to.
 
Well, it's not just your risk; there are other people on the road too. If you hit someone because autosteer veered into them, they are also at risk. It's similar to arguing that drunk drivers only risk themselves.
 
Tesla seems to be trying to tell us that the TACC and AutoSteering software, as
currently implemented in 7.0 (2.7.56), is for controlled-access "highways" only.

Situations NOT handled in the code:
Intersections, stop signs, signal lights, left turns, right turns, exit ramps,
entrance ramp merging, insufficient lane markings, left turn lanes,
sharper curves (even at slower speeds), slowing for curves,
pedestrians, animals, driveways, ... etc.

So, some might want to experiment in other situations, but SAFETY is
very important, for yourself and for the others around you. I hold the
steering wheel very gently, to feel any corrections that the AutoSteering
is making, ready to grasp the wheel firmly if it starts to do anything
unexpected or dangerous.

However, unless you are experienced and skilled in handling fast-moving
5000-pound deadly objects (recognizing approaching "difficult"
situations, taking control sufficiently quickly, and applying suitable
corrective action), you should stick to controlled-access highways
with very well painted lane stripes on both sides of your lane.

In a panic, pressing the brake should be immediate, since that
turns off the AutoPilot functions. Corrective action is usually
slowing and steering, so be immediately ready to do both.

Master TACC first, learning its strong and weak points, since
it appears that one cannot use the AutoSteering without
the TACC (is that correct?), and only then try the auto-steering
in a "perfect" situation, hopefully with very little traffic.

Cheers, please be CAREFUL.
 
Same with speeding. What's your point?
Unequal assertion. Speeding is a well-understood risk with long practice and established parameters. Additionally, people who see someone else speeding are accustomed to the effect on their own vehicle and can adjust their driving based on the new input (that guy is speeding ridiculously fast near me).

With AP there is a vast unknown, with no overt indication to other drivers that a car is under software control rather than the driver's. Spontaneous and disastrous decisions are not only possible, they are probable. Although I fundamentally agree with the notion that I should not be limited to a walled garden of acceptable roads where I can use AP, I also agree that the notion that ONLY I am at risk is fundamentally flawed and extremely dangerous.

I think it should be enormously obvious why all the yummy possibilities of AP have not been released yet: it's going to take quite a while to train the driving and observing public to deal with the new dynamic of software-driven cars.

I'm just waiting for the first video of someone setting AP and then climbing through the sunroof to surf the car.
 
Give me a break - really? What data do you have to make this bold claim?
Because we've already seen videos and heard the stories of people pushing AP considerably beyond what it was designed to do, and AP making (perhaps, not entirely known) very bad decisions. I'd wager that, if used in a very conservative way on straight highways with clear markings, the odds would go down to close to zero. They'd have to, or Tesla could not have released the software. But Tesla is going to have to endure boneheads and thrill seekers. And when you throw the limitations of the software into the same salad as the creativity of the Jackass generation, you're going to get spontaneous and disastrous results.

BELIEVE ME I hope not ... and I'm hoping that the software's learning algos improve faster than the urge to out-do the previous Darwin Award winner. But I am not optimistic.
 
Nice strawman argument, filled with many fallacies.

Unequal assertion. Speeding is a well-understood risk with long practice and established parameters. Additionally, people who see someone else speeding are accustomed to the effect on their own vehicle and can adjust their driving based on the new input (that guy is speeding ridiculously fast near me).

The argument was that with AP there are other people on the road that I can endanger. I said the same is true with speeding; there are other people on the road too. I'm not speeding on a closed track. Just like drunk drivers can injure other people. Yet we don't allow drunk drivers, but we keep producing cars that go faster and faster. And no one is regulating AP (nor do I think anyone should be).

With AP there is a vast unknown, with no overt indication to other drivers that a car is under software control rather than the driver's. Spontaneous and disastrous decisions are not only possible, they are probable. Although I fundamentally agree with the notion that I should not be limited to a walled garden of acceptable roads where I can use AP, I also agree that the notion that ONLY I am at risk is fundamentally flawed and extremely dangerous.

There is no indication to other drivers that a person is using cruise control. Think of AP as glorified cruise control. The other person does not need to know that you're using it, because THE DRIVER IS STILL IN CONTROL!

AP != Autonomous. The software is not in control of the car; the software is an aid. You are in control of the car!

Once we hit Level 3 autonomy, the car will be in control. We're not there yet. EM said 3 years (lol). Mobileye said 5 years (I think). Even that is pushing the boundaries, because a couple of years ago researchers were speculating Level 3 was 10 years out.

I think it should be enormously obvious why all the yummy possibilities of AP have not been released yet: it's going to take quite a while to train the driving and observing public to deal with the new dynamic of software-driven cars.

I'm just waiting for the first video of someone setting AP and then climbing through the sunroof to surf the car.

I think it's painfully obvious why they haven't been released yet either: because it's hard to do, and it has to work 99.99999% of the time. It's not a "hey, let's keep this from the public because they're stupid"; it's more of a "hey, this is a 10-year plan, let's make the software bulletproof and roll it out stage by stage".
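
For a sense of scale (my own arithmetic and my own assumptions, nobody's official target), 99.99999% works out to one failure in ten million operations:

```python
# "99.99999% of the time" = one failure in ten million operations.
failures_per_op = 1e-7

# Illustrative assumption (mine): one meaningful control decision
# per second of driving.
seconds_per_failure = 1 / failures_per_op   # 10,000,000 s
days = seconds_per_failure / 86_400         # 86,400 seconds per day
print(f"~1 failure per {days:,.0f} days of continuous operation")  # ~116 days
```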
 
A couple of days of experience?

I've already had AP almost collide me with a bus (on a straight stretch of highway!), and someone had a close encounter with oncoming traffic due to AP misinterpreting a tree shadow on the road.

I REALLY hate that more folks have not made their own observations regarding the "tesla autopilot tried to kill me" video.
It DID NOT misinterpret a tree shadow on the road! It was pushed past its torque limits in a cambered left turn and disengaged. When it STOPPED applying right steering torque, the car veered to the left. Yes, the disengagement was sudden, but any reasonable person 1) would not even have had it on in that situation, and 2) seeing how much trouble it was having in the curve preceding the disengagement curve, would have taken over immediately.
To have left it on in that situation was moronic.
 
Sadly, there are some people with impaired judgment who will be driving an AutoPilot
Tesla, in spite of Tesla's warnings and our cautions here. Somewhat like giving a child
access to a loaded gun, curiosity and novelty will tempt them
beyond any well-tempered cautions and constraints, into deadly circumstances where
innocent others are injured or worse. Even the simple speed of a 135 mph, 5000-pound
car is not something that I want my wife or children to experiment with.

We have laws against driving with (presumed) impaired judgment due to alcohol
or drug use, but many people still drive illegally. People with poor judgment
are usually only prosecuted after the fact, if at all. In some circles they are revered.

It appears that Tesla is well aware that AP will make mistakes, particularly
if used in situations not yet "intended", so remaining alert, being able to take control
immediately, and continuously being safety-conscious are VERY important if
we want to see this technology succeed.
 
Because enabling stop sign and light recognition is a decision you _really_ don't want to get wrong. I wouldn't hold your breath waiting for it.
 
The argument was that with AP there are other people on the road that I can endanger. I said the same is true with speeding; there are other people on the road too. I'm not speeding on a closed track. Just like drunk drivers can injure other people. Yet we don't allow drunk drivers, but we keep producing cars that go faster and faster. And no one is regulating AP (nor do I think anyone should be).
You equated speeding with AP and endangerment. This is the actual fallacy. We can see that even the state governments treat speeding differently from drunk driving and other forms of less-controlled driving. Driving AP in a way that is inconsistent with the rules specified by Tesla could fall into reckless-endangerment, criminal-negligence territory much more easily than speeding. If you're doing 100 mph in a school zone, you're going to get whacked the same way. But all you'd really need to do with AP is get a little more reckless on a curvy road and, potentially, you could hurt people much more spontaneously and unexpectedly. I am not saying that this is, in any way, an indictment of Tesla or a call for them to regulate. I agree that I prefer not to be nannied. But it does speak to the capability of an individual to create new and unexpected challenges on the road with AP that are much less understood or expected by other drivers.


There is no indication to other drivers that a person is using cruise control. Think of AP as glorified cruise control. The other person does not need to know that you're using it, because THE DRIVER IS STILL IN CONTROL!
This is also a fallacy, because the actual statement is that the driver is SUPPOSED to be in control. AP allows stupid/careless people to be much more easily out of control, because they are now enticed to take their hands off the wheel, and familiarity is the enemy of safety in this case. As boneheads become more comfortable they will become more daring and less cognizant, leading to the possibility of more spectacular incidents. Again, I'm talking about boneheads here, not people of brains.


AP != Autonomous. The software is not in control of the car; the software is an aid. You are in control of the car!
You are SUPPOSED to be in control of the car. You clearly think that I am asserting something different.


Once we hit Level 3 autonomy, the car will be in control. We're not there yet. EM said 3 years (lol). Mobileye said 5 years (I think). Even that is pushing the boundaries, because a couple of years ago researchers were speculating Level 3 was 10 years out.
I agree with this and your previous posts about Level 3. I look forward to it, but agree it's a long way away. And it's conceivable that the window may be pushed back as daredevils cause more legal trouble, inhibiting development and public release.


I think it's painfully obvious why they haven't been released yet either: because it's hard to do, and it has to work 99.99999% of the time. It's not a "hey, let's keep this from the public because they're stupid"; it's more of a "hey, this is a 10-year plan, let's make the software bulletproof and roll it out stage by stage".
Actually, here I disagree with you completely. Developers often do not release capabilities because they are too far ahead of their time. Of COURSE this is hard. Perhaps ridiculously hard. But, IMO, it is not only prudent, it is required that capabilities like this evolve somewhat slowly into the public sphere ... drivers get used to these ideas, their dynamics, and their capabilities and limitations, such that they become reasonably expected components of our driving experience rather than a circus. You cannot release something as amazingly revolutionary as progressively automatic, approaching-autonomous driving quickly or easily. It is the public that sucks, not Tesla. And I believe that although they probably could have released more capabilities, they will not until they know those capabilities will be used safely.

Heh. And based on many results and posts already, people are looking to push even the existing envelope to a scary point. Hopefully Tesla is collecting mountains of data and using it to prep the next big leap.