Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autosteer on motorway = great. Autosteer on A road = dangerous?

Did my first longish trip since getting the car. About 60 miles each way (120 miles round trip). One way was A roads (30-60 mph) and the return leg was 60% motorway.

Autosteer on the motorway was largely great. On the A road I felt very uncomfortable three times, and once I think I would have crashed had I not altered direction slightly after Autosteer disengaged because it was confused by oncoming traffic.

My understanding is that this same technology is the basis of FSD? If so... I'm concerned.

I will now only use it on the motorway until I hear about a dramatic improvement. Was I unlucky? Or is it just not ready?
 
BTW my hands were on the wheel all the time. It disengaged several times: mainly when the road markings weren’t clear, and other times when the oncoming traffic got near my lane. It disengaged right on a bendy (almost a mild chicane) part of an A road, right when the direction of the car needed to be altered. There were a few hundred milliseconds of panic despite having my hands on the wheel.
 
The system is limited in its current configuration. Tesla says that Autopilot should be used on limited-access highways, so using it on A roads puts it outside of the recommended use case. Also note that Autopilot is still in beta form... it will continue to improve and is currently not representative of how FSD will behave when those features are added.

As the message on the screen says every time you engage Autopilot, be prepared to take control at any time. Learn how the system behaves in various scenarios, practice taking over and make sure you're paying attention to what's going on. If you don't feel comfortable with the system in certain situations, don't use it.
 
With time, if you can avoid becoming a gibbering idiot with white hair, you get to understand its A-road limitations and it can be used to some benefit. It's perhaps easier in the S, with the small display in front, to keep an eye on how well markings are being recognised while still keeping an eye on the road. Classic danger areas are when an overtaking lane opens up but the new middle marking is hidden by the car in front; some junctions where it can get it wrong and try to do a sudden left off the road; oncoming traffic around blind bends; and opposite traffic overtaking cyclists and moving near the middle. You get used to dropping in and out of Autosteer before it goes weird, or it does it for you.
 
The system is limited in its current configuration. Tesla says that Autopilot should be used on limited-access highways, so using it on A roads puts it outside of the recommended use case. Also note that Autopilot is still in beta form... it will continue to improve and is currently not representative of how FSD will behave when those features are added.

As the message on the screen says every time you engage Autopilot, be prepared to take control at any time. Learn how the system behaves in various scenarios, practice taking over and make sure you're paying attention to what's going on. If you don't feel comfortable with the system in certain situations, don't use it.
Valid points.

I write software for a living. Have done for 20 years. I write mainly business apps - so it’s a different field but I have used machine learning in several projects. What Tesla are doing isn’t the same as my day job... BUT I know how projects work and I understand domain complexity.

I don’t have a clear understanding of how far Tesla have got. This all smacks of the 80/20 rule: I have a small concern that we are actually nowhere near FSD. I hope I am wrong.
 
I don't think we are anywhere near Level 5 autonomy (hands off, go to sleep and let the car drive), but I do think we're getting close to Level 4, which may require periodic human intervention.

It's also worth mentioning that Autopilot currently ignores or otherwise doesn't respond to some things that the car is aware of. Tesla adds functionality as the machine learning improves.

Take traffic cones, for example: the computer has been identifying traffic cones and other objects internally for a long time, but only recently did they add them to the traffic visualization in the car. Stop lights are another good example: the computer is aware of traffic lights but takes no action on them in the current iteration of Autopilot.
 
I don’t have a clear understanding of how far Tesla have got. This all smacks of the 80/20 rule: I have a small concern that we are actually nowhere near FSD. I hope I am wrong.

We are a very long way from FSD, and a very, very long way in terms of A roads IMO. As previously mentioned, Autosteer is currently designed for dual carriageways and motorways, and most of the time it's pretty good at that.

As for FSD in the city :eek:
 
Did my first longish trip since getting the car. About 60 miles each way (120 miles round trip). One way was A roads (30-60 mph) and the return leg was 60% motorway.

Autosteer on the motorway was largely great. On the A road I felt very uncomfortable three times, and once I think I would have crashed had I not altered direction slightly after Autosteer disengaged because it was confused by oncoming traffic.

My understanding is that this same technology is the basis of FSD? If so... I'm concerned.

I will now only use it on the motorway until I hear about a dramatic improvement. Was I unlucky? Or is it just not ready?

As others have noted, Tesla specify that it's currently only for use on restricted roads (motorways etc.), so your results aren't really surprising (though your experience may have been!). I try to remind people of this since (as you discovered) it can be dangerous to use AP outside its intended use. I also worry that too many YouTube posts of people (idiots?) driving on AP in crazy situations will catch the eyes of politicians and lawyers, and THAT can never be a good thing imho.

From a hardware standpoint, yes, FSD will use the same "technology" (cameras, sensors etc.), but of course the software stack will be much changed. Is this change feasible? Is the complexity curve linear or exponential? I don't think anyone knows yet. I think the car WILL be able to negotiate simple city driving tasks (especially in the US where grid systems are common), including stopping at red lights. But what about all those special cases? The dog in the road? The weaving cars in heavy traffic? That's much fuzzier to me. Traditionally, as you know, AI has over-sold itself for decades, and only recently have we seen an explosion in image and voice recognition. Will that expand to FSD cars? That is what we are all holding our breath to see.
 
I will now only use it on the motorway until I hear about a dramatic improvement. Was I unlucky? Or is it just not ready?

Your experience is identical to many other people who have posted here. So, no you were not unlucky ... that's how it is at the moment. Any experiments on roads other than motorways require, in my opinion, a higher level of alertness than driving without autopilot. It can, and does, bail out at crucial moments and I have had several occasions when a serious crash either off the road or into oncoming traffic could well have occurred if I hadn't been ready to take instant action. The lines it frequently chooses on a corner are so close to the edge or centre line that if it does bail you have virtually no time to pull it back so you better be ready and have the reflexes of a fly! Tesla does not advise using Autopilot in these circumstances either, so if you do have an accident you're on your own!
 
Did my first longish trip since getting the car. About 60 miles each way (120 miles round trip). One way was A roads (30-60 mph) and the return leg was 60% motorway.

Autosteer on the motorway was largely great. On the A road I felt very uncomfortable three times, and once I think I would have crashed had I not altered direction slightly after Autosteer disengaged because it was confused by oncoming traffic.

My understanding is that this same technology is the basis of FSD? If so... I'm concerned.

I will now only use it on the motorway until I hear about a dramatic improvement. Was I unlucky? Or is it just not ready?

I have to chuckle, because for many people 60 miles is a short daily commute.

When you first use it, it can be "exciting", but after using it and learning its habits you start to understand what it is doing (until new software comes out) and where it will have problems.
I'll use it on a 3-mile stretch of road and it does perfectly; it's done it many times.
There's another stretch that I knew would cause it trouble. At first it struggled no matter what, but then upgrades got to the point where, if I rolled the speed down by 5 mph, it had no issues. Today it handles it all by itself (although it gets a little grandpa-mode doing it).

The more you use it, the more comfortable you will feel with it. I'm at the point where there are even some situations, like rainy, low-visibility evenings, where I feel that it is safer than I am. I'm good, but with a lot of lights from different directions and glare from everything, it just seems to process things a little faster.

There will be situations where you think that the car will crash. That's because it makes many decisions so much faster that it can wait a few seconds longer before acting. Coming up to a car stopped at a light is one of those situations. It will generally catch it, although if there is a significant curve before the light it becomes more doubtful.

Many have equated it to teaching a teenager to drive. It may not be a chauffeur in a Rolls, but it does get you there.

And yes, there are places that the car WILL NOT handle.
 
What does FSD "feature complete" by 2020 mean? Will all owners with FSD (1000s) be able to claim they were mis-sold FSD in the next few years?

Maybe. Owners in the US were paid out after AP2 features were delivered ultra-late; there was a class action on that, I believe.

I suspect that what Elon/Tesla mean by FSD isn’t what you/I might mean by FSD, e.g. Smart Summon and stop-light recognition are FSD ‘features’.
 
I'm coming to the conclusion that the entire autopilot approach is basically flawed. It's based on the premise that a computer can figure out how to drive a car based on video input from half a dozen cameras. But it seems to me that there are serious questions to be asked about whether it can ever be made to work reliably enough:

- The cameras aren't eyes. They have shortcomings. For example, they don't have eyelids, so they can't clear themselves of dirt, condensation, water etc.; they can't rotate to get maximum pixels on some small area; they suffer much more from glare; they can't handle the dynamic range that the eye/brain can; and they struggle in low light. You can see this when driving the car: it gets confused by low sun; it stops working when the cameras are obscured with dirt or condensation; and on a dark and rainy night, things get a bit ragged.

- The "neural net" AI is clever, but it's fundamentally just statistical pattern matching in real time. There's no intelligence in there in the sense of the computer building a model of what's going on, ready to interpret unusual situations with insight based on past experience. The hope is that with enough "learning" from real-world data (meaning getting humans to annotate footage from real cars to identify objects and situations), it'll eventually get there. But the trouble is that the tail of the learning curve is ridiculously long, and without the intelligent insight to deal with a future unknown scenario, there's always the doubt that it'll fail.
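To illustrate the pattern-matching point in miniature (a toy sketch with made-up data, nothing to do with Tesla's actual software): a nearest-neighbour classifier will confidently assign its closest known label to an input unlike anything it was trained on, with no notion that it is out of its depth.

```python
import math

# Made-up "training" data: feature vector -> label.
train = {
    (1.0, 1.0): "car",
    (5.0, 5.0): "pedestrian",
}

def classify(features):
    # Return the label of the nearest known example. Pure pattern
    # matching: there is no model of the world, and no concept of
    # "I've never seen anything like this before".
    nearest = min(train, key=lambda p: math.dist(p, features))
    return train[nearest]

print(classify((1.2, 0.9)))      # near a known "car" -> "car"
print(classify((100.0, -40.0)))  # far from everything it has seen,
                                 # yet it still answers confidently
```

Real networks output calibrated-looking probabilities rather than a hard label, but the failure mode on the long tail of unseen scenarios is the same shape.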

It's all very interesting tech, and I applaud Tesla for pushing the boundaries as they have, but the fact is that they are still failing on very basic stuff, the most egregious of which is simple traffic-aware cruise control (TACC). It's something many other manufacturers provide, invariably implemented using some sort of radar/lidar technology, and it works. The same functionality on a Tesla is very unreliable: phantom braking, aborted lane changes, and confusion with parked traffic are just some of the problems.

My conclusion is that Tesla have probably backed the wrong horse in trying to do FSD almost entirely based on real-time image recognition. I can understand why they've done it, but I can't see the tech getting to a workable result any time soon.
 
So FSD is not FSD :eek:

It might be, in very limited circumstances, e.g. on a motorway changing lanes without driver intervention, and taking an exit. Like Nav on AP at the moment, but without the driver ratifying everything.

Anything much more complex than that, I don’t see happening in the lifetime of the cars we are currently driving. That won’t stop me buying FSD, as I’m enjoying the groundbreaking experience, but true FSD it won’t be. IMO.
 
I'm coming to the conclusion that the entire autopilot approach is basically flawed. It's based on the premise that a computer can figure out how to drive a car based on video input from half a dozen cameras. But it seems to me that there are serious questions to be asked about whether it can ever be made to work reliably enough:

- The cameras aren't eyes. They have shortcomings. For example, they don't have eyelids, so they can't clear themselves of dirt, condensation, water etc.; they can't rotate to get maximum pixels on some small area; they suffer much more from glare; they can't handle the dynamic range that the eye/brain can; and they struggle in low light. You can see this when driving the car: it gets confused by low sun; it stops working when the cameras are obscured with dirt or condensation; and on a dark and rainy night, things get a bit ragged.

- The "neural net" AI is clever, but it's fundamentally just statistical pattern matching in real time. There's no intelligence in there in the sense of the computer building a model of what's going on, ready to interpret unusual situations with insight based on past experience. The hope is that with enough "learning" from real-world data (meaning getting humans to annotate footage from real cars to identify objects and situations), it'll eventually get there. But the trouble is that the tail of the learning curve is ridiculously long, and without the intelligent insight to deal with a future unknown scenario, there's always the doubt that it'll fail.

It's all very interesting tech, and I applaud Tesla for pushing the boundaries as they have, but the fact is that they are still failing on very basic stuff, the most egregious of which is simple traffic-aware cruise control (TACC). It's something many other manufacturers provide, invariably implemented using some sort of radar/lidar technology, and it works. The same functionality on a Tesla is very unreliable: phantom braking, aborted lane changes, and confusion with parked traffic are just some of the problems.

My conclusion is that Tesla have probably backed the wrong horse in trying to do FSD almost entirely based on real-time image recognition. I can understand why they've done it, but I can't see the tech getting to a workable result any time soon.

  • The cameras aren't eyes. They could actually have better resolution, but have been explicitly de-rated because more resolution isn't necessarily better. Do a little research and see how small an area the eye has good resolution in (the fovea). No, the side cameras don't have wipers, but the front ones do. They don't need to rotate; they can see everything at once. BTW, people get confused at low sun angles too. That's why many cities have sunrise and sundown slowdowns in traffic.
  • And just how do you think your brain does it? How do you know what a "car" is? A "building", a "road"? It's because you've had years of learning, called childhood, to learn it. And trust me, humans do equally poorly in unknown (even known) situations. That's where wrecks and slow cars come from.
So, are you saying that if you were in a control room with a set of 360-degree monitors, you couldn't drive the car remotely?

Let's turn the tables around: seeing that you seem to be a LIDAR pundit, if I put you in that same control room, could you drive?

People can drive with a single eye. In that case we know it's all simple perception, with distance being extremely hard to determine. Tesla has stereoscopic imaging that can determine distance, and RADAR to give an even better determination. With cameras you also get cues, such as color, that aren't available with LIDAR.
After the basic image recognition, whether from cameras or LIDAR, it's the same set of hard problems to solve. If you take a look at any of the raw images with interpretations shown, it's pretty obvious that Tesla has already got 99+% of the image recognition problem solved.
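For what it's worth, the geometry behind stereoscopic distance estimation is simple; the hard part is everything that comes after. A minimal sketch of classic stereo triangulation (illustrative only; the focal length, baseline and disparity figures below are invented, not Tesla's):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to a point seen by both cameras: Z = f * B / d.

    focal_px: focal length in pixels; baseline_m: separation between
    the two cameras in metres; disparity_px: horizontal pixel shift
    of the point between the two images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: f = 1000 px, 20 cm baseline, 10 px disparity.
print(stereo_depth(1000.0, 0.20, 10.0))  # -> 20.0 metres
```

Note how depth resolution degrades with range: at large distances the disparity shrinks toward a pixel or less, which is one reason radar remains a useful cross-check.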
 
- The cameras aren't eyes. They have shortcomings. For example, they don't have eyelids, so they can't clear themselves of dirt, condensation, water etc.; they can't rotate to get maximum pixels on some small area; they suffer much more from glare; they can't handle the dynamic range that the eye/brain can; and they struggle in low light. You can see this when driving the car: it gets confused by low sun; it stops working when the cameras are obscured with dirt or condensation; and on a dark and rainy night, things get a bit ragged.

Actually, to be fair to the cameras, they are in many cases BETTER than human eyes. Last night I was driving home in the dark and rain and the lane lines were virtually impossible for me to see (reflections, shiny road surface etc.). Yet the Tesla was picking them out better than my eyes could; almost flawlessly, in fact. Modern cameras are getting to the point where they have better dynamic range and excellent visual acuity across their entire field of view (they will never need to rotate, since they don't have a fovea, unlike the human eye). Of course, the brain is amazing at taking a poor/noisy signal and sorting it out, but last night mine wasn't doing as good a job as the car!
 