
[UK Discussion] This is what a self driving car looks like

Status
Not open for further replies.
I don’t know why we get all this binary "one is better than the other" crap. They are different attempts at the same problem. None work as well as people thought they might. They all have their positives and all have their negatives.

For me, if I could be driven up a motorway hands free, safely and know in advance where I would be needed and where I could relax, that would be ideal.

But someone else will have their own criteria of success.

It’s just a shame that some of the solutions have over-promised and under-delivered. More fool me for believing the hype at the time and sticking my hand in my pocket. Having worked for a small company that ran L4 at the time, I kinda thought that companies with many times the resources would have been much further ahead than they were, but perhaps not.

It's not an easy problem to solve. Remember, FSD is Tesla's third attempt. AP1 was based upon Mobileye, and was good at lane keeping and distance following, but its capability hit a wall trying to go beyond that. AP2 improved on that with lane changing on highways, but hit a wall with city streets. FSD... we will see where the walls are, and probably pretty soon.
 
Interesting discussion.

I'm no expert on the multitude of driver assist or self driving packages available in every car for sale today. However, I can confidently say I absolutely love Autopilot on my 2021 Model Y (no EAP nor FSD). It works almost flawlessly, I use it every day both on highways and on back roads, and mine very rarely phantom brakes or has any real issues at all. Sure it isn't L5 and I need to take over now and then, but I'm fine with that. As a driver assist I think it's spectacular.

When people say Tesla's Autopilot is terrible I tend to roll my eyes. It's possible there are better systems out there, but stating Tesla's is terrible seems... biased to me? Maybe I just got lucky with my MY? I see people like Kyle from Out of Spec testing various systems though, and his testing also seems to show Tesla's Autopilot works wonderfully, so I really don't think it's just me, and I do wonder at anyone stating it's "terrible".

My two cents. Now continue arguing about this stuff. :cool:
 
Just come back here once you've tried BMW's implementation of the "AP" analogue...
 
Just come back here once you've tried BMW's implementation of the "AP" analogue...

A friend of mine bought a new 2022 BMW last year, he says the BMW system is okay but not as good as Tesla overall. He says his auto parking stuff is flawless and the BMW system works well on most highways, but he takes over on sharp turns because he says it kicks off most times in tight curves. Also he doesn't like using the BMW system on curvy back roads for the same reason. His wife has a Model 3 and he feels her Autopilot is more reliable and confident than his BMW driving assist.

I guess different strokes for different folks? 🤔
 
Which BMW? Which Driver Assistance level did they have? Not every BMW is equipped with the same kind or level of driver assistance.

It's funny how those who have actually owned a BMW say their assistance systems are better, while those claiming otherwise say it's their friends or neighbours saying this.
 
  • Like
Reactions: GeorgeSymonds
"Doesn't work" <> "Won't work" in this context

Not wanting to fuel the flames, but that must be applied to several of the solutions. E.g. some solutions are speed- and/or scenario-limited, not because of technical ability, but because that is what they are either licensed for and/or comfortable airing in public. Tesla seem comfortable doing their testing in public, other solutions less so. It doesn't mean progress is not happening, just that it and the full capabilities are less obvious, or the capabilities, no matter what limits some may think they have, are designed to meet more specific objectives and meet those well.
 
  • Like
Reactions: bkp_duke
Tesla seem comfortable doing their testing in public, other solutions less so

Personally I find that surprising, but that is indeed what Tesla have decided to do. Chuck's videos show flaws, and he handles them well. Other YouTubers show much scarier stuff.

I have purchased FSD on one of my cars ... but I am very far from convinced I would want to use it as FSDb ... I'm sure I'd try it to see how it was, but I'm not sure I'd use it daily for a city streets commute. The whole creep limit and "will it go, or not?" - I just can't believe I would be comfortable with that.

But ... Chuck's recent videos have had it wait at a green light for crossing traffic that was jumping a red light (which I might not have spotted), and the visualisation showed a tire that was loose and bounding along the road (albeit in a different lane).
 
he takes over on sharp turns because he says it kicks off most times in tight curves
Tesla kicks off on less sharp turns though...

I wonder how Tesla's Autopilot behaves for you on the tightest curves... because it cannot handle some motorway exits...
 
I saw that video when it came out. I would do exactly the same with software - "We want to make sure this works in 'Perfect Scenario' first" - it would be disabled for anything else, at that time.

"Doesn't work" <> "Won't work" in this context
Sounds like an excuse to me. Isn't the point of beta testing to test in as wide a range of scenarios as possible?
 
"Doesn't work" <> "Won't work" in this context
Also, we have to remember that the process of self driving is multiple relatively independent steps:
1) Interpret the world around you and understand where you are in it
2) Project future actions on the other entities in that world - what are they going to do next
3) Make a plan for how to achieve your objective in that world while complying with the various behavioural rules

These steps are all fairly independent, because by the time you're forming your driving plan in stage 3 you're past the point of worrying about whether it's rainy or sunny etc; you've already built your model of the world. Based on the changes we've seen in the Beta videos, stage 3 is where a lot of the work is going on: refining the driving planner so that it's a more comfortable drive, teaching it to handle the road features it doesn't do well yet, and so on.

None of that work is less valuable because the analysis in stage 1 doesn't handle a rainy environment well yet, it just means that the software that handles stage 1 needs further work and training so that it's still confident it's understanding what it's seeing when it's rainy. The key thing that I think a lot of non-software people don't understand is that teaching stage 1 to reliably recognise road features in the rain doesn't undo ANY of the stage 2/3 work at all. The driving plan is still the same (hopefully with some adaptation for stopping distances etc) irrespective of the weather, provided you're still able to model the world around you.
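The modularity argument above can be sketched in code. This is purely an illustrative toy, not Tesla's actual architecture - every name here is hypothetical - but it shows why retraining the perception stage for rain leaves the prediction and planning stages untouched: the planner only ever sees the world model, never the raw pixels.

```python
from dataclasses import dataclass

# Hypothetical sketch of the three-stage pipeline described above.
# All names are illustrative; none of this is Tesla's real code.

@dataclass
class WorldModel:
    actors: list          # other road users detected in the scene
    lane_geometry: list   # drivable lanes available to the planner

def perceive(camera_frames) -> WorldModel:
    """Stage 1: interpret sensor input into a model of the world.

    In a real system this is a neural network; improving it for rain
    changes only this function, not the stages below.
    """
    return WorldModel(actors=list(camera_frames), lane_geometry=["lane_0"])

def predict(world: WorldModel) -> dict:
    """Stage 2: project what each actor is likely to do next."""
    return {id(a): "continue_straight" for a in world.actors}

def plan(world: WorldModel, predictions: dict) -> str:
    """Stage 3: choose an action that meets the objective and the rules.

    This stage never sees weather or raw pixels - by this point the
    world model is all it needs.
    """
    if any(p == "cut_in" for p in predictions.values()):
        return "slow_down"
    return "keep_lane"

def drive_step(camera_frames) -> str:
    world = perceive(camera_frames)
    return plan(world, predict(world))
```

Swapping in a rain-hardened `perceive` leaves `predict` and `plan` byte-for-byte identical, which is the point being made above.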

I still think the fundamental principle of the vision-based system is valid: humans drive by using their eyes, so there's no reason why a camera-based system can't do the same. Camera-based FSD will probably always bail out in milder conditions than a human driver, but that's because humans are risk-taking idiots who rely on luck to avoid accidents a lot more than they realise.
 
humans drive by using their eyes so there's no reason why a camera based system can't do the same.
If your eyes are blinded by sunlight, you can chuck on a pair of sunglasses - a camera can't do the same.

If you get something in your eye which obscures your vision, you can get it off - a camera can't do the same.

This "humans can do something with vision so why can't cameras do the same" argument is so flawed.
 
  • Disagree
Reactions: bkp_duke
This could be another US vs UK / EU difference with Autopilot - AutoSteer nerfed in EU.
Can we please stop using the regs as an excuse for poor implementation by Tesla.

The lateral limits are perfectly reasonable, other car manufacturers abide by them and don’t seem to have as much of a hard time dealing with them.

The solution, as most drivers would do, is to reduce the speed of the car to match the corner and road conditions. But Tesla's approach is to career in without reducing speed, then suddenly find mid-corner that it's exceeding the limits, so it immediately throws a wobble and tells the driver to take over.

Reduce speed, keep within limits. Pretty simple?

It’s a major flaw that I hope single stack will address.
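The "reduce speed, keep within limits" point is just circular-motion arithmetic: for a lateral acceleration cap a and curve radius r, the maximum speed is v = sqrt(a·r). A quick sketch, assuming an illustrative 3 m/s² cap (the actual regulatory limits vary with speed and jurisdiction):

```python
import math

def max_curve_speed_kmh(radius_m: float, lat_accel_limit: float = 3.0) -> float:
    """Max speed (km/h) through a curve of the given radius without
    exceeding the lateral acceleration limit: v = sqrt(a * r).

    The 3 m/s^2 default is purely illustrative, not the regulation's
    exact figure.
    """
    return math.sqrt(lat_accel_limit * radius_m) * 3.6

# A 100 m radius bend caps you at roughly 62 km/h; a 30 m bend at
# roughly 34 km/h. A planner that checks this *before* the corner can
# slow down in advance; one that only notices mid-corner has to abort.
for r in (30, 100, 300):
    print(f"radius {r:>3} m -> max {max_curve_speed_kmh(r):.0f} km/h")
```

Since radius is known from the map and the camera well before the bend, slowing in advance is a solved geometry problem, which is why the mid-corner bail-out feels so unnecessary.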
 
  • Like
Reactions: EVMeister
Can we please stop using the regs as an excuse for poor implementation by Tesla.

I was trying to give an explanation as to why US based posters here might be finding the system a lot less problematic than we do here in the UK. The lateral limits probably weren't the reason, agree with that. Just something I came across that got me thinking... My personal experience regarding the car going far too fast to take some corners properly matches yours - it simply doesn't slow down enough and aborts.

This is exactly one of those situations I was referring to in an earlier post, where I expect the system to mess up and for me to take over - fast and winding roads. I know when I have to take over and do so - I don't find that stressful, but I shouldn't have to with such a seemingly simple situation. They have such roads in the US as well, and FSD Beta has demonstrated it can handle them without deactivating or driving perilously fast. I look forward to these improvements filtering down to the UK, hopefully sooner rather than later.
 
  • Like
Reactions: MrBadger
Tesla seem comfortable doing their testing in public, other solutions less so
The Mercedes example described earlier is a good example of this. Their system can drive at higher speeds and without a lead car when operating at Level 2; however, a combination of Germanic reservedness and legal frameworks has resulted in the decision to limit the conditions in which it operates at Level 3, because the backup of an alert driver is, by definition, removed.

Much of this comes down to mindset. Tesla rely on driver alertness to push the limits, and up to a point that is a rapid way to learn. The somewhat crude analogy is a child learning to walk: an adult is there to catch them should they fall. The downside is that Tesla are seemingly putting very little effort into understanding, ahead of time, where they should give up. If the goal is only Level 5, then they are betting on the regulators allowing the system straight in at that level, and matters such as redundancy of cameras, fail-safe measures etc. may all come to the surface (they may not, we don't know). Alternatively, they will need to go back retrospectively and learn where the limitations are, so the car can predict and respond appropriately, with zero driver takeovers on a journey that aren't either requested by the car in good time or where the driver simply wants to turn off the system for other reasons. In Europe, that will be a very hard sell. The mindset here is to start with a limited scope and expand, which is exactly what the Germans are doing. It is entirely possible we reach Level 4 here with a European mindset, with the likes of Bosch or Mobileye, before Tesla, whereas in the US the reverse is true, purely because of the nature of the evidence base and the path to implementation.

Examples like stopping because a car is jumping a red light should not be an example of self driving, but of a passive safety system. Drive a competitor's car with cross-traffic collision avoidance systems and you see this exact type of feature come into play. The parameters are slightly different, but the logic is the same: the driver should be alerted and potentially prevented from driving into the hazard, whether driving manually or on Autopilot. Tesla should be implementing such measures in all cars as part of passive safety, not limiting them to FSD, in my opinion; maybe once the single-stack software arrives that will be the case. As a comparison, try driving a modern German car towards a wall at below about 10 mph when parking; you'll typically find the car will keep braking once it gets within approximately 30 cm, as a passive safety system.

We've had a "watercooler" chat over city street driving and if and when it arrives in the UK. The material concern is the lengthening of reaction times when a driver has to intervene. Currently, a driver spots a hazard like a child running into the road, and then responds. The distance to stop is a function of how long it takes to understand the risk, and then how long to physically brake. Modern brakes have dramatically improved the latter, but the concern with things like mobile phones is that the former has increased. Now layer on an extra dimension: the car is driving, so the driver has to perceive the risk, and then perceive that the car is not responding, before they take over. The increase in time could be catastrophic. The flip side is that the car is looking all the time and always attentive, and so will make fewer mistakes than a human driver. Going back to the earlier point about passive safety systems, there is no logical reason why the car wouldn't intervene should it detect a hazard (just like collision avoidance, but more geared to a city streets environment), but as a passive safety system.
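The reaction-time concern can be put into rough numbers. Total stopping distance is the distance covered during the reaction time plus the braking distance, d = v·t + v²/(2a). A sketch with illustrative figures only (13.4 m/s ≈ 30 mph; 7 m/s² is a plausible hard-braking deceleration on dry tarmac):

```python
def stopping_distance_m(speed_ms: float, reaction_s: float,
                        decel_ms2: float = 7.0) -> float:
    """Total stopping distance: reaction distance plus braking distance.

    d = v*t + v^2 / (2a). All figures here are illustrative, not
    measured values for any particular car.
    """
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)

v = 13.4  # ~30 mph in m/s

# Driver responding directly to the hazard, vs. first waiting to see
# whether the car responds (an extra second of perception time):
print(stopping_distance_m(v, 1.0))  # ~26.2 m
print(stopping_distance_m(v, 2.0))  # ~39.6 m
```

An extra second of "is the car going to react?" hesitation adds the full 13.4 m travelled at speed, which is the catastrophic-increase worry in concrete terms.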

In essence, our view on safety is that Level 2 on city streets in the UK will be potentially dangerous, because the expectation is that the car will react first, and then the driver. If this were passive safety, the expectation would be that the driver responds first and the car reacts in parallel.

For our American friends, if you really want a considerably safer driving experience, forget Tesla, simply move to the UK as statistically we already have it on our roads.
 
It's a major flaw that I hope single stack will address.
Yeah, the magical two-word solution: single stack.

BMW and others do not have the issue; only Tesla does. So either all the others are stupid, or it is only Tesla that behaves stupidly...
 
  • Disagree
Reactions: bkp_duke
If your eyes are blinded by sunlight, you can chuck on a pair of sunglasses - a camera can't do the same.
Effectively it can if the dynamic range of the camera is high enough. You know we use cameras to take photos of the actual surface of the sun, right?

If you get something in your eye which obscures your vision, you can get it off - a camera can't do the same.
That's true, at least not without additional equipment - I don't believe I've ever seen a 'front camera obscured' message. However, this is also true of every other kind of sensor: they're all blocked by something, otherwise they wouldn't be any use as a sensor. Lidar uses light, just light you can't see with your eyes, so a lidar emitter would have the same weakness. It is true that the rear camera on the Y seems quite prone to picking up dirt.

This "humans can do something with vision so why can't cameras do the same" argument is so flawed.
Why? None of the points you've put above undermine the concept. At best it shows that the currently deployed hardware has some weaknesses, but that's a very different discussion.
 
  • Like
Reactions: bkp_duke
The downside is Tesla are seemingly putting very little effort into understanding, ahead of time, where they should give up.
Next up on Tesla Motors Club, following this discussion of how Tesla haven't bothered understanding when the system should give up: complaints about scenarios where FSD/Autopilot/etc. voluntarily disengages itself.

In essence, our thoughts on safety is that Level 2 on city streets in the UK will be potentially dangerous because the expectation is the car will react first, and then the driver. If this was passive safety, the expectation is the driver will respond first, and the car will react in parallel.
Isn't this a bit contrived though? I don't think there's any driver out there who's going to watch an emergency situation unfold and not intervene because they think the car will do it. Most of the time you're going to find that the car and the driver stamp on the brakes together. The advantage the car has in any scenario more complicated than emergency braking is that it has constant 360-degree awareness, and so effectively always has an evasive steering plan in its back pocket.

Examples like stopping because a car is jumping a red light should not be an example of self driving, but of a passive safety system.
This is harder than it seems to get right. Spotting a red traffic light is easy; being 100% certain that it applies to your lane of traffic requires a hugely greater semantic understanding of the road system. Get it wrong and you've just done an unnecessary emergency stop in flowing traffic, and inevitably been rear-ended by the human behind you, who is either not paying attention or driving based on assumptions about what you're going to do rather than what you're actually doing.

try driving a modern German car towards a wall below about 10mph when parking, you'll typically find the car will keep braking once it gets within approximately 30cm as a passive safety system.
Teslas have Automatic Emergency Braking, if that's what you mean? Even a system like this has edge cases that can backfire. In the unlikely event that you get stuck on a railway crossing, for example, you'd probably prefer that your car let you drive through the barrier rather than sit there politely issuing cross-traffic warnings about the approaching train.
 