Self-driving cars (level 3) to be allowed on UK roads

Just noticed an article on the Top Gear website where VW are talking about charging €7 (about £6) an hour for Level 4 autonomous driving. No timescales for when this will be available (don't think they've gone past Level 2 yet), just that they think that's the price they'll need to charge to cover the cost of the hardware and be profitable. It does sound better value than Tesla's £6,800 up-front cost, unless of course you do a reasonable amount of driving (the break-even point is about 7 hours a week over 3 years). Will be interesting to see where Tesla pitch their subscription service, should that ever appear.
 
Just noticed an article on the Top Gear website where VW are talking about charging €7 (about £6) an hour for Level 4 autonomous driving. No timescales for when this will be available (don't think they've gone past Level 2 yet), just that they think that's the price they'll need to charge to cover the cost of the hardware and be profitable. It does sound better value than Tesla's £6,800 up-front cost, unless of course you do a reasonable amount of driving (the break-even point is about 7 hours a week over 3 years). Will be interesting to see where Tesla pitch their subscription service, should that ever appear.
If we assume an average speed of ~30 mph over 15,000 miles, that will take ~500 hours to drive. At £6/hour that's £3,000 per year.

Assuming there is no charging, maintenance, etc. Not too bad once depreciation is figured into it.

£7k up front would break even in just over 2 years, but it won't be only £7k once we "get there". And the nice part of a subscription is that if you drive less for whatever reason you don't need to spend.... Don't forget the cost of the actual car, either ... you can't exactly *drive* the FSD package without the car.
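To make the arithmetic in the last few posts easy to play with, here's a rough Python sketch using only the figures assumed in this thread (15,000 miles a year at ~30 mph average, £6/hour, £6,800 up front); none of it is official VW or Tesla pricing:

```python
# Back-of-the-envelope numbers from this thread (all assumptions, not official
# VW or Tesla pricing): 15,000 miles/year at ~30 mph average, £6/hour
# subscription vs £6,800 paid up front.

ANNUAL_MILES = 15_000       # assumed annual mileage
AVG_SPEED_MPH = 30          # assumed average speed
HOURLY_RATE_GBP = 6         # ~EUR 7/hour converted, as quoted above
UPFRONT_COST_GBP = 6_800    # FSD purchase price quoted above

hours_per_year = ANNUAL_MILES / AVG_SPEED_MPH            # ~500 hours
annual_subscription = hours_per_year * HOURLY_RATE_GBP   # ~£3,000
breakeven_years = UPFRONT_COST_GBP / annual_subscription

print(f"Driving hours per year:     {hours_per_year:.0f}")
print(f"Subscription cost per year: £{annual_subscription:.0f}")
print(f"Up-front price breaks even after {breakeven_years:.1f} years")
```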
 
I’m surprised the govt is using the term Self Driving.
Just politics. The govt decided a while back that they wanted the UK to be seen as cutting-edge in adopting self-driving technology. That's fine, but they didn't really look at the details of how hard it would actually be to achieve. Their timescales for implementation were also a complete joke. It didn't help that the likes of Musk thought self-driving would be done and dusted several years ago. Now the goalposts are starting to move and "self-driving" doesn't literally mean "self-driving" anymore.

It reminds me of some joker on PistonHeads a few years ago seriously arguing that within a year or so, a fully automated car would be "easily" capable of outpacing a human driver along UK B-roads. I'm even tempted to dig up the ancient thread and ask him when it's going to happen, lol.
 
It reminds me of some joker on PistonHeads a few years ago seriously arguing that within a year or so, a fully automated car would be "easily" capable of outpacing a human driver along UK B-roads. I'm even tempted to dig up the ancient thread and ask him when it's going to happen, lol.
From a computer-AI perspective, none of this was possible before compute reached the minimum requirements, around 2017.

Assuming adequate funding is applied, I predict a driving system significantly better than a human will be out within this decade. You can bookmark this and laugh at me in 2030 if you like.
 
Just noticed an article on the Top Gear website where VW are talking about charging €7 (about £6) an hour for Level 4 autonomous driving. No timescales for when this will be available (don't think they've gone past Level 2 yet), just that they think that's the price they'll need to charge to cover the cost of the hardware and be profitable. It does sound better value than Tesla's £6,800 up-front cost, unless of course you do a reasonable amount of driving (the break-even point is about 7 hours a week over 3 years). Will be interesting to see where Tesla pitch their subscription service, should that ever appear.

Tesla have talked about switching to a subscription model too. At £6 per hour it would take about 1,000 hours to match the current purchase price of FSD, and at an average speed of, say, 50 mph where it's likely to be usable, you'd need to be doing 50k miles on FSD to match the Tesla purchase price. It's probably not unreasonable.

Hmm, not very cheap for people who drive a lot and would benefit most from FSD. For me, for example, based on VW's charging it would cost around £3k per year. £6,800 up front makes more sense. IF we ever get it though :p
By my maths you're doing 18k+ miles a year with AP actively engaged; given when you can use it, you must be doing double that mileage in total, which isn't typical. You're also not on Level 4, not even Level 3, you're at Level 2, and if Tesla did ever get to deliver Level 4 then Musk has said he'd charge a lot more. You've really paid for a punt that you'll still own the car when/if it happens.
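Running the thread's own assumptions the other way, here's a quick sketch of how many FSD-active hours and miles a £6/hour subscription would have to cover before it matched the up-front price. Again these are just the round numbers quoted above, not anyone's actual pricing:

```python
# Reverse calculation: hours and miles of FSD use needed for a £6/hour
# subscription to add up to the £6,800 up-front price, at an assumed average
# speed of 50 mph where the feature is usable. Thread figures only.

HOURLY_RATE_GBP = 6
UPFRONT_COST_GBP = 6_800
AVG_USABLE_SPEED_MPH = 50   # assumption from the post above

breakeven_hours = UPFRONT_COST_GBP / HOURLY_RATE_GBP       # ~1,133 hours
breakeven_miles = breakeven_hours * AVG_USABLE_SPEED_MPH   # ~57,000 miles

print(f"Hours to match the up-front price: {breakeven_hours:.0f}")
print(f"Miles to match the up-front price: {breakeven_miles:.0f}")
# Roughly the ~1,000 hours / ~50k miles round figures quoted above.
```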
 
From a computer-AI perspective, none of this was possible before compute reached the minimum requirements, around 2017.

Assuming adequate funding is applied, I predict a driving system significantly better than a human will be out within this decade. You can bookmark this and laugh at me in 2030 if you like.
It's hard to know. Let's not forget that in nearly six years of development so far Tesla haven't been able to get windscreen wipers to work reliably. What they need isn't evolution, edging ever closer with diminishing returns; they need some form of step-change breakthrough that isn't just more grunt. Maybe the 4D modelling will be it, but I have my doubts as the sensor suite isn't there. I've worked in AI/NN/ML etc. for years. My university project 30 years ago was solving NP-complete problems using statistical methods rather than brute-forcing all the permutations, and the one key takeaway is that you never know what the right answer is, just what is statistically the best answer you can come up with in the time available. That is limited by local minima and ergodic barriers: the car (or whatever) simply can't see options beyond a defined set of parameters from its current best answer, resulting in stalemate. The current crop of techniques are derivatives of those mathematical approaches, and none of it was new even 30 years ago; the computing power just meant the problems had to be simpler.
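For anyone who hasn't met the "statistical rather than brute force" idea, here's a minimal, purely illustrative sketch (not the poster's project): simulated annealing on a tiny travelling-salesman instance. It usually finds a good tour without enumerating every permutation, but only a statistically good one, and it can stall near a local minimum exactly as described above:

```python
# Minimal simulated-annealing sketch for a tiny travelling-salesman problem.
# A statistical search finds a *good* tour without brute-forcing every
# permutation, but only the statistically best answer it reached in time -
# it can still get stuck near a local minimum. Purely illustrative.
import math
import random

random.seed(0)
cities = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(12)]

def tour_length(order):
    return sum(
        math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

order = list(range(len(cities)))
best = list(order)
temperature = 100.0

for step in range(20_000):
    i, j = sorted(random.sample(range(len(order)), 2))
    candidate = order[:i] + order[i:j + 1][::-1] + order[j + 1:]  # reverse a segment
    delta = tour_length(candidate) - tour_length(order)
    # Always accept improvements; accept worse tours with a probability that
    # shrinks as the "temperature" cools, which lets the search escape some
    # (but not all) local minima.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        order = candidate
        if tour_length(order) < tour_length(best):
            best = list(order)
    temperature *= 0.9995

print(f"Best tour found: {tour_length(best):.1f} (no guarantee it is optimal)")
```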

I worry when Musk talks about not being able to work out the truth when mixing multiple sensor types together, and hence dropping radar. It seems he'd prefer to place his bets on one sensor type (which ironically is three cameras for the straight-ahead view) rather than be able to resolve the differences. If he said the radar was just rubbish, badly calibrated, told them nothing, etc., then it would make more sense, but why on earth has it taken them six years to work that out? They're the same sensors today as back then. I think it's because the 4D approach has dumbed down the task, as it's too hard to resolve multiple sensors over time.
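For contrast, the textbook way of "resolving the differences" between two disagreeing sensors is to weight each estimate by how noisy it is (inverse-variance, Kalman-style fusion). This is a generic sketch with made-up numbers, not a claim about Tesla's actual pipeline:

```python
# Toy inverse-variance (Kalman-style) fusion of two disagreeing range
# estimates. Textbook sketch with made-up numbers, not Tesla's pipeline.

def fuse(estimate_a: float, var_a: float, estimate_b: float, var_b: float):
    """Fuse two noisy estimates of the same quantity.

    The less noisy sensor (smaller variance) gets the larger weight, and the
    fused variance is smaller than either input - the usual argument for
    resolving sensors against each other rather than discarding one.
    """
    w_a = var_b / (var_a + var_b)
    w_b = var_a / (var_a + var_b)
    fused = w_a * estimate_a + w_b * estimate_b
    fused_var = (var_a * var_b) / (var_a + var_b)
    return fused, fused_var

# Hypothetical readings: camera thinks the vehicle ahead is 48 m away (noisy),
# radar says 52 m (tighter). Neither number comes from a real car.
distance, variance = fuse(estimate_a=48.0, var_a=9.0, estimate_b=52.0, var_b=1.0)
print(f"Fused distance: {distance:.1f} m (variance {variance:.2f} m^2)")  # 51.6 m
```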

However if you're talking about the whole industry doing it then maybe somebody has something we're not aware of.
 
I worry when Musk talks about not being able to work out the truth when mixing multiple sensor types together, and hence dropping radar. It seems he'd prefer to place his bets on one sensor type (which ironically is three cameras for the straight-ahead view) rather than be able to resolve the differences. If he said the radar was just rubbish, badly calibrated, told them nothing, etc., then it would make more sense, but why on earth has it taken them six years to work that out? They're the same sensors today as back then. I think it's because the 4D approach has dumbed down the task, as it's too hard to resolve multiple sensors over time.

My biggest concern is that, for similar reasons to why they struggle to synchronise radar with the other sensors, they may have issues synchronising the cameras with each other. When I used to work in broadcast TV graphics, we often had to synchronise multiple video streams. It was dead easy, as a synchronisation signal was piped around with all the other signals so everything could sync to that. However, if you are trying to sync multiple sources with no way of synchronising them (can you synchronise the multiple cameras?), you are going to get very mixed results.

IIRC the cameras are running at 30 fps (just halve the distances for 60 fps, quarter them for 120 fps, etc.), so from a bit of schoolboy maths: if you are travelling at 60 mph (~27 m/s), the 1/30th of a second for each frame (33 ms) represents a distance window of approx 0.9 m. If you are looking at the output of 2 cameras, let alone 8, is that level of uncertainty in the position of objects in each frame going to be acceptable? That is getting close to half a car's width or a quarter of an average motorway lane.
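The same schoolboy maths, written out so the frame rates can be varied (the frame rates here are assumptions for illustration, not confirmed camera specs):

```python
# How far does a vehicle travel between frames at a given speed and camera
# frame rate? Frame rates here are illustrative assumptions, not confirmed
# specs for the car's cameras.

MPH_TO_MS = 0.44704  # metres per second in one mile per hour

def metres_per_frame(speed_mph: float, fps: float) -> float:
    """Distance covered during a single frame interval."""
    return speed_mph * MPH_TO_MS / fps

for fps in (30, 60, 120):
    print(f"At 60 mph and {fps:3d} fps, one frame covers "
          f"{metres_per_frame(60, fps):.2f} m")
# 30 fps -> ~0.89 m (the ~0.9 m window above); doubling the frame rate halves it.
```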

Of course, Tesla may have the ability to synchronise their sensors, so this may not be an issue.
 
My biggest concern is that, for similar reasons to why they struggle to synchronise radar with the other sensors, they may have issues synchronising the cameras with each other. When I used to work in broadcast TV graphics, we often had to synchronise multiple video streams. It was dead easy, as a synchronisation signal was piped around with all the other signals so everything could sync to that. However, if you are trying to sync multiple sources with no way of synchronising them (can you synchronise the multiple cameras?), you are going to get very mixed results.

IIRC the cameras are running at 30 fps (just halve the distances for 60 fps, quarter them for 120 fps, etc.), so from a bit of schoolboy maths: if you are travelling at 60 mph (~27 m/s), the 1/30th of a second for each frame (33 ms) represents a distance window of approx 0.9 m. If you are looking at the output of 2 cameras, let alone 8, is that level of uncertainty in the position of objects in each frame going to be acceptable? That is getting close to half a car's width or a quarter of an average motorway lane.

Of course, Tesla may have the ability to synchronise their sensors, so this may not be an issue.
It's a good point you make. Maybe they don't overlap the cameras and just work with different resolutions, i.e. the long-range forward camera fills in the middle bit at high resolution, the edges come from the medium-range camera and the sides from the wide angle, but that approach then unpicks the redundancy aspect of overlapping camera feeds, which was also part of the idea. The world is also seen through a monocle, without the depth perception we get with two eyes (we know people drive with one eye, but I suspect that's more a nod to civil liberties than a belief they're just as safe), so how they're working out depth reliably from an image is anybody's guess.
 
I don't think depth perception is an inherent problem, simply because I have very poor sight in one eye (it basically only provides peripheral vision, with no focus). Despite that I am naturally very good at ball sports (tennis, football, etc.) and have no issues judging distance while driving. My brain simply works it all out despite the lack of stereo vision.
 
AP fails at depth perception all the time; it's why lorries jump from one lane to the other: it can't tell whether it's a big lorry further away or a smaller one close to you.

Radar will give you distance, but Elon doesn't want to use that, so it's entirely possible FSD will have the same problem.
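A quick illustration of why a single image alone can't settle "big lorry far away vs small lorry close": under a simple pinhole model the apparent size depends only on the ratio of real size to distance. The focal length and widths below are made-up numbers, not Tesla camera parameters:

```python
# Why one image alone can't separate "big lorry far away" from "smaller lorry
# close by": with a pinhole camera model, apparent width in pixels depends only
# on real width divided by distance. All numbers below are made up for
# illustration; they are not Tesla camera parameters.

FOCAL_LENGTH_PX = 1000.0  # assumed focal length, in pixels

def apparent_width_px(real_width_m: float, distance_m: float) -> float:
    """Projected width of an object of the given real width at the given range."""
    return FOCAL_LENGTH_PX * real_width_m / distance_m

big_far = apparent_width_px(real_width_m=2.5, distance_m=50.0)      # large lorry, far away
small_near = apparent_width_px(real_width_m=2.0, distance_m=40.0)   # smaller lorry, closer
print(big_far, small_near)  # both 50.0 px: indistinguishable in a single image
```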
 
I struggle to understand how we can get reliable L3 or L4 in the UK within a reasonable timeframe even though I already use Autopilot on motorways, dual carriageways and some A roads. Occasional erratic behaviour means I will continue to keep my hand on the wheel regardless of this forthcoming legislation for traffic queue situations.

Moving on from this to true 'self driving'....

Given the historically complex nature of our road network & the unpredictability of some drivers, can artificial intelligence adapt and be versatile to a level of competence that matches a human brain? AI does what it is programmed to do, & a skill we all possess that it doesn't is critical thinking. It's OK performing repetitive, routine tasks, but is it capable of acting when faced with eventualities that go beyond what it was programmed for?

…giving way or not when faced with oncoming traffic in a narrow road or an obstruction, adapting and avoiding grids, covers, potholes, vans & lorries unloading, even deep puddles etc. Just as importantly, deciding when it's safe to skirt round them or maybe stop to let an oncoming vehicle pass first? What about when 'we' predict the likely actions of children or someone beckoning from the pavement for us to stop or proceed? The number of variables is almost endless and a so-called 'Robo Taxi' must address them all.

Humans get gut feelings, think outside the box, see different ways to approach a problem & make connections when they aren’t visible. For AI, solving problems where rules don’t exist is going to be very difficult & time consuming. New situations will initially require human intervention to deal with, update, fix or introduce new programming.

Although artificial intelligence is capable of remarkable things, when will it be truly able to match a human brain on our particular roads? (Or will it require roads to ONLY be used by autonomous vehicles?)

The biggest conundrum for me is that a Tesla is an easy, relaxing vehicle to drive but also powerful enough that I want the enjoyment of being in control and using it. Therefore I see it as a retrograde step handing over to a more nervous, sometimes unpredictable ‘learner driver’ where I need greater readiness in order to intervene.

I'm not saying it isn't possible to eventually solve these issues, but if I wanted to be a beta tester I wouldn't choose to pay someone else thousands more for the privilege. However, already in my mid-60s, if it reaches true L4 at a faster rate than I regress to L2, then I probably would take the plunge for FSD on a future vehicle.
 
AP fails at depth perception all the time; it's why lorries jump from one lane to the other: it can't tell whether it's a big lorry further away or a smaller one close to you.

Radar will give you distance, but Elon doesn't want to use that, so it's entirely possible FSD will have the same problem.
I agree. I was just commenting that its issues are not a limitation of mono-vision. It's just not clever enough to build an accurate 3D model of the world like our brains do effortlessly with one or two eyes. When you shut one eye the world around you doesn't suddenly turn into a flat 2D surface.
 
I struggle to understand how we can get reliable L3 or L4 in the UK within a reasonable timeframe even though I already use Autopilot on motorways, dual carriageways and some A roads. Occasional erratic behaviour means I will continue to keep my hand on the wheel regardless of this forthcoming legislation for traffic queue situations.

The important thing to remember about L3 and L4 is that they have a limited operational domain, which will be dictated by the vehicle manufacturer and what approval they obtain for it. I personally think that L3 would be obtainable for the same operational domain that covers NoA. It's not there yet and Tesla would need to devote continued effort to achieve it, but certainly not as much effort as they are putting into City Streets. There are many aspects of the City Streets beta that would greatly benefit the existing Highways stack, but I have recently started to fear that Tesla think Highways is done, which it clearly is not. The short-term aspirations of the UK (and other jurisdictions) for L3 approval would give Tesla something to aim for, but I suspect other manufacturers have similar offerings for which they have not yet shown their full hand. The difference between L3 and L4 is actually quite small, boiling down largely to the reliability of the system and where responsibility lies.

Other than Tesla's apparent lack of focus on the aspects that could bring L3 to the roads, my other concern is how well Tesla will play the game of getting cars approved for L3 and ultimately L4. I really cannot see the iterative approach to vehicle releases that Tesla currently uses fitting in with the ever stricter route to approval that I am sure will safeguard official L3 and L4 solutions. If Tesla think they can constantly tweak their software as they do now without redoing the approvals process, I suspect they will need a rethink of their approach. It may be that there will be multiple release cycles going on: one as now, where Tesla undertake ad-hoc releases, and one for FSD vehicles that is far less frequent yet approved for L4 (and possibly L3) use.
 
AP fails at depth perception all the time; it's why lorries jump from one lane to the other: it can't tell whether it's a big lorry further away or a smaller one close to you.
From what I have read this could be solved in the upcoming V9 release, where the "actual probability distribution of objects" should give a smoother representation of the real world. More details were in this Electrek article:

Elon Musk hypes Tesla Full Self-Driving Beta driving visualizations with new update - Electrek

The lorry jump is a PITA... so I'm hoping it's true, and that some of these architectural changes to Autopilot/FSD appear in UK builds soon, ahead of any eventual City Streets beta release.
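One generic way to read "actual probability distribution of objects": instead of a single hard depth estimate, a network can output scores over depth bins and report a full distribution with a mean and a spread. A sketch of the idea only; nothing here is Tesla's V9 architecture:

```python
# Generic sketch of a "probability distribution over object position": scores
# over depth bins turned into a distribution with a mean and a spread. The
# numbers and the fake "network output" are illustrative only.
import numpy as np

depth_bins = np.linspace(5.0, 100.0, 20)              # candidate depths (m)
logits = -0.5 * ((depth_bins - 42.0) / 8.0) ** 2      # stand-in network scores

probs = np.exp(logits - logits.max())
probs /= probs.sum()                                   # softmax over the bins

expected_depth = float((probs * depth_bins).sum())
spread = float(np.sqrt((probs * (depth_bins - expected_depth) ** 2).sum()))
print(f"Expected depth {expected_depth:.1f} m, +/- {spread:.1f} m")
# A wide spread flags "not sure whether it's near or far" rather than snapping
# between two confident guesses, which is what the lorry-jump looks like.
```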
 
It's hard to know. Let's not forget that in nearly six years of development so far Tesla haven't been able to get windscreen wipers to work reliably. What they need isn't evolution, edging ever closer with diminishing returns; they need some form of step-change breakthrough that isn't just more grunt. Maybe the 4D modelling will be it, but I have my doubts as the sensor suite isn't there. I've worked in AI/NN/ML etc. for years. My university project 30 years ago was solving NP-complete problems using statistical methods rather than brute-forcing all the permutations, and the one key takeaway is that you never know what the right answer is, just what is statistically the best answer you can come up with in the time available. That is limited by local minima and ergodic barriers: the car (or whatever) simply can't see options beyond a defined set of parameters from its current best answer, resulting in stalemate. The current crop of techniques are derivatives of those mathematical approaches, and none of it was new even 30 years ago; the computing power just meant the problems had to be simpler.

I worry when Musk talks about not being able to work out the truth when mixing multiple sensor types together, and hence dropping radar. It seems he'd prefer to place his bets on one sensor type (which ironically is three cameras for the straight-ahead view) rather than be able to resolve the differences. If he said the radar was just rubbish, badly calibrated, told them nothing, etc., then it would make more sense, but why on earth has it taken them six years to work that out? They're the same sensors today as back then. I think it's because the 4D approach has dumbed down the task, as it's too hard to resolve multiple sensors over time.

However if you're talking about the whole industry doing it then maybe somebody has something we're not aware of.
It's not about radar being rubbish, just about the video data now being interpreted as much more accurate by the current analysis. The radar now provides a very low "importance score" in most models. I wonder how well this has been tested in poor weather.
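For anyone wondering what an "importance score" for an input like radar might mean in practice, one common generic measure is permutation importance: shuffle one input across the data and see how much the model's error grows. Toy model and synthetic data below; nothing Tesla-specific:

```python
# Permutation importance: shuffle one input across the dataset and measure how
# much the model's error grows. A low score means the model barely uses that
# input. Toy model and synthetic data; nothing Tesla-specific.
import numpy as np

rng = np.random.default_rng(0)
n = 2_000
camera = rng.normal(size=n)                          # stand-in camera feature
radar = 0.95 * camera + 0.05 * rng.normal(size=n)    # mostly redundant with camera
target = 3.0 * camera + rng.normal(scale=0.1, size=n)

def model(cam, rad):
    # a pretend "trained" model that leans almost entirely on the camera input
    return 2.9 * cam + 0.1 * rad

def mse(cam, rad):
    return float(np.mean((model(cam, rad) - target) ** 2))

baseline = mse(camera, radar)
for name, cam, rad in [("camera", rng.permutation(camera), radar),
                       ("radar", camera, rng.permutation(radar))]:
    print(f"{name:>6} importance: {mse(cam, rad) - baseline:.3f}")
# Shuffling the camera input wrecks the prediction error; shuffling radar
# barely moves it, i.e. radar gets a very low importance score.
```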
 
Occasional erratic behaviour
I've really not seen this while just cruising down a motorway; frankly it's fine, and I'm simply not getting involved until a lane change is required.
is it capable of acting when faced with eventualities that go beyond what it was programmed for
It's not being 'programmed for' anything; that's what makes this AI rather than what we've had so far (which has far more programmed behaviour). It's been trained with real-world experience, much like how human drivers learn. I would add that there is a wide range of driving abilities on the road, like the utter morons blocking yellow boxes on my commute this morning. It's common for us to overestimate our own ability and overlook the times we mess up; just look how many accidents there are every day, each of them a human reaching the limit of their ability. I'll be happy to trust a machine, just as I'm happy to trust a plane, train, tube, etc.

For me, FSD being able to pootle along a motorway while I'm present but doing something else would be a huge win; long journeys are just long sit-downs. FSD is a big help already, and once these new regulations get extended to 70 mph I'll be happy. It's not really an irritation for me to have to drive on roads off the motorway; I can see how that is necessary for Elon's robotaxi vision, but not for my passenger car.
 