Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Self-driving cars (level 3) to be allowed on UK roads

It's not being 'programmed for'; that's what makes this AI rather than what we've had so far (which has far more programmed behaviour). It's been trained with real-world experience, just like how human drivers learn.
That's a fundamental misunderstanding of AI. Even AI is configured to make decisions within a defined set of parameters, and that's not "go drive a car"; it will be broken down into a number of logical components that get joined together.

There's a stage building a picture of the environment: what's around it, what the classification of those objects is, and so on. That's where AI can play a part, because it can take images and compare them to others to work out what's a car, what's a road and where its edges are, what's a pedestrian, what's a traffic light, what colour the traffic light is, etc. Once it has the environmental map it can then apply a mixture of rules-based skills (maximum speed, where to position itself in the lane, etc.). After that you can layer navigation and lane skills on top to determine whether it wants to change lane.
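The layered decomposition described here (perception, then rules, then navigation) can be sketched as a toy pipeline. All class and function names below are invented for illustration; this is not anyone's actual architecture:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    kind: str        # e.g. "car", "pedestrian", "traffic_light"
    position: tuple  # (x, y) in metres, ego-relative
    state: str = ""  # e.g. "red" for a traffic light

def perceive(camera_frames):
    """Perception layer: classify objects and build an environment map."""
    # In a real system this would be one or more learned models;
    # here the frames are pretended to be pre-labelled.
    return [TrackedObject(**obj) for obj in camera_frames]

def plan(env, speed_limit_mph):
    """Rules layer: apply hard constraints on top of the environment map."""
    target_speed = speed_limit_mph
    for obj in env:
        if obj.kind == "traffic_light" and obj.state == "red":
            target_speed = 0
    return {"target_speed": target_speed, "lane_offset": 0.0}

def navigate(plan_out, route):
    """Navigation layer: decide lane changes etc. on top of the plan."""
    plan_out["change_lane"] = route.get("exit_soon", False)
    return plan_out

# The layers compose: perception -> rules -> navigation.
frames = [{"kind": "traffic_light", "position": (40.0, 2.0), "state": "red"}]
decision = navigate(plan(perceive(frames), speed_limit_mph=30), {"exit_soon": False})
```

The point of the layering is that each stage can be developed and validated independently, rather than one opaque model emitting steering commands.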

They've previously talked about a model that was purely designed to determine whether a car in an adjacent lane might enter the lane you are in, trying to pick up on the nuances of the behaviour of a driver in a different lane: reflecting, for example, that cars drift towards the inside of the lane on a curve. But what about a driver drifting because of a crosswind, or a large puddle they've spotted and want to avoid, or a car a further lane over that the car in the adjacent lane decides to give a slightly wider berth to? None of those are likely to result in them coming into your lane. The level of finesse on just this one aspect of driving, judging what the car next to you might do, is already monumentally complex, as is the data you'd need to take in to train it. That specific AI model won't pick up on many of those aspects unless it is fed all the potential inputs, including the observed behaviour of that driver from the first time it appeared (have you never spotted somebody driving erratically from a distance, and they've stood out to the extent that you've paid extra-special attention to them?).

What about the things you've never seen before but instinctively know what to expect from: an accident on the opposite carriageway, a chunk of wood in the road, a fishtailing trailer in front of the car next to you but not you? This is the local-minima issue: you assume that your inputs and freedom of response cater for everything, but what about the stuff that has never been seen, or happens so infrequently that the system hasn't learnt how to deal with it?
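To make the point concrete, even a crude hand-rolled score for "will the adjacent car cut into my lane?" needs a surprising number of distinct inputs, and each "explaining away" factor has to be observed to be discounted. The feature names and weights below are entirely invented for illustration:

```python
# Invented feature weights for a toy linear cut-in score.
FEATURES = {
    "lateral_drift_towards_me_mps": 2.0,   # drifting my way: strong signal
    "indicator_on": 1.5,
    "curve_inside_drift": -1.0,            # expected drift on a bend: discount
    "crosswind_gust": -0.8,                # drift explained by wind: discount
    "avoiding_obstacle_other_side": -1.2,  # puddle / wide berth on the far side
    "erratic_history_score": 1.0,          # driver looked erratic earlier on
}

def cut_in_score(observation: dict) -> float:
    """Weighted sum over whichever inputs we happened to observe."""
    return sum(FEATURES[k] * observation.get(k, 0.0) for k in FEATURES)

# A car drifting towards us, but only because it is giving a wide berth to
# something on its far side: the naive drift signal is largely explained away.
obs = {"lateral_drift_towards_me_mps": 0.5, "avoiding_obstacle_other_side": 1.0}
print(round(cut_in_score(obs), 3))  # ≈ -0.2, i.e. probably not a cut-in
```

Any input that is missing from the observation simply contributes zero, which is exactly the failure mode described above: the model can't discount a cause it was never fed.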

It's far too easy to think you can just stick all the data into one big model and expect it to pop out an answer telling the car what to do, but it really isn't that simple.
 
Never mind FSD, AP needs to start being perceptive & thinking ahead. For example, seeing brake lights two cars ahead and easing off, or not accelerating towards a roundabout when there's no one in front. Phantom braking has never happened to me, but those two scenarios bug me & put me off AP. My BMW was much better.
 
Isn't autosteer only meant to keep you in lane and TACC only meant to maintain your speed based upon the car in front and a few other hints?

FSD has the feature to slow and stop you for a roundabout, traffic lights, stop line etc. Not basic Autopilot.
 
AP is a dead branch. Tesla might have an intern on it, but from what I've seen probably not even that, so 'AP' isn't going to start doing anything.

FSD has the ability to be much more responsive once it's applied to all cases, not just city streets. Hopefully that will include fixing phantom braking.
 
...TACC only meant to maintain your speed based upon the car in front and a few other hints?
The TACC part of AP does seem to work very well now & adjusts for speed signs on the local roads where I use it. I'm also perfectly happy if Autopilot has reached its final state (other than solving an occasional phantom braking incident, which I am prepared for whilst using).
 
It makes more sense for them to be the same code base, just with features disabled/enabled. They have AP, EAP and FSD; trying to maintain all three and make each increasingly safe separately doesn't make sense, and leaving AP high and dry and unreliable will give people no confidence to pay out lots more for the higher options. Musk has already said he expects the 4D changes to make phantom braking less likely.

You could use those with AP in a shadow mode to refine various aspects of the models: you watch what real drivers do and see how it compares to the predictions the car makes, and the discrepancies can then be analysed. As ('if' may be more accurate) FSD actually becomes real, having a parallel fleet of cars without it operational but running in shadow mode would be a great way of reacting to changing road behaviours without the FSD failing. It would almost become a case of those who haven't paid helping the continual refinement for those who have.
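The shadow-mode idea described here is simple to sketch: the model predicts an action, the human actually drives, and only the disagreements are recorded for later analysis. All names below are invented for illustration:

```python
# Toy shadow-mode comparator: log only the cases where the model's
# proposed action differs from what the human driver actually did.

def shadow_compare(model_action: str, driver_action: str, context: dict,
                   discrepancy_log: list) -> bool:
    """Return True (and record the context) when model and driver disagree."""
    if model_action != driver_action:
        discrepancy_log.append({
            "model": model_action,
            "driver": driver_action,
            "context": context,
        })
        return True
    return False

log = []
shadow_compare("brake", "brake", {"t": 0}, log)       # agreement: nothing logged
shadow_compare("hold_speed", "brake", {"t": 1}, log)  # driver braked, model didn't
print(len(log))  # 1 discrepancy recorded for analysis
```

The appeal of the approach is that the expensive part (finding informative cases) is done for free by ordinary driving: only the discrepancies need uploading and analysing.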
 
Yea, but only FSD has the feature-set functionality to stop you for stop lines, roundabouts etc., which is what @Yachtsman seems to be wanting it to do.
No, perhaps I didn't explain my experiences. AP works fine as long as there is a vehicle in front within the tracking range, i.e. 2/3/4 car lengths. However, when that space in front suddenly increases (the car in front moves out of your lane, there's e.g. 100 yards to the next vehicle, and your max speed is above your current speed: a typical motorway situation), the car races to catch up with the vehicle in front, even though several vehicles' brake lights are showing, and then brakes much too late.
Whereas a human driver will anticipate the issue ahead, reduce the acceleration and brake gently.
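The anticipation being asked for here can be caricatured in a few lines: instead of racing into a newly opened gap, cap the acceleration whenever brake lights are visible further down the queue. Thresholds and names below are invented for illustration:

```python
# Toy longitudinal decision: look beyond the immediate lead vehicle.

def target_accel(gap_to_lead_m: float, brake_lights_ahead: int,
                 current_mps: float, set_mps: float) -> float:
    """Return a requested acceleration in m/s^2 (very simplified)."""
    if brake_lights_ahead > 0:
        return -0.5            # traffic is slowing ahead: gently ease off
    if gap_to_lead_m > 50 and current_mps < set_mps:
        return 1.0             # genuinely clear road: close the gap briskly
    return 0.0                 # otherwise hold speed

# Gap suddenly opens to ~100 yards but several brake lights are visible:
# a naive controller would accelerate; this one eases off instead.
print(target_accel(gap_to_lead_m=90.0, brake_lights_ahead=3,
                   current_mps=28.0, set_mps=31.0))  # -0.5
```

The design point is that the brake-light check takes priority over the gap check, which is exactly the ordering the naive "race to the set speed" behaviour gets wrong.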
 
Look, self-driving cars are never going to happen. This is as good as it gets. The insurance costs alone are going to be prohibitive. There are unlimited fines for a death in the workplace, for example; how does that translate to cars killing people?

I personally think Tesla are disgraceful for using terms such as 'Full Self-Driving' and 'Autopilot'. It's neither.
 
It’s marketing crap. If I think 'autopilot', I think of a plane flying itself. Teslas don’t drive themselves.
You can blame that on Hollywood.

An autopilot is a system used to control the trajectory of an aircraft, marine craft or spacecraft without requiring constant manual control by a human operator. Autopilots do not replace human operators. Instead, the autopilot assists the operator's control of the vehicle, allowing the operator to focus on broader aspects of operations (for example, monitoring the trajectory, weather and on-board systems).

 
I am going to agree with UrbanSplash. Saying 'FSD' and claiming that cars will be driving around by themselves with no one sitting behind the steering wheel is not going to happen unless all the cars on the roads are like that and they can communicate with each other.
From my point of view I don't need this. At the moment I am using these features for 80% of my daily commute. I don't mind holding the steering wheel; it actually keeps me focused for any potential Autopilot screw-ups. If version 9 and potentially the FSD beta manage to cover the remaining 20% (which is mostly roundabouts and slip-road exits) then I will be a happy man. Even if that means I need to keep one hand on the wheel :)
 
I was under the impression that Autopilot on a plane means the pilot doesn't need to constantly watch altitude, speed and course and can do other tasks, which is not the same as intently supervising those same tasks. In effect, Autopilot on a plane is like Level 4 in a car.
 
Are you arguing that Tesla Autopilot does not fit the dictionary definition because the definition implies a lesser degree of supervision than is actually required with Tesla Autopilot?
 
Modern avionics are perfectly capable of flying and landing the aircraft within the avionics' design parameters. The link referred to mentions general aviation (i.e. not the big commercial stuff). The pilots of heavies are there in case of malfunctions (increasingly rare with 5+ redundant systems) and for when weather conditions and ground failures exceed the avionics' design: crosswind limits and suchlike.
Tesla has done a good job of selling futureware to the gullible among us (yeah, me too). The more I play with it, the less impressed I am with the current capabilities, the speed of improvement and the hardware used. If they ever got it licensed for robotaxis, there would be heaps of folk waiting by the roadside for a replacement cab when the taxi runs out of talent and stops.
 
I'm saying that on an aeroplane the pilot hands over some functions to the system, relieving the pilot of those functions, and their role becomes a periodic inspection of the functions' operation, not a continual review of successful operation. If Tesla Autopilot meant that as a driver you didn't need to worry about lane keeping, speed control and even junctions, but instead focused on range, fuel economy, the weather forecast where you were driving, traffic and whether to take a diversion, or deciding whether to reduce your speed to 5 mph below the speed limit to conserve fuel, then I'd agree they share the definition. But they don't.

Level 3 and 4 autonomous driving is where the system hands back control to the driver in certain situations; the driver still needs to be there to take control when required. That feels much more like a plane's autopilot system.