Welcome to Tesla Motors Club

FSD / AP Rewrite - turning the corner?

It's quite frustrating that we're seeing drives on massive four-lane roads with almost no traffic (either empty roads or the kind of traffic you might see on a quiet Sunday morning), mostly at night, instead of a busy city. It's supposed to be city driving, but I'm really not seeing anything that AP wouldn't do, except for the turning left and right, which is impressive when you first see it, but by the 20th time it gets a bit dull.

They've had it since Tuesday, and nobody has filmed their morning commute yet...

Also, does it fix the current system's failings: phantom braking, randomly deciding the motorway is 30 mph, NoA having zero lane discipline?
 
No. I agree, your interpretation was mine too. I kind of assume that the guy had recently updated and was probably seeing general improvement. I’d say 40.8 is better than 36.12. Not perfect, but better.
I am very encouraged by the obvious progress we are seeing in the Tesla FSD system with the first glimpse at what the new 4D rewrite can do. The fears that 'it will never be able to turn at junctions', 'never deal with going around a roundabout', or 'never cope with cars parked at the side of a road' (something that especially irritates me on local UK roads), look like they have been pretty much answered by this new beta rewrite - at least in principle.

Big question now, I think, is whether we will finally start to see the 'exponential rate of learning and improvement' from the new 4D system, which Elon promised all the way back on Autonomy Day. As much as I am an enthusiast and have enjoyed seeing the steady small improvements in FSD features since I got my M3 at the end of 2019, the rate of improvement so far has been slow and incremental, and very far from 'exponential'. To get the system from its current state to being demonstrably safer than most human drivers, the 4D rewrite now needs to show that the neural net is indeed capable of much, much faster learning and improvement, to cope with all of the 'edge cases' that currently trip up self-driving cars.

So what do you think: will things start to improve rapidly now? Or will the system just be much better but quickly end up trapped in a new 'local maximum', leaving us with a much improved system that still falls far short of reliable 'Full Self Driving'?
 
It’s a reasonable bet that it will improve at a fairly rapid rate in the US. The question is how well that learning will fan out across the globe.
I was pleased to see that kerbs, and in particular traffic islands, were seen and avoided, as well as parked vehicles. One thing I'm conscious of is the car's desire to achieve the maximum speed limit where possible. The AI has to learn otherwise from mapping and road conditions.
It’s early days yet...
 
As a Brit now living in Australia, I see how much the two countries' road difficulties match, and differ from the US. (I had a car in the US for 12 years so am familiar with the differences there, too).

It has to be said that the Tesla equipment has done remarkably well so far on pure motorway driving - with the odd PBs and other anomalies - and to some extent within its documented limits on ordinary roads, too.

What has bothered me, and made allowing my car to drive itself on these "ordinary" roads quite stressful, is the lack of "understanding" of the conditions and comfort that it displays at present. For example, by keeping rigidly to the lane's centre, it takes no account of the way a competent driver unconsciously uses the road width to "iron out" bends. We naturally ease to the left when approaching a right-hand bend, close in on the right of the safe area in the middle, and exit towards the left side again. The way the car does it makes me, for one, very uncomfortable, as does the way it charges into the corner without changing speed, as if it's unaware of what it's going to find halfway round.

Perhaps the magic upgrade will begin to address these concerns amongst others, perhaps the psychological comfort of the driver and passenger isn't a high priority; I just don't know.

From the few videos I've seen on the new rewrite it looks more like a video game than something designed to give the driver relevant information, but we will see in good time.
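The "ironing out bends" behaviour described above (out wide on approach, in tight at the apex, out again on exit) can be caricatured as biasing lateral position against the upcoming curvature. This is purely a toy illustration of the idea, nothing to do with how Autopilot actually plans its path, and all the numbers are invented:

```python
# Toy sketch (invented numbers, not Tesla's planner): a driver who
# "irons out" a bend shifts laterally against the upcoming curvature,
# using spare lane width to increase the effective corner radius.
# Convention: positive curvature = right-hand bend;
#             positive offset    = shift toward the right of the lane.

def lateral_offset_m(curvature_ahead: float, max_offset: float = 0.4,
                     gain: float = 50.0) -> float:
    """Offset from lane centre: ease left (negative) into a right bend,
    clamped to the usable spare width either side of centre."""
    raw = -gain * curvature_ahead            # oppose the bend direction
    return max(-max_offset, min(max_offset, raw))

# Approaching a right bend of radius 100 m (curvature 0.01 per metre):
# ease left before turn-in, limited by the available lane width.
print(lateral_offset_m(0.01))   # -0.4 (clamped)
```

A rigid lane-centring controller is the special case `gain = 0`: it returns a zero offset whatever the road ahead is doing, which is exactly the behaviour being complained about.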
 
First off, may I respectfully suggest that we all avoid using derogatory labels like 'fanboy' on this forum? I appreciate reading contrasting views and opinions - and especially from experienced and knowledgeable people such as yourself and others who regularly post here, regardless of whether I may, or may not, agree with you on a specific point.

In this specific case I don't have a strong opinion about how NCAP chose to evaluate different driver assist features on different makes and models and I don't really care much about their assessment either. I think most of these reports and assessments that are comparing new technologies inevitably have a degree of 'apples vs oranges' comparison built in to them. They definitely have a very arbitrary weighting based upon how you define 'categories' and what relative score you assign to separate categories. You can in this way get radically different outcomes from many types of evaluation just by altering the categories you arbitrarily define and the scoring system (also arbitrary) you choose.

I would say that at present we are still very much in an intermediate phase of evolving technology for autonomous driving, and with many parallel systems developing quickly and changing rapidly, we don't know which one - or maybe many - will end up dominating the market in future. It could be that one system 'wins' - think VHS wiping out Betamax for video tapes, for example - and that does not even have to mean that the 'winner' is truly technically superior in all respects (I am not at all sure VHS was technically superior, for example). Maybe no single solution will win, and the consumer then 'wins' as they get a wider choice of successful, great products that in reality all do a really good job, but with some differences - think Nikon vs Canon DSLRs, for example. Which you choose will then depend upon personal preferences between a range of features and factors, so there is no need to conclude that one is objectively better than the other. It depends upon what matters most to you.

As of today, my personal preference is for the Tesla system and in my opinion it is the one most likely to rapidly evolve into a comprehensive autonomous system. I may be proved wrong about that. Also, my preference and opinion does NOT make me blind to current limitations with the Tesla system (I also hate phantom braking!) but I guarantee all of the other competing systems currently have limitations also and are not at all 'feature complete' in a way some posts might imply. Regardless, if anyone simply prefers driving an alternative solution from another manufacturer to Tesla and finds that works better for them - fine with me! I am also genuinely interested to hear why you prefer it and what features and experiences most affect your evaluation.

With respect, the NCAP assessment is about driver-assist capabilities on cars today. It is very much apples with apples. There is a whole separate discussion on which systems will and won't evolve quickly, but in the middle of a phantom braking moment the future potential of a system is going to be of little consequence as the car behind is confronted with your car rapidly slowing down.

Tesla's ethos is to do their development in public. It builds brand affinity with owners by making them feel part of the process, rather than their being handed a system that has been rigorously tested and matured, and as a result has a high degree of ability and reliability. If we wanted to compare the state of the art in development, we'd compare the current Tesla offering with the generation that's currently in the lab at Mobileye, which has been happily driving around Israel for a year and makes even the Tesla beta software look a tad clunky. This thread and the NCAP assessment aren't about that; they're discussing the assistance in the car today and how well it performs. It's also not about autonomous driving and the future.

I'm not sure how you'd like me to describe someone who refuses to accept an independent study into the current systems without any personal experience of the others, and simultaneously throws doubt on the veracity of the weighting system while using one of those metrics as a proof point for Tesla. I'm all for healthy debate, but sadly on here there are people who refuse to acknowledge any criticism of Tesla, or, perhaps a little more charitably, who might be described as making their point poorly. There also seems to be a gap in comprehension of what these other systems can actually do, which doesn't stop people happily dismissing them.

So back to the case in point. We have no knowledge of the independence of those given the software, and we have seen videos of Tesla's capabilities before which have failed to transition into production. The function set appears to have been expanded, especially with regard to turning across traffic. The 'city driving' nature appears, on reflection, to be linked to lower driving speeds, as none of the videos I've seen show cars doing any real speed. You have to start somewhere, and in fairness that is the logical place, but increasing speed is more than just raising a threshold: it means greater sensor range and monitoring, and a squared-law increase in object recognition. It will come in time, but again, it's not here today.

My angst with the Tesla system is less about the list of features than about the performance and reliability of the features it has. I personally feel much safer in a car that does less but does it more reliably, although it should be noted that the feature lists are near identical at the moment (excluding the beta software). And this is where opinions start to differ, and where others start to compare apples with oranges by not looking at the current system when discussing the NCAP findings. One camp places great confidence in the idea that the current, variably performing (in some cases the best, until it throws a wobbly) but expansive feature set means being closer to full self-driving. The other camp looks at the performance today (the lack of feedback on what the car is thinking, the random braking, the missed speed limit changes, the variability in windscreen wipers and automatic full beam, etc.) and compares that to the relatively rock-solid and highly communicative performance of the latest systems from the competition, as reported by NCAP, even if those systems will plateau and make no claims of ever being self-driving.
I think the pro-Tesla side would do well to recognise the strengths of these other systems, be receptive to other ways of engaging with the driver if and when they see something beneficial, and lobby for their inclusion. There is no monopoly on good ideas.

The beta software shows promise, but it also reminds me in some ways of the first EAP systems from 2016 and the journey is far from being an easy one.
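To put a rough number on the squared-law point above: if the car must always be able to stop within the distance its perception covers, the required forward range grows with the square of speed. The deceleration and reaction-time figures below are my own illustrative assumptions, not anything published by Tesla or NCAP:

```python
# Back-of-envelope sketch (assumed figures): required forward detection
# range if the car must be able to stop within what it can see.
# Braking distance is v^2 / (2a), so doubling speed roughly quadruples
# the range the perception stack must cover.

def required_range_m(speed_mph: float, decel_ms2: float = 6.0,
                     reaction_s: float = 0.5) -> float:
    """System reaction distance plus braking distance v^2 / (2a)."""
    v = speed_mph * 0.44704          # mph -> m/s
    return v * reaction_s + v * v / (2 * decel_ms2)

for mph in (25, 50, 70):
    print(f"{mph} mph -> ~{required_range_m(mph):.0f} m of sensing needed")
```

The braking term at 50 mph is four times the one at 25 mph, which is why a modest increase in operating speed puts a disproportionate demand on sensor range and object recognition.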
 
Perhaps the psychological comfort of the driver and passenger isn't a high priority; I just don't know.
Unfortunately, I don't think interim occupant comfort is even on the priority list, never mind high up on it. Otherwise we would have a better way of interacting with the wipers while they improve (although I'm happy with them now...), and we would be able to use the wheel while on AP to indicate position in lane, or to choose a lane when transitioning from single to dual. Possibly better insulation from the outside world, and so on.

I've not watched all the videos, but those I have seen are encouraging. They show the building blocks needed, while gathering more data.

Bearing in mind it's a bunch of tech that is really only just out of the research labs, being able to see the development like this is pretty unusual. It gets more complicated when people have staked their cash on it, which is why you start getting people suing Tesla for breach of contract etc. It's all a little Kickstarter; time to sing the early-adopter song! At least we get some updates, unlike other Kickstarter projects I've backed.

It does look like vision is pretty much solved, though I don't think any of the videos were good enough for us to comb over looking for missed objects. And I think Andrej or Elon said that the driving decisions are still 'software 1.0' at the moment, i.e. hand-coded, not learnt. Guess that will be the next step, with more data and Dojo. I do think there has to be a better way for us to contribute to that, though, to help the learning.
 
This one is quite good

'normal' city roads, winding roads... It also shows the failures, which are the things to watch as the beta matures. The country road he follows isn't dissimilar in curvature to my commute, and I know current AP can't handle it (it doesn't slow down for the corners, understeers, and tries to drive into walls). This seems much better.
 
With respect, the NCAP assessment is about driver-assist capabilities on cars today. It is very much apples with apples. There is a whole separate discussion on which systems will and won't evolve quickly, but in the middle of a phantom braking moment the future potential of a system is going to be of little consequence as the car behind is confronted with your car rapidly slowing down.

Tesla's ethos is to do their development in public. It builds brand affinity with owners by making them feel part of the process, rather than their being handed a system that has been rigorously tested and matured, and as a result has a high degree of ability and reliability. If we wanted to compare the state of the art in development, we'd compare the current Tesla offering with the generation that's currently in the lab at Mobileye, which has been happily driving around Israel for a year and makes even the Tesla beta software look a tad clunky. This thread and the NCAP assessment aren't about that; they're discussing the assistance in the car today and how well it performs. It's also not about autonomous driving and the future.

I'm not sure how you'd like me to describe someone who refuses to accept an independent study into the current systems without any personal experience of the others, and simultaneously throws doubt on the veracity of the weighting system while using one of those metrics as a proof point for Tesla. I'm all for healthy debate, but sadly on here there are people who refuse to acknowledge any criticism of Tesla, or, perhaps a little more charitably, who might be described as making their point poorly. There also seems to be a gap in comprehension of what these other systems can actually do, which doesn't stop people happily dismissing them.

So back to the case in point. We have no knowledge of the independence of those given the software, and we have seen videos of Tesla's capabilities before which have failed to transition into production. The function set appears to have been expanded, especially with regard to turning across traffic. The 'city driving' nature appears, on reflection, to be linked to lower driving speeds, as none of the videos I've seen show cars doing any real speed. You have to start somewhere, and in fairness that is the logical place, but increasing speed is more than just raising a threshold: it means greater sensor range and monitoring, and a squared-law increase in object recognition. It will come in time, but again, it's not here today.

My angst with the Tesla system is less about the list of features than about the performance and reliability of the features it has. I personally feel much safer in a car that does less but does it more reliably, although it should be noted that the feature lists are near identical at the moment (excluding the beta software). And this is where opinions start to differ, and where others start to compare apples with oranges by not looking at the current system when discussing the NCAP findings. One camp places great confidence in the idea that the current, variably performing (in some cases the best, until it throws a wobbly) but expansive feature set means being closer to full self-driving. The other camp looks at the performance today (the lack of feedback on what the car is thinking, the random braking, the missed speed limit changes, the variability in windscreen wipers and automatic full beam, etc.) and compares that to the relatively rock-solid and highly communicative performance of the latest systems from the competition, as reported by NCAP, even if those systems will plateau and make no claims of ever being self-driving.
I think the pro-Tesla side would do well to recognise the strengths of these other systems, be receptive to other ways of engaging with the driver if and when they see something beneficial, and lobby for their inclusion. There is no monopoly on good ideas.

The beta software shows promise, but it also reminds me in some ways of the first EAP systems from 2016 and the journey is far from being an easy one.
Thanks for responding. There are a few separate points being discussed here. As for "I'm not sure how you'd like me to describe someone who refuses to accept an independent study...", I'd just say I prefer being polite and respectful to people who choose to hold different views. You are clearly persuaded by the value assessments in the NCAP report, others here less so, and I conclude from your comments that the results of the NCAP survey coincide closely with your personal views, based on some direct experience of driving different cars and the subjective opinions you hold.

I think some of your points are very well made. But while I am happy to agree that Tesla have no monopoly on good ideas, and do not think they make cars that are superior in every department to other cars, nor am I persuaded that the conclusions of the NCAP report represent some sort of immutable, profound ground truth that simply cannot be disputed. I take both the conclusions and the 'independence' of all such reports with a large grain of salt, regardless of whether they conclude Tesla technology is better or worse than any other. That doesn't mean I don't find them interesting and worth discussing. I just think they are, at least in part, built on arbitrary and subjective parameters that can be altered to affect the conclusions. You are welcome to disagree with me on that!

Where I certainly do agree with you is on your assessment of the current merits of specific elements of the overall technology being used by different automakers, and your conclusion that, at this point in time, other systems perform objectively better at specific tasks than what Tesla have delivered so far. What this does not lead me to conclude, however, is that Tesla are wrong to pursue their vision (pun intended, sorry) that their approach will eventually deliver a superior system. I expect Tesla could make their cars 'better' in the short term, perhaps by switching to similar third-party solutions to those bought in by BMW and Mercedes etc., but it looks to me like they are confident in the long-term potential of their vertically integrated engineering approach to ultimately provide a better overall solution.

I don't know if they will be successful with this or not, but I respect Tesla for having the courage of their convictions to go all out for an engineering-driven solution, rather than being pushed backwards and forwards by finance, advertising and customer-relations departments pursuing short-term goals, high profits for shareholders and short-term customer-satisfaction ratings. They are the exact opposite of a 'me too' company, and I find that refreshing! The optimistic view is that Tesla technology will have the same sort of disruptive impact on the auto industry that digital camera technology had on the SLR market. In the early days, digital cameras were scoffed at by the purists and professionals, and there is no question that 'independent reports' from those days could show how inferior early digital systems were in some measurable areas of the technology. But all of that ignored the potential and trajectories of the competing technologies. What happened in that industry is now a matter of record.

Since none of us has a crystal ball, we can all only speculate about the future and how this will turn out. I am enjoying watching this evolve - exciting times! - and am simply saying that, regardless of NCAP or any other current reports, I admire Tesla for the approach they are taking and wish them success. I also wish success for the other companies trying to make innovative, great products, especially those helping to transition the world to a more environmentally friendly and safer driving experience. Not sure what that makes me then - 'eco-techno-fanboy'? I look forward to your suggestions!
 
I think that what we're seeing with disparate views regarding driver assistance is just an extension of the age old problem that a lot of innovative products have, that of designing an effective man-machine interface.

I managed a fairly large aircraft procurement about 16 or so years ago, where the same problem arose. The technical difficulty of presenting an operator with a large amount of information from various sensors and weapons systems, and allowing the operator to interface with it effectively, was huge. It came to a head when a large touch screen was presented by the techies as the preferred solution. I simply couldn't believe they could be so damned stupid, as there was absolutely no way their interface could be operated in an aircraft in combat, with the crew wearing normal flying clothing and SE.

The answer to that was to actually stick some of these interface designers in the aircraft, fly them around and show them how challenging it is to make small, precise, movements on a screen, whilst subject to the normal vibration, acceleration levels etc of flight. It was apparent that the design team simply had no understanding of the environment, hence the reason for their enthusiasm for their preferred solution. To their credit, they then built a simulator, so they could try out different interfaces, and they pulled in a couple of retired aircrew as advisors.

By the same token, some designers of things like phone apps, need to understand that not everyone has prehensile thumbs that are only 3mm wide at the tip, and that not everyone has 20-20 near-vision. Haptic feedback on any user control is extremely helpful. For decades, aircraft secondary controls were designed so that pretty much everything could be done by touch. Deliberately removing the ability to be able to input information using just the feel from fingertips inevitably makes things less precise, and more challenging for the user.

It often seems to me that the Model 3 is very similar to that prototype cockpit I was shown all those years ago. The technology is impressive, but the ability of the driver to interface with it isn't as good as it could be. It's almost as if those designing the interface don't drive, or at least don't drive a RHD car when they are right handed, so cannot use their dominant hand to operate secondary controls. Instead of designing the cabin around the most effective way to present the driver with key information, including that from the driver assist systems, and optimise the way that the driver can interact with it, it seems the cabin has been designed around the idea of having a clean look, with much of the things related to actually driving the car being a lower priority.

It may be that this emphasis away from the driver's needs is deliberate, as the move towards greater autonomy comes closer, but right now we're in a position where the car has to be driven by a driver. I can understand why organisations like NCAP are critical of the man-machine interface. Whilst it may well be a good solution at some point in the future, when the level of autonomy is higher than it is now, it seems a sub-optimal solution for this interim period, when all of the self-driving systems are just driver assistance features. I suspect it's also why some (most?) other EV manufacturers are sticking with displays in the driver's eye line, or are using HUDs, at least for now. During this interim phase, before we get fully autonomous vehicles, there's an even greater need to present the driver with clear information, to allow the driver to be able to quickly understand what the driver assist features are doing, and easily respond if they aren't doing exactly what's wanted.
 
This one is quite good

'normal' city roads, winding roads... It also shows the failures, which are the things to watch as the beta matures. The country road he follows isn't dissimilar in curvature to my commute, and I know current AP can't handle it (it doesn't slow down for the corners, understeers, and tries to drive into walls). This seems much better.
It's a bit more real-world than some of the others. Definition of lane markings is going to be a big issue in the UK.
 
I think that what we're seeing with disparate views regarding driver assistance is just an extension of the age old problem that a lot of innovative products have, that of designing an effective man-machine interface.

I managed a fairly large aircraft procurement about 16 or so years ago, where the same problem arose. The technical difficulty of presenting an operator with a large amount of information from various sensors and weapons systems, and allowing the operator to interface with it effectively, was huge. It came to a head when a large touch screen was presented by the techies as the preferred solution. I simply couldn't believe they could be so damned stupid, as there was absolutely no way their interface could be operated in an aircraft in combat, with the crew wearing normal flying clothing and SE.

The answer to that was to actually stick some of these interface designers in the aircraft, fly them around and show them how challenging it is to make small, precise, movements on a screen, whilst subject to the normal vibration, acceleration levels etc of flight. It was apparent that the design team simply had no understanding of the environment, hence the reason for their enthusiasm for their preferred solution. To their credit, they then built a simulator, so they could try out different interfaces, and they pulled in a couple of retired aircrew as advisors.

By the same token, some designers of things like phone apps, need to understand that not everyone has prehensile thumbs that are only 3mm wide at the tip, and that not everyone has 20-20 near-vision. Haptic feedback on any user control is extremely helpful. For decades, aircraft secondary controls were designed so that pretty much everything could be done by touch. Deliberately removing the ability to be able to input information using just the feel from fingertips inevitably makes things less precise, and more challenging for the user.

It often seems to me that the Model 3 is very similar to that prototype cockpit I was shown all those years ago. The technology is impressive, but the ability of the driver to interface with it isn't as good as it could be. It's almost as if those designing the interface don't drive, or at least don't drive a RHD car when they are right handed, so cannot use their dominant hand to operate secondary controls. Instead of designing the cabin around the most effective way to present the driver with key information, including that from the driver assist systems, and optimise the way that the driver can interact with it, it seems the cabin has been designed around the idea of having a clean look, with much of the things related to actually driving the car being a lower priority.

It may be that this emphasis away from the driver's needs is deliberate, as the move towards greater autonomy comes closer, but right now we're in a position where the car has to be driven by a driver. I can understand why organisations like NCAP are critical of the man-machine interface. Whilst it may well be a good solution at some point in the future, when the level of autonomy is higher than it is now, it seems a sub-optimal solution for this interim period, when all of the self-driving systems are just driver assistance features. I suspect it's also why some (most?) other EV manufacturers are sticking with displays in the driver's eye line, or are using HUDs, at least for now. During this interim phase, before we get fully autonomous vehicles, there's an even greater need to present the driver with clear information, to allow the driver to be able to quickly understand what the driver assist features are doing, and easily respond if they aren't doing exactly what's wanted.

Well said, Sir.
I do think these cars are built to avoid spending money on anything that can be skimped, like a button or stalk control, although there really is no need for the myriad of such things some other manufacturers stick on a steering wheel.
 
I think that what we're seeing with disparate views regarding driver assistance is just an extension of the age old problem that a lot of innovative products have, that of designing an effective man-machine interface.

I managed a fairly large aircraft procurement about 16 or so years ago, where the same problem arose. The technical difficulty of presenting an operator with a large amount of information from various sensors and weapons systems, and allowing the operator to interface with it effectively, was huge. It came to a head when a large touch screen was presented by the techies as the preferred solution. I simply couldn't believe they could be so damned stupid, as there was absolutely no way their interface could be operated in an aircraft in combat, with the crew wearing normal flying clothing and SE.

The answer to that was to actually stick some of these interface designers in the aircraft, fly them around and show them how challenging it is to make small, precise, movements on a screen, whilst subject to the normal vibration, acceleration levels etc of flight. It was apparent that the design team simply had no understanding of the environment, hence the reason for their enthusiasm for their preferred solution. To their credit, they then built a simulator, so they could try out different interfaces, and they pulled in a couple of retired aircrew as advisors.

By the same token, some designers of things like phone apps, need to understand that not everyone has prehensile thumbs that are only 3mm wide at the tip, and that not everyone has 20-20 near-vision. Haptic feedback on any user control is extremely helpful. For decades, aircraft secondary controls were designed so that pretty much everything could be done by touch. Deliberately removing the ability to be able to input information using just the feel from fingertips inevitably makes things less precise, and more challenging for the user.

It often seems to me that the Model 3 is very similar to that prototype cockpit I was shown all those years ago. The technology is impressive, but the ability of the driver to interface with it isn't as good as it could be. It's almost as if those designing the interface don't drive, or at least don't drive a RHD car when they are right handed, so cannot use their dominant hand to operate secondary controls. Instead of designing the cabin around the most effective way to present the driver with key information, including that from the driver assist systems, and optimise the way that the driver can interact with it, it seems the cabin has been designed around the idea of having a clean look, with much of the things related to actually driving the car being a lower priority.

It may be that this emphasis away from the driver's needs is deliberate, as the move towards greater autonomy comes closer, but right now we're in a position where the car has to be driven by a driver. I can understand why organisations like NCAP are critical of the man-machine interface. Whilst it may well be a good solution at some point in the future, when the level of autonomy is higher than it is now, it seems a sub-optimal solution for this interim period, when all of the self-driving systems are just driver assistance features. I suspect it's also why some (most?) other EV manufacturers are sticking with displays in the driver's eye line, or are using HUDs, at least for now. During this interim phase, before we get fully autonomous vehicles, there's an even greater need to present the driver with clear information, to allow the driver to be able to quickly understand what the driver assist features are doing, and easily respond if they aren't doing exactly what's wanted.
Yes, all good points and I fully agree with you! I have some direct professional experience of what happens when a UI and software system is designed by people who have minimal understanding of the working environment of the intended users, and you hit the nail on the head. That said, I really do hope that Tesla checked that their AI engineers actually have driving licences!

What I would appreciate with the current autopilot system, as we go through the inevitable period of improving it from an incomplete to a more comprehensive, reliable system, is some sort of 'red/amber/green' indication on the interface of the system's confidence in its ability to navigate the traffic situation ahead. At the moment, there is no way to know whether the cameras and sensors are providing enough information for the car to be confident it can cope, until suddenly 'abort! abort!' sounds and you are left to sort out the mess with little warning. This makes using it rather tiring, as you are constantly on edge in case it suddenly reacts incorrectly (phantom braking, anyone?) and you have to take over control in a fraction of a second.

I suspect that the algorithms used to make decisions while the car is driving include estimates of confidence, and these could provide some feedback to the driver about the probability of error - at least giving a little more early warning that the chance of an abort has increased. I realise the problem here could be false confidence in the car's ability when it thinks it is doing fine, so this suggestion would probably only work if there were a strong positive correlation between autopilot abort events and the predictive metrics the car is constantly recording.
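
For what it's worth, the red/amber/green logic I have in mind could be as simple as bucketing a confidence score with a bit of hysteresis, so the indicator doesn't flicker when the score hovers near a threshold. This is purely an illustrative sketch - the confidence signal and the thresholds are invented, not anything Tesla actually exposes:

```python
# Hypothetical sketch: map a planner confidence score (0.0-1.0) to a
# red/amber/green indicator, with hysteresis so the display doesn't
# flicker when the score hovers near a threshold. All thresholds and
# the confidence signal itself are invented for illustration.

GREEN_ON, GREEN_OFF = 0.85, 0.80   # enter green at 0.85, only leave below 0.80
AMBER_ON, AMBER_OFF = 0.60, 0.55   # below AMBER_OFF we drop to red

def next_state(score: float, current: str) -> str:
    """Return 'green', 'amber' or 'red' given the latest confidence score."""
    if current == "green":
        if score >= GREEN_OFF:
            return "green"
        return "amber" if score >= AMBER_OFF else "red"
    if current == "amber":
        if score >= GREEN_ON:
            return "green"
        return "amber" if score >= AMBER_OFF else "red"
    # current == "red": require the higher 'on' thresholds to climb back up
    if score >= GREEN_ON:
        return "green"
    return "amber" if score >= AMBER_ON else "red"

# Feed in a noisy sequence of scores and watch the indicator change
state = "green"
for s in [0.9, 0.82, 0.7, 0.58, 0.57, 0.4, 0.62, 0.9]:
    state = next_state(s, state)
```

The point of having separate 'on' and 'off' thresholds per colour is that the indicator only changes state when the score moves decisively, which matters if the underlying signal is noisy.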

Eventually, we hope, the system will 'just work'. The big question therefore is whether this is going to be quite soon, which could happen if the FSD neural net really can learn and improve exponentially, or take much, much longer, which is the prediction I see from the many people who are unconvinced by either Elon's confidence or the capability of the current Tesla hardware.
 
To be fair, man-machine interface design is far from easy, especially when there's a need to convey complex safety-critical information very quickly and accurately, and enable a rapid and accurate response from the user. I flew an aeroplane a couple of weeks ago where the cockpit was just an ergonomic nightmare, judged by modern standards. It was obvious that it had been designed by non-flying engineers, who had placed things where it was easiest for them and the maintainers to get at (a hell of a great aeroplane to fly, though, all the same). Many older cars were much the same, with the odd bit of good design by complete accident (like left-foot-operated headlight dip switches). One or two older cars did have really well-designed instruments and controls, though. Saab springs to mind: that was a car where every secondary control seemed to be in exactly the right place, and where it was possible to do pretty much everything needed without taking your eyes off the road.

At the moment, whilst we have this sort of halfway house between the driver being in total control of everything, and the car being in total control of everything, the problem is probably at its most severe. The amount of information that the car can, and needs to, present to the driver is far greater than was the case for a car designed just a decade or so ago. Likewise, the number of driver inputs needed has also grown, as there are now far more driving controls and adjustments that need input, especially when the autonomous secondary controls don't behave quite as expected (auto wipers and lights, TACC speed changes, etc). Arguably a semi-autonomous car actually needs more easy-to-use controls, and clearer and less distracting information displays, rather than fewer.

The Model 3 interior and driver displays/controls definitely seem to have been designed with full autonomy in mind, so as a way of getting people used to the pared down look that future cars may have, that's probably a good thing. Knowing this still doesn't make it any easier to drive a RHD car as a right-handed person, having to use my left hand on the touch screen. Most of the time that my wife's in the car I get her to control things that need manual intervention on the screen, like the wipers, selecting music, adjusting the HVAC etc. There have been times when I've been on my own in the car and pulled over and stopped to change the music/radio station, as I've felt that it's just not safe to do on the move. Maybe a left handed person would find it easier, but being able to do things like this by feel, without needing to take eyes off the road to read the left hand side of the screen, would seem to be safer.
 

Because I am cautious I have been told on this forum that I am pessimistic and should adopt a more positive attitude :D

I am not necessarily unconvinced by Elon’s confidence but his record so far does not give me immediate hope. He has been spectacularly wrong in his predictions about FSD in the past. It was supposed to be “feature complete” by the end of LAST year - so why the need for a complete software rewrite?

As for hardware - on Friday I drove for 120 miles up the M6 in very heavy rain. For a lot of the journey I had the message “multiple cameras blocked or blinded”, even after I cleaned them while I was at the Tebay supercharger. I don’t see how any software can deliver FSD if the cameras are so poor in adverse weather conditions.

And I understand (from Electrek) that Tesla is looking to update the radar with new 4D sensor technology from an Israeli startup called Arbe Robotics, with twice the range of the current radar. Does this suggest that the current radar is insufficient, and if so, will we get a retrofit of the new one?
 
The interesting thing about that Arbe Robotics radar is that it seems to give a capability that's close to LIDAR, and, like LIDAR, it may well significantly reduce the processing requirements in the main car system. An all-weather sensor that delivers accurate, fairly high-resolution 3D spatial data, plus time/relative velocity, would make a lot of the subsequent processing much easier. Combined with the cameras, it would also seem to offer some sensor redundancy, with the ability to cross-check between the two sensing systems.
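
To make the cross-checking idea concrete, here's a toy sketch of how a fused system might flag disagreement between radar and camera range estimates for the same tracked object. The numbers, the sensor uncertainties and the simple n-sigma consistency test are all illustrative assumptions on my part, not anything from Tesla or Arbe:

```python
# Hypothetical sketch of a sensor cross-check: compare the range to a
# tracked object as estimated by radar and by the camera pipeline, and
# flag the track when the two disagree by more than their combined
# uncertainty. Sigma values here are invented for illustration (radar
# range is typically far more precise than a monocular depth estimate).

def ranges_agree(radar_m: float, camera_m: float,
                 radar_sigma: float = 0.5, camera_sigma: float = 2.0,
                 n_sigmas: float = 3.0) -> bool:
    """True if the two range estimates are consistent to within n_sigmas."""
    # Combine independent uncertainties in quadrature
    combined = (radar_sigma**2 + camera_sigma**2) ** 0.5
    return abs(radar_m - camera_m) <= n_sigmas * combined

# A radar range of 50.0 m against a camera estimate of 52.5 m passes the
# check; a camera estimate of 70.0 m would be flagged as inconsistent,
# prompting the system to distrust one sensor or reduce confidence.
```

In a real system the same comparison would presumably run per tracked object on every frame, feeding exactly the kind of confidence signal discussed above.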

Automotive Imaging Radar Technology - Arbe
 
A while ago I'm sure somebody on here made mention of a mounting kit modification to the centre screen to enable it to swivel slightly towards the driver - it was available with LHD and RHD fittings. That would improve things a bit but I think it was quite an expensive modification and still won't solve the right-handed driver trying to operate the screen with their left hand or improve the attention required to operate anything reliably on the screen.
 

I think we have to face up to the fact that Tesla is not likely to make software changes to the layout of the screen just for the RHD markets. The sales volumes are just too small. Even if they did, it still wouldn't solve the right-handed driver issue.

A bit of a pain, but we're stuck with it.