Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

‘I can almost go from my house to work with no intervention’ - Musk

“So I personally tested the latest alpha build of full self-driving software when I drive my car and it is really I think profoundly better than people realize. It’s like amazing. So it’s almost getting to the point where I can go from my house to work with no interventions, despite going through construction and widely varying situations. So this is why I am very confident about full self-driving functionality being complete by the end of this year, is because I’m literally driving it.”

Elon Musk on Tesla Self-Driving: 'I can almost go from my house to work with no intervention' - Electrek


Presumably, not in our part of the world though....
 
It's apparent that the AI for FSD is progressing rapidly, and when it renders in 4D as opposed to 2.5D (no idea what that really means), its progress will accelerate. Elon says FSD is by far the most important project for Tesla.

It was said that AI would never beat a human at chess. Then it was said AI would never beat a human at Go. Or StarCraft II.

Autonomous driving is a hard problem with a long tail of exceptions. But it is just vectors and computation.

Once FSD is cracked and the data shows it is much safer than a human driver, the regulators won't be able to stand in the way.

Yes, Elon has said it is coming lots of times. Yes, he may be late. But he always delivers, no matter how bold the task. Yes, I am a fan.
 
Anyone that thinks about it for more than a few nanoseconds realises it's the last few percent and edge cases that are the hardest to solve. You can drive from Birmingham to Glasgow on the M6 without any issues at all, but try getting from the Digbeth service centre onto the main road, let alone onto the M6, and you'll understand the problem.

When I first read the OP's original post I was expecting to see 2017 at the bottom, or 2018, or 2019, or 2020.

It's not coming on very quickly at all in reality, as phantom braking is just as big an issue today as it was two years ago.
 
And when it renders in 4D as opposed to 2.5D (no idea what that really means)

Totally agree with your comments.

There was a post by @greentheonly on Twitter which I think helps explain the 2.5D thing:

https://twitter.com/greentheonly/status/1277336118447538179

Essentially the current software uses the height of an object in the frame to help determine its depth (this was also shown by tesladriver on YouTube, who saw that small children could be confused for adults further away on the current software).
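For anyone wondering why height-in-frame depth estimation mixes up children and adults, here's a toy pinhole-camera sketch of the geometry. This is purely illustrative (made-up focal length, nothing to do with Tesla's actual code): apparent pixel height depends on both real height and distance, so assuming the wrong real height gives the wrong depth.

```python
# Toy pinhole-camera model: depth = focal_length * real_height / pixel_height.
# If the software assumes an object is adult-sized when it is actually a
# child, it overestimates how far away it is.

FOCAL_LENGTH_PX = 1000  # assumed focal length in pixels (illustrative)

def estimate_depth(assumed_height_m, apparent_height_px):
    """Estimated distance to an object of assumed real height."""
    return FOCAL_LENGTH_PX * assumed_height_m / apparent_height_px

apparent_px = 100  # the object spans 100 pixels in the image

# Assuming an adult (~1.7 m tall), the object appears 17 m away:
print(estimate_depth(1.7, apparent_px))  # 17.0

# But if it is actually a child (~1.0 m tall), it is only 10 m away:
print(estimate_depth(1.0, apparent_px))  # 10.0
```

Same pixels, very different true distance - which is presumably why a single-frame, height-based model is called "2.5D" rather than true 3D.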

The new software will be custom-built for HW3, stitching everything from all cameras into a single 3D image, with (I think) the 4th dimension relating to time and the movement plotting (and prediction) of labelled items within that space.
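The "4th dimension is time" idea can be sketched very simply: once an object is tracked in 3D across frames, its motion can be extrapolated forwards. Here's a generic constant-velocity predictor as an illustration - my own toy example, not Tesla's actual approach:

```python
# Hedged sketch: predict where a tracked object will be, given its last
# two 3D positions. A constant-velocity model is the simplest possible
# "time dimension" - real systems use far richer motion models.

def predict_position(positions, dt, horizon):
    """Extrapolate a future 3D position from the last two observations.

    positions: list of (x, y, z) tuples sampled every `dt` seconds.
    horizon:   how many seconds ahead to predict.
    """
    (x0, y0, z0), (x1, y1, z1) = positions[-2], positions[-1]
    # Estimate velocity from the last two samples.
    vx, vy, vz = (x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt
    # Project forwards assuming the velocity stays constant.
    return (x1 + vx * horizon, y1 + vy * horizon, z1 + vz * horizon)

# A pedestrian tracked at two frames 0.5 s apart, walking along x:
track = [(0.0, 2.0, 0.0), (0.75, 2.0, 0.0)]
print(predict_position(track, dt=0.5, horizon=1.0))  # (2.25, 2.0, 0.0)
```

The hard part, of course, is not the extrapolation but labelling and tracking the objects reliably in the first place.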

The next few months should be very exciting, I'm getting my FSD at a very interesting time!
 
I agree with all the sentiment expressed regarding our collective experiences with FSD, and I am sure that in Europe we are going to be a bit down the list for product... but... he does state, 'the latest alpha build of full self-driving software'.
So that is the rewrite - something completely different to what is currently installed. Maybe... just maybe...
 
I agree with all the sentiment expressed regarding our collective experiences with FSD, and I am sure that in Europe we are going to be a bit down the list for product... but... he does state, 'the latest alpha build of full self-driving software'.
So that is the rewrite - something completely different to what is currently installed. Maybe... just maybe...
I have been led to believe that currently HW3 is basically just emulating HW2.5 in order to run that legacy code, so it is not being used to anything like its full potential. Supposedly they are doing a massive rewrite to use the huge AI power and other native features of HW3, so when that arrives it ought to be a significant step forward if you have HW3. But I may be wrong; I forget where I read this.
 
Autonomous driving is a hard problem with a long tail of exceptions. But it is just vectors and computation.

I think that may be where the problem lies. Control of the movement of a vehicle is indeed about vectors and computation but safe driving is another thing altogether. It's particularly the interactions with other human controlled vehicles or with pedestrians or animals that bring in those other factors. It's about body language and experience in weighing up risk factors that cannot be identified by a simple visual model.

It may not be high on most people's list but here's a rather off-beat example that shows how humans use their experience. Even setting aside the variable behaviour of children or of parents with pushchairs, in the countryside we even read the body language of the sheep!

Some sheep at the side of the road you can drive past at 50mph and others you need to pass at a crawl or have to stop. You can often judge which lamb belongs to which ewe in a small group as you approach them ... if the lamb and ewe are on opposite sides of the road you then gauge the "skittyness factor" of the lamb ... the tension in its stance or smoothness of its motion ... these judgments take place in a fraction of a second and at a considerable distance giving plenty of time to make smooth adjustments to speed and road positioning ... maybe it's safer to drive on the opposite side of the road despite the sheep currently not obstructing the carriageway. The driver reads the situation.

So long as a car could identify sheep it could always take the safest option and stop ... but you would simply never arrive at your destination! The vectors bit is manageable but the knowledge behind the decision making is another level of complexity because there are so many experience-based variables interacting.
 
I think that may be where the problem lies. Control of the movement of a vehicle is indeed about vectors and computation but safe driving is another thing altogether. It's particularly the interactions with other human controlled vehicles or with pedestrians or animals that bring in those other factors. It's about body language and experience in weighing up risk factors that cannot be identified by a simple visual model.
True, but they are not trying to build a simple visual model; they are trying to build a hugely complex one. They mentioned a couple of years back that one of the advantages of using a vision system was that, unlike lidar, they could use body language to predict paths. Is someone crossing the road and paying attention or not? Has the cyclist noticed the car? Your sheep analogy only works on people that have seen sheep in the road before... or even seen a real sheep. While it's a lot slower to train an AI as to what a docile sheep looks like, it's not inconceivable, and that knowledge would be shared with all cars.
I'm not underestimating the monumental task involved and I'm pretty skeptical of FSD capabilities in built-up cities in the UK, but I also don't see an issue with an FSD car hitting a sheep a couple of times before learning how not to hit one, unlike people who, as a whole, just continue to hit them.
 
then gauge the "skittyness factor" of the lamb ... the tension in its stance or smoothness of its motion

There is almost a poetic romance in your description ;)

I agree from an uneducated standpoint - I don't know much about the capability of AI. I heard of some examples of it learning emotional signals from human faces, for example, but there are so many layers. The 80/20 rule all over. Plus people are wazzocks - how can you ever compute for that!?
 
“So I personally tested the latest alpha build of full self-driving software when I drive my car and it is really I think profoundly better than people realize. It’s like amazing. So it’s almost getting to the point where I can go from my house to work with no interventions, despite going through construction and widely varying situations. So this is why I am very confident about full self-driving functionality being complete by the end of this year, is because I’m literally driving it.”

Elon Musk on Tesla Self-Driving: 'I can almost go from my house to work with no intervention' - Electrek


Presumably, not in our part of the world though....

The trouble is that he is working from home!
 
Autonomous driving is a hard problem with a long tail of exceptions. But it is just vectors and computation.

Except it can all be ruined by the utterly mundane. I have no knowledge or experience of vectors, rendering, AI or any of the other mighty impressive things that are apparently going to bring us full autonomous driving in the (if you believe Elon) near future. But Tesla’s system is almost completely based on cameras, and how often have you seen the message “multiple cameras blocked or blinded”? Until they overcome that incredibly basic problem then the whole FSD experience could be ruined, albeit temporarily, by a bit of mud thrown up from the road.
 
If you want to think about AI for more than a few nanoseconds (say 33 minutes), this is a fascinating podcast:

BBC Radio 4 - The Life Scientific, Demis Hassabis on artificial intelligence

Have a read here...

Tesla FSD and Feature Complete

It's not just about AI - it's about decomposing driving into the features that need to be implemented and then working out the best way to do each of them.

If Musk's idea of feature complete is the NHTSA list, then there is a hell of a lot of things about to drop.

As for vectors etc. and what they could do... sure... but given phantom braking is still a thing nearly 4 years after they started, they're still some way from doing some basic stuff well. People complain about it even being able to centre in lane correctly sometimes, and it driving to an edge more than they like; oncoming traffic is variable, etc. To suggest they'll go from that level of behaviour to even passable skills in the next few iterations is a leap of faith I'm not taking.
 