I think you might be overestimating the kind of intelligence an AI system will need in order to safely drive a car. There's a computer scientist named Ajeya Cotra who does some interesting research on forecasting when AI will become transformative by comparing it to biological anchors. She was interviewed by Freakonomics, and the episode is definitely worth a listen: New Technologies Always Scare Us. Is A.I. Any Different? - Freakonomics

By her team's estimates, GPT-2, with 1.5 billion parameters, had intelligence equivalent to a honey bee. And GPT-4, with an estimated 1.8 trillion parameters, has the intelligence of a squirrel.

The last estimate we had of FSD's model size came pre-V12, and it was about 1 billion parameters at the time. Moving planning and control into the network may have increased the model size since then, but we are currently working with something likely closer to honey bee intelligence than squirrel, let alone human.
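Just to put those figures side by side, here is a rough back-of-the-envelope comparison using the parameter counts quoted above (these are the thread's estimates, not official numbers):

```python
# Rough scale comparison using the parameter counts quoted above.
# These are the thread's estimates, not official figures.
gpt2_params = 1.5e9    # GPT-2: ~1.5 billion parameters ("honey bee" anchor)
gpt4_params = 1.8e12   # GPT-4: ~1.8 trillion parameters (estimated, "squirrel" anchor)
fsd_params = 1.0e9     # FSD: ~1 billion parameters (last public estimate, pre-V12)

print(f"GPT-4 vs GPT-2: {gpt4_params / gpt2_params:,.0f}x larger")  # ~1,200x
print(f"FSD   vs GPT-2: {fsd_params / gpt2_params:.2f}x")           # ~0.67x
print(f"FSD   vs GPT-4: {fsd_params / gpt4_params:.2e}x")           # ~5.6e-04x
```

So even if the V12 rewrite grew the network several-fold, it would still sit orders of magnitude below GPT-4 on a raw parameter-count basis.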

All that being said, think about what bees can do with their limited brains. They can navigate a hive, with a complex series of chambers. They can fly at high speed, avoiding obstacles and seeking certain targets. They might not know what a bear is, but they know not to fly into one, or to sting it if it attacks their hive for honey.
This is a great discussion, and probably headed a bit off topic. I would love to pursue it!

That said, I would not hand the task of driving my car to a honeybee. It is a wonderful creature, but it will never be able to handle the hard stuff: anticipating the likelihood of a kid running out from between two parked cars, or even picking the best path through 100 potholes.

It's the last 10% that it can't do that's critical, and FSD as it exists isn't close. That's where it's stuck, that's why it's just an algorithm, and that's all it will ever be. Not intelligent.
 
I believe the honey bee does not know anything but is acting on instinct.
Which is exactly what FSD is doing. Training is its evolution, and then it acts on the resulting instincts.

It's the last 10% that it can't do that's critical, and FSD as it exists isn't close. That's where it's stuck, that's why it's just an algorithm, and that's all it will ever be. Not intelligent.
If it can, by instinct, design a fusion power plant, who cares? They say that 10,000 hours of guided practice will make someone a master at something. That's because they're turning the activities of that craft into instinctive responses. We hear people try to express that in all sorts of artistic and melodramatic ways, but it boils down to trained instincts.

FSD is a good start, but there's a long way to go before a car is reliably driving itself around. I think that there are still a number of machine learning techniques yet to be discovered that will make it work. And I think it will ultimately be possible, under the right conditions, for a Hardware 3 Tesla to do it. But it'll be someone playing with "vintage hardware" and doing it as an academic exercise.
 
No, it's a very good question.

Here's one definition (of several) that I found helpful:

The mental quality that consists of the abilities to learn from experience, adapt to new situations, understand and handle abstract concepts, and use knowledge to control an environment.

The "AI" we are discussing in this context is a long way from being able to learn from experience or handle abstract concepts. I don't think my Tesla will ever "understand" what a bear is, or ever come up with a new idea. It might be very fast at approximating a match to prior information it has been fed, and generate a response, but a worm can do that.
You just carved out about 95% of the general population. My hope is that some AI systems will learn from that top 5%, or ideally the top 1-3%, build their interactive interpretation of the world around them from those drivers, and improve on it.
 
FSD is absolutely trained. Another word you might use is programmed.
Programmed just like someone who trains for 10,000 hours in a task.

It has no ability to learn. None. That's why it's not intelligent.
The software deployed in the car certainly has no ability to learn. The overall system (including the compute and data farms) has the ability to learn via guided training.

I have no interest in characterizations such as "intelligent" or "sentient" because those terms are too imprecise. Let's stick to discussions about learning.
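To put that distinction in concrete terms, here's a minimal toy sketch (my own illustration in PyTorch, not Tesla's actual stack, and all the names are made up): the training loop is the part of the overall system that learns; the frozen inference at the bottom is all the car itself ever does.

```python
# A minimal toy sketch of the split described above (my own illustration,
# not Tesla's actual stack): guided training happens offline in the data
# center; the copy deployed in the car runs frozen and never updates weights.
import torch
import torch.nn as nn

# --- Offline, in the data center: guided training on labeled fleet data ----
model = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

fleet_inputs = torch.randn(256, 8)   # stand-in for curated fleet clips
fleet_labels = torch.randn(256, 2)   # stand-in for human-reviewed labels

for _ in range(100):                 # this is where "learning" happens
    optimizer.zero_grad()
    loss = loss_fn(model(fleet_inputs), fleet_labels)
    loss.backward()
    optimizer.step()

# --- In the car: frozen inference only --------------------------------------
model.eval()
for p in model.parameters():
    p.requires_grad_(False)          # nothing in the car changes these weights

with torch.no_grad():                # no gradients, no learning at runtime
    camera_features = torch.randn(1, 8)   # stand-in for perception inputs
    control_output = model(camera_features)
print(control_output)
```

Any apparent "learning" in the car only shows up later, as a new set of frozen weights shipped in a software update.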
 
Programmed just like someone who trains for 10,000 hours in a task.


The software deployed in the car certainly has no ability to learn. The overall system (including the compute and data farms) has the ability to learn via guided training.

I have no interest in characterizations such as "intelligent" or "sentient" because those terms are too imprecise. Let's stick to discussions about learning.
There's a stretch of highway south of me. 20 miles long, maybe. FSD is programmed to believe the speed limit is 45 mph. No matter that the speed limit has never been 45 since the road was paved some 75 years ago. No matter that 65 mph signs are clearly posted periodically. No matter that the traffic flow is always 65 or greater. No matter that my car and up to 20 other Teslas an hour with connected cameras are looking at the signs every day. No matter that we are all overriding the faulty programming constantly. FSD still drives 45 mph.

It cannot even do the most simple bonehead moron driving task of driving the speed limit.

I don't care what a server farm in Timbuktu can do. FSD is not AI. Stop fooling yourself.
 
Programmed just like someone who trains for 10,000 hours in a task.
Evidence needed.

Lots of minimizing of the difficulty of FSD in this thread. No one has mastered it yet. If it weren't for humans, even all the Waymos would be stuck, far away from home.

Humans are incredibly capable, even the worst drivers.

Folks, FSD can’t even do a trivial left turn yet!!! Let’s try to hold it together.
 
Yep. Task-specific, narrow AI, with a lot of hope pinned on it generalizing to untrained scenarios and edge cases. The way Elon and TSLA have marketed FSD, it's no wonder the public is confused.

It's funny Elon wanted the Austin Giga south extension wall to be glass so the world could see all those Nvidia GPU racks.
 
Tonight's Supervised FSD drive after the 12.3.4 upgrade: I did not expect a difference, but two edge cases are gone.

Amazing. I feel that by the end of the year we will be extremely close to true automation.

2023 MYP

Wow
I'll gauge the FSD improvement trajectory by the improvements 12.4 brings; 12.3.4 seems to have broken some things. I had a zero-intervention drive home from a restaurant last night; although it was only a 3-mile drive, it got everything right. When I drive with my wife (not using FSD, me driving), I seldom have no interventions.
 


Very cool behavior by 12.3.4 tonight. I was making a left at this T. The car came to a complete stop and then inched forward for visibility. A car approached from the right and indicated it was going to turn LEFT onto the road I was on. At this intersection, a lot of the people who have the right of way will actually yield to a car in my position, because even human drivers have to pull way far into the lane to see past those trees on the left. Anyway, the other driver definitely yielded to my car, and FSD didn't even hesitate to make the left super smoothly and quickly. I feel like v11 would never have understood the other car slowing down to let us go at this intersection, even though they have the right of way. This was cool to see. Zero interventions from the Wawa parking spot to home.
 
Yeah, I understand the situation, and that's not cool. Visibility looks fine.

That's overcreep into the traffic lane. I'm not even sure how it happened. I think that extra lane is for parking (except where prohibited by signage), not for turning.
 
I just want FSD to honor my lane change requests. If it’s going to wait until the last second to get into the correct lane for a turn, the least it can do is honor my request to get in the correct lane ahead of time. This shouldn’t be hard to fix.
FSD overrules Nav guidance when it wants to, and that's the way it has to be. But I don't know whether FSD sees any difference between a lane-change request signaled by the Nav and one signaled by the driver.