
AI experts: true full self-driving cars could be decades away because AI is not good enough yet

Interestingly, China thinks even L3 is still not ready anytime soon:

"However, experts say that even achieving L3 on public roads is some time away and will take a large amount of money.
“Even if L3 is achieved, the cost will increase steeply, making it harder to commercialise [the technology],” Chen said."

So there's 1) the time issue: I guess it takes time for the technology to gain competency, and 2) the money issue: even when the technology is mature, the cost is still a problem.
That sounds like lidar.
 
Interesting article on 7 weaknesses that current AI has:

Well that's a disturbing article. Food for thought.

What flaw in FSD Beta is responsible for its failure to stop for solid objects ahead, like Road Closed signs, gates, barriers, and walls (as we've seen in videos)? Is that an NN problem or a rules problem?

Is FSD a single neural net, a set of neural nets, or a framework of decisions with NN analysis? How would you describe FSD?
 

The camera can detect those obstacles and signs fine, because it faithfully records and displays them for us to see, and our brains can interpret those pixels into meaningful context.

Whatever the cause of Tesla's failure to avoid basic collisions, whether NN or rules, it's still a problem.
 
What flaw in FSD Beta is responsible for its failure to stop for solid objects ahead, like Road Closed signs, gates, barriers, and walls (as we've seen in videos)? Is that an NN problem or a rules problem?

It could be a rules problem: the planner wants to make the turn but does not know that the road is blocked. Once the car starts to make the turn, vision sees the road-closed sign, so the car stops or steers back into the original lane.
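
To make that hypothesis concrete, here is a toy sketch in Python (all names hypothetical, nothing from Tesla's actual stack) of the ordering problem: if the planner commits to the turn before perception has confirmed the target road is open, the road-closed detection only arrives mid-maneuver.

# Toy sketch of the ordering bug: commitment before confirmation.
def plan_turn(road_confirmed_open: bool, already_committed: bool) -> str:
    if already_committed and not road_confirmed_open:
        # Detection arrived after the planner committed: abort mid-turn.
        return "abort: stop or steer back into the original lane"
    if not road_confirmed_open:
        # Checking before committing avoids the mid-turn abort entirely.
        return "hold: wait for vision to confirm the road is open"
    return "proceed with the turn"

print(plan_turn(road_confirmed_open=False, already_committed=True))   # the failure mode above
print(plan_turn(road_confirmed_open=False, already_committed=False))  # the safer ordering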

It could be a brittleness problem, where the road-closed signs, gates, and barriers look different from the training data, so perception does not respond correctly.

It could be an issue with depth perception. Elon mentioned that the NN that estimates height from pixels is not quite ready yet, so maybe that NN is failing to register the closed sign, gate, or barrier as an obstacle. And keep in mind that with a blank wall there is no change in features for the NN to use to judge distance, so it may be difficult for it to estimate depth when all it sees is a large region of pixels of the same color.
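
A tiny illustration of that last point, using plain NumPy rather than anything from Tesla's pipeline: correspondence-based depth estimation needs local image texture to match against, and a uniform wall patch has essentially none, so distance from those pixels is ill-posed.

# Toy "matchability" score: gradient energy of an image patch.
import numpy as np

rng = np.random.default_rng(0)
textured_patch = rng.integers(0, 256, (32, 32)).astype(float)  # busy scene
blank_wall_patch = np.full((32, 32), 128.0)                    # uniform color

def texture_energy(patch: np.ndarray) -> float:
    # Sum of squared gradients; zero means nothing for a matcher to lock onto.
    gy, gx = np.gradient(patch)
    return float(np.sum(gx**2 + gy**2))

print(texture_energy(textured_patch))    # large: plenty of features to match
print(texture_energy(blank_wall_patch))  # 0.0: no features, depth is ambiguous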

Is FSD a single neural net, a set of neural nets, or a framework of decisions with NN analysis? How would you describe FSD?

Well, from what Karpathy showed us on AI Day, FSD seems to be several NNs that work together, with some hard-coded planning rules too.
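
As a rough sketch of what that hybrid looks like (the module names here are invented for illustration, not Tesla's actual ones), learned perception outputs feed a hand-written planner:

# Toy hybrid: several perception nets feed hand-coded planning rules.
from dataclasses import dataclass

@dataclass
class PerceptionOutput:
    lanes_open: bool            # from a lane / drivable-space net
    obstacle_ahead: bool        # from an object-detection net
    obstacle_distance_m: float  # from a depth / occupancy net

def hardcoded_planner(p: PerceptionOutput) -> str:
    # Hand-written rules layered on top of learned perception.
    if p.obstacle_ahead and p.obstacle_distance_m < 20.0:
        return "brake"
    if not p.lanes_open:
        return "stop"
    return "continue"

print(hardcoded_planner(
    PerceptionOutput(lanes_open=False, obstacle_ahead=True, obstacle_distance_m=8.0)))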
 
Interesting read. As someone who has extensive experience in software as well as AI and machine learning (including neural networks), but no experience with self-driving, I tend to agree that we are decades away from Level 5. One thing that many people don't realize, or at least that I don't see discussed, is that the further you get toward "solving" Level 5 autonomy, the more the incremental difficulty grows, roughly exponentially. Even if you assume that Tesla is 90+% of the way to Level 5, that is probably half of the effort, at most. My most optimistic expectation is 20 years; 50 is probably more realistic.
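
One toy way to see why (the doubling factor below is purely my assumption, not a measured figure): if each additional "nine" of reliability costs roughly twice the effort of the previous one, the first 90% is only a small slice of the total work.

# Toy effort model: each extra "nine" of reliability doubles in cost.
effort = 1.0
total = 0.0
for nines in range(1, 7):  # 90%, 99%, ..., 99.9999%
    total += effort
    reliability = 1 - 10**-nines
    print(f"{reliability:.6f} reliable: cumulative effort {total:g} units")
    effort *= 2  # assumed doubling per nine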

That's just the software; next you have the problem of laws and liability. Right now, vehicles get into accidents and insurance companies dispute the results, with video of the accident and police reports, sending them to arbitration. Imagine how much worse that will get when vehicles without a human driver are involved. Then tack on liability: who gets sued when a car drives itself? You know Tesla's lawyers won't allow them to sell a single car with that feature without all sorts of legalese saying that the owner is fully responsible. I know I certainly won't trust my legal liability to another company's software, no matter how much I trust and believe in them!
 
Interesting read. As someone who has extensive experience in software as well as AI and machine learning (including neural networks), but no experience with self-driving, I tend to agree that we are decades away from Level 5. One thing that many people don't realize, or at least that I don't see discussed, is that the further you get toward "solving" Level 5 autonomy, the more the incremental difficulty grows, roughly exponentially. Even if you assume that Tesla is 90+% of the way to Level 5, that is probably half of the effort, at most. My most optimistic expectation is 20 years; 50 is probably more realistic.

L5 is wishful thinking, I agree; but do you really not think that geofenced L4 robotaxis that don't require remote operators, or door-to-door L2 ADAS with minimal disengagements, are feasible within the next 6-12 months?

I look at Waymo, Zoox, and Tesla, and I personally believe all of the tools are there. It doesn't seem to me that any sort of generational leap in technology is required to achieve better-than-human driving in most circumstances today. It's just a matter of work at this point.

I feel like your comment is suggesting that some fundamental tool is missing.
 

Alphabet’s Waymo and GM’s Cruise get California DMV approval to run commercial autonomous car services

Under the new authorization, Cruise vehicles can operate on public roads in designated parts of San Francisco between 10 p.m. and 6 a.m., including in light rain or light fog, but cannot exceed 30 miles per hour, the department said. Waymo can operate its fleet in parts of San Francisco and San Mateo counties at or below 65 mph, including in the rain or light fog.

Commercializing autonomous vehicles has been far more challenging than many predicted even a few years ago, but Waymo and Cruise are considered to be two of the frontrunners.

In May, both Waymo and Cruise applied for permits to begin charging for rides and delivery. Cruise applied to not have a safety driver present, while Waymo applied to have a safety driver, Reuters reported.
 
Under the new authorization, Cruise vehicles can operate on public roads in designated parts of San Francisco between 10 p.m. and 6 a.m., including in light rain or light fog, but cannot exceed 30 miles per hour, the department said.
Where in San Francisco would you be able to go over 30 miles per hour? I mean, I guess maybe on the freeway, but that's more like a bypass around San Francisco than in San Francisco.... :D

Of course, if they had limited it to other times of day, that physical impossibility would include the freeway, too. :D
 
...I personally believe all of the tools are there. It doesn't seem to me that any sort of generational leap in technology is required to achieve better-than-human driving in most circumstances today. It's just a matter of work at this point...

Researchers also thought the same for years; it's another thing entirely to bring the theory into real life.

It sounds good in theory that a radarless Tesla would not suffer phantom braking, but that's not true in real life.