
Autonomous Car Progress

What's striking about all these cars is that they all, without exception, have a full 3D model of the environment around the car.

Until Tesla demonstrates that, we can't really take their claims about self-driving seriously.

At 22:15, we don't see where the lane lines are in this overhead view, but we know from Green's posts that they are being mapped. I'm going to guess the red cars come from the standard vision recognition, since they like to dance, and the green ones are the pseudo-lidar constructions.
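For anyone curious what "pseudo-lidar" means in practice, here's a minimal illustrative sketch (my own toy example, not Tesla's code): a per-pixel depth map predicted from a camera image is back-projected through the pinhole camera model into a 3D point cloud that downstream code can treat like lidar returns.

```python
import numpy as np

def depth_to_pointcloud(depth, fx, fy, cx, cy):
    """Back-project a per-pixel depth map (meters) into camera-frame 3D
    points via the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Stack into an (N, 3) "pseudo-lidar" cloud, dropping invalid (zero) depths
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]

# Toy example: a 2x2 depth map with the principal point at the top-left pixel
depth = np.array([[10.0, 10.0], [0.0, 5.0]])
cloud = depth_to_pointcloud(depth, fx=100.0, fy=100.0, cx=0.0, cy=0.0)
print(cloud.shape)  # (3, 3): three valid points, one dropped for zero depth
```

The point is just that once you have depth per pixel, you get a lidar-like 3D representation for free; the hard part is predicting that depth reliably from cameras alone.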



I do agree with @diplomat33. I like a good, sleek interface that shows what the car sees. It gives a certain guarantee and peace of mind.

This is the best interface. Just simulate an actual driver.

 
I don't have a very high opinion of The Economist, as I think their articles are rather shallow, but I thought I'd post this anyway.

Driverless cars show the limits of today’s AI

"In 2015 Elon Musk, the boss of Tesla, an electric-car maker, predicted the arrival of “complete autonomy” by 2018"

I don't remember Elon using the term "complete autonomy". I wonder if perhaps the article is thinking of "feature complete".

"The few firms that carry passengers, such as Waymo in America and WeRide in China, are geographically limited and rely on human safety drivers."

I would give this a "mostly true but misleading" rating if I were fact checking this article. The geographically limited part is accurate but Waymo uses safety drivers just in case the car gets stuck and to provide more peace of mind for passengers. Waymo cars don't rely on safety drivers for the actual daily driving tasks. In fact, Waymo has done some rides without a safety driver.

I think the article is right about the challenges of camera vision.

"One study, for instance, found that computer-vision systems were thrown when snow partly obscured lane markings. Another found that a handful of stickers could cause a car to misidentify a “stop” sign as one showing a speed limit of 45mph. Even unobscured objects can baffle computers when seen in unusual orientations: in one paper a motorbike was classified as a parachute or a bobsled. Fixing such issues has proved extremely difficult, says Mr Seltz-Axmacher. “A lot of people thought that filling in the last 10% would be harder than the first 90%”, he says. “But not that it would be ten thousand times harder.”"

I feel like this probably summarizes pretty well why Tesla's FSD is taking longer than Elon thought. Elon probably assumed camera vision would be relatively straightforward, especially with Tesla's data, but in reality it has proven much, much harder to reach the reliability needed for driverless FSD. And I think this also explains why companies like Waymo adopted their approach of multiple sensors, including lidar and HD maps: they realized that to get the most reliable FSD system, it's best to give the car as much help as possible. It's also why their cars are geofenced. It's better to start small, get FSD working in one area, and build from there, rather than go for general L5 right away.

"Dr Marcus, for his part, thinks machine-learning techniques should be combined with older, “symbolic ai” approaches. These emphasise formal logic, hierarchical categories and top-down reasoning, and were most popular in the 1980s. Now, with machine-learning approaches in the ascendancy, they are a backwater."

This seems to be a reference to neuro-symbolic AI, a newer approach that seeks to combine symbolic AI with machine learning. A while ago, I shared this article that talks about it:
Why Neuro-Symbolic Artificial Intelligence Is The A.I. Of The Future

Some researchers are pinning their hopes on neuro-symbolic AI overcoming the limits of current machine learning, and therefore being able to achieve a better AI capable of better FSD.

I appreciate the article's conclusion of trying to set more realistic expectations for FSD. Yes, if you are expecting FSD like in Knight Rider, then we would need true general AI. But certainly a more limited FSD, like geofenced L4 or highway L3 would not need general AI in my opinion.

And I do think there is something to be said for developing smart roads and smart infrastructure that can make the job easier for FSD cars. A big reason why L5 is so hard is that our road system sucks. Yeah, it kinda works for human drivers because we have the general intelligence to figure it out. But inconsistent traffic lights, poorly marked roads, strangely laid-out roads, etc. are hell for a computer to figure out on its own. If we could build a new road system designed for FSD cars from the get-go, it would be much easier for FSD, and worth it in the long term because of the increased safety.
 
In other news, Waymo is hosting a workshop on "Scalability in Autonomous Driving" at CVPR 2020 on June 15.

Cruise just posted a short but interesting video that shows how their sensors perceive a simulation. In the video, you can see the 3D simulation, and below it you can see how the radar, lidar, and cameras each see that simulated world:


The 3D simulation looks very realistic. For a moment, I almost thought it was real. And it is nice to see how the different sensors perceive the world. 3D simulations can't replace real-world driving, of course, but I think they can be pretty effective, especially for testing scenarios that might be hard to replicate on demand in the real world.
 
Neat! I do wonder how that Uber path prediction would do with this type of simulation. I'm specifically thinking about the pedestrians, as they don't move naturally. Most pedestrians would slow or stop when almost getting hit by a car, and they hardly walk through each other, usually opting to move around each other.
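Just to make the concern concrete, here's a toy sketch of the kind of reactive pedestrian model being described (entirely my own illustration, not anyone's actual simulator): an agent walks at full speed when the vehicle is far away, slows as it approaches, and stops when it gets too close.

```python
def pedestrian_speed(base_speed, dist_to_vehicle, stop_radius=3.0, slow_radius=8.0):
    """Reactive walking speed: full speed when the car is far away,
    linearly slowing inside slow_radius, stopping inside stop_radius.
    Distances in meters, speeds in m/s; thresholds are invented for illustration."""
    if dist_to_vehicle <= stop_radius:
        return 0.0
    if dist_to_vehicle >= slow_radius:
        return base_speed
    # Linear ramp between the stop radius and the slow radius
    frac = (dist_to_vehicle - stop_radius) / (slow_radius - stop_radius)
    return base_speed * frac

print(pedestrian_speed(1.4, 20.0))  # 1.4 m/s: car far away, walk normally
print(pedestrian_speed(1.4, 5.5))   # 0.7 m/s: car approaching, slow down
print(pedestrian_speed(1.4, 2.0))   # 0.0 m/s: car too close, stop
```

Even this crude rule would fix the "walks blindly into the car" behavior; avoiding walking through other pedestrians would need a similar repulsion term between agents.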
 
I disagree here. L5 is difficult because human drivers are unpredictable and/or don't follow the law.

Yes, that is a big factor as well. I just think that our road infrastructure certainly does not help things. I am thinking in particular of Tesla's camera-vision approach. Companies that use lidar and HD maps can get around some of these problems, but Tesla only has camera vision, which has to see and understand everything on its own. Our bad road infrastructure creates more cases that camera vision has to handle: faded lane lines that make lane keeping harder, inconsistent traffic lights that are harder to recognize, bad or missing signs, and unnecessarily complex or weird intersections that can sometimes confuse even human drivers, and that are harder still for a computer when you don't have HD maps to "cheat" with.

Imagine how much easier it would be for camera vision if every road was a nice clean, well marked road like this:

(photo: a clean, well-marked road)


Camera vision would not need to be trained to handle every weird edge case like this anymore:

(photo: a poorly maintained, badly marked road)


And imagine if there were smart traffic lights and smart stop signs that sent signals to every FSD car to coordinate when each car should stop and go. Traffic jams would be a thing of the past. And you would not need to train cars to read every traffic light and every stop sign, or train them with individual driving policies to figure out when it is safe to go.
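To make the smart-infrastructure idea concrete, here's a hedged sketch of what such a broadcast could look like. The message fields and decision rule are invented for illustration (real vehicle-to-infrastructure systems use standardized messages such as SAE J2735 SPaT); the point is that the car reads the signal state directly instead of recognizing the light visually.

```python
from dataclasses import dataclass

@dataclass
class SignalPhase:
    """A minimal signal-phase broadcast (fields invented for illustration)."""
    intersection_id: str
    approach: str          # which approach the phase applies to, e.g. "northbound"
    state: str             # "red", "yellow", or "green"
    seconds_remaining: float

def should_proceed(phase: SignalPhase, eta_seconds: float) -> bool:
    """Continue toward the intersection only on a green phase that will
    still be green when we arrive; otherwise plan to stop."""
    if phase.state == "green":
        return eta_seconds < phase.seconds_remaining
    return False

phase = SignalPhase("int-042", "northbound", "green", 6.0)
print(should_proceed(phase, eta_seconds=4.0))  # True: light stays green
print(should_proceed(phase, eta_seconds=9.0))  # False: plan to stop
```

No perception involved at all for this decision, which is exactly the appeal of smart infrastructure.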
 
"In 2015 Elon Musk, the boss of Tesla, an electric-car maker, predicted the arrival of “complete autonomy” by 2018"

I don't remember Elon using the term "complete autonomy". I wonder if perhaps the article is thinking of "feature complete".

Back in 2015 Elon was not using the term "complete autonomy" (a lazy term in any case) but he was describing full self-driving (FSD) as a system that would, among other things, take your kids where you wanted it to, without a driver, and would function as a robo-taxi, going off on its own to pick up passengers and take them where they wanted to go. Effectively he was describing L5, as the car would not need a driver. I don't remember him ever stating a completion date for it, but he sure seemed to think it was only a very few years away and that the bigger issue was regulatory approval.

It wasn't until some time after I bought my car (March, 2018) that he stopped talking about a car that would drive itself without a driver and began promising "feature complete" for the so-called "full self-driving" package. Suddenly, in Elon-speak, "full self driving" meant that for normal driving in the city, there would be no common, normal driving situations that the car could not handle some of the time as long as an alert driver is ready to take over at her/his own discretion at any time. He re-defined "FSD" when he realized that actual full autonomy (L5) is much harder and further away than he previously thought.

The honorable thing would have been to say, "I made a mistake. We are not going to be able to keep the promise we made to early buyers. Here's what we now think we can achieve in the next few years... If you paid for FSD before this date, we are refunding your money with interest and will continue to deliver the features we are able to..." I really am a great admirer of Elon Musk. He has done great things and has made the world a better place. This is why I wish he would admit his mistakes when he makes them, because everybody makes mistakes and the bigger man or woman admits them and doesn't hide behind linguistic gymnastics.

In 2015 Elon was promising L5, maybe not using that term, but definitely describing it.

Note: This is not a personal grudge. I did not pay for FSD so I have no dog in this fight. I just want to see a man I admire do the honorable thing.
 
Back in 2015 Elon was not using the term "complete autonomy" (a lazy term in any case) but he was describing full self-driving (FSD) as a system that would, among other things, take your kids where you wanted it to, without a driver... In 2015 Elon was promising L5, maybe not using that term, but definitely describing it.

Just a slight point of clarification, I agree that Elon was apparently promising L5 in 2015 but not because it would have no driver. L4 can also have no driver. So, that's not what would have made it L5. What makes Tesla's 2015 FSD description L5 is that it would have worked with no special restrictions, no geofence, pretty much anywhere, even in other countries.

It is also worth noting that Elon has avoided specifically mentioning any SAE level. During Autonomy Day, he did respond with a "yes" when a reporter asked about L5, but otherwise he has not really mentioned the SAE levels directly by name. So we are left to try to match Elon's FSD promises to what we think the SAE level might be, which has caused a lot of confusion since everyone can have a different interpretation.

Elon has offered estimates for dates for various FSD milestones. I don't have the tweets on hand but he did offer dates for when FSD would diverge from EAP, when a coast to coast FSD demo would happen, when drivers could sleep in their Tesla etc... And during Autonomy Day, he did estimate that in 2020, we would not need to pay attention to the road and that Tesla would roll out robotaxis in certain cities. But of course, Elon also couches his tweets with some vagueness and caveats like "pending regulatory approval" or "if all goes well" so that it is not definitive.

But I do agree that FSD started off as something that sounded a lot like L5 but has now been downgraded to something that sounds more like a semi-autonomous L2. And yes, I think Elon should have been more honest and admitted his mistakes.
 
Here is a 1-hour podcast interview with Cruise VP of Simulation, Tom Boyd. It is very informative. He goes into depth on how Cruise tests and develops their FSD software:

Cruise Simulation with Tom Boyd - Software Engineering Daily

Click the play button at the top to listen to the podcast or click here for the transcript: https://softwareengineeringdaily.com/wp-content/uploads/2020/06/SED1089-Cruise-Simulation.pdf

Listened to it on my commute home yesterday. Interesting stuff. They addressed realistic pedestrian motion, such as their reactions to the autonomous vehicle. This was one of my concerns about that simulation video.
 
Listened to it on my commute home yesterday. Interesting stuff. They addressed realistic pedestrian motion, such as their reactions to the autonomous vehicle. This was one of my concerns about that simulation video.

Glad you liked it.

Listening to Boyd, I will say that it sounds like perception is largely done. He talks a lot about "planning," which is the next step in autonomous driving after perception. Planning seems to be the big part of autonomous driving that Cruise is focused on improving now, and it has proven very challenging because of all the different driving scenarios and behaviors a car might face on the road. Tesla talks a lot about perception but does not really talk about planning much. Not to sound overly critical of Tesla, but it seems like Tesla is still on "step 1" of FSD, i.e. perception, and has not really gotten to the really tough parts of FSD like planning yet. I would love to hear more from Tesla on what they are doing with planning and driving policy.
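The perception-then-planning split is usually drawn as a pipeline of stages. A bare-bones sketch of that structure (the function bodies and the 1.5-second threshold are placeholders I made up, not anyone's actual stack):

```python
def perceive(sensor_frame):
    """Perception: turn raw sensor data into a list of detected objects.
    (Placeholder: returns a fixed detection for illustration.)"""
    return [{"kind": "car", "position": (12.0, 0.5), "speed": 8.0}]

def predict(objects):
    """Prediction: estimate where each object will be a short time from now."""
    horizon = 1.0  # seconds
    return [{**o, "future_x": o["position"][0] + o["speed"] * horizon}
            for o in objects]

def plan(predictions, ego_speed):
    """Planning: pick a simple action given the predicted scene.
    Brake if the nearest predicted object is within 1.5 s of travel."""
    nearest = min(p["future_x"] for p in predictions)
    return "brake" if nearest < ego_speed * 1.5 else "keep_speed"

# One tick of the pipeline: perception -> prediction -> planning
action = plan(predict(perceive(sensor_frame=None)), ego_speed=10.0)
print(action)  # keep_speed
```

Even in this toy version you can see why planning is the hard part: perception and prediction produce data, but planning has to encode judgment about every possible scene.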
 
Just a slight point of clarification, I agree that Elon was apparently promising L5 in 2015 but not because it would have no driver... But I do agree that FSD started off as something that sounded a lot like L5 but has now been downgraded to something that sounds more like a semi-autonomous L2.

Here's one way I agree with Elon: I don't care about the SAE level definitions. Elon said that FSD meant no driver. I don't care whether that's level 4 or 5. A car need not conform exactly to a particular level to meet my needs. As the next step I'd be willing to pay for, I want to be able to ignore the road. As my ideal car, I want to be able to take a nap in the back while the car is my chauffeur.

I would love it if this year we would not need to pay attention to the road. I would bet serious cash money that will not happen in any meaningful way for any of the roads I routinely drive on. (No freeways here, so freeway-only L3 wouldn't help me.)
 
Here's one way I agree with Elon: I don't care about the SAE level definitions... As my ideal car, I want to be able to take a nap in the back while the car is my chauffeur.

Yes, both L4 and L5 can do what you want: drive you around while you completely ignore the road. The difference is that L4 can only do it in some areas, whereas L5 can do it everywhere. So an L4 car can be your chauffeur only if it happens to work where you live, while an L5 car is guaranteed to do it wherever you are. So yes, the distinction matters.
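One way to picture that distinction: an L4 system may only engage inside its operational design domain (ODD), while a notional L5 system has no geographic restriction at all. A toy sketch (the region names are invented examples, not any company's actual service areas):

```python
# Toy model of the SAE L4 vs L5 distinction: L4 drives itself only inside
# its operational design domain (ODD); L5 would have no geographic limit.

L4_ODD = {"Phoenix suburbs", "downtown San Francisco"}  # invented examples

def l4_can_engage(location: str) -> bool:
    """An L4 system engages only inside its geofenced ODD."""
    return location in L4_ODD

def l5_can_engage(location: str) -> bool:
    """An L5 system, by definition, has no ODD restriction."""
    return True

print(l4_can_engage("Phoenix suburbs"))  # True: inside the geofence
print(l4_can_engage("rural Montana"))    # False: outside the ODD
print(l5_can_engage("rural Montana"))    # True: works everywhere
```

Within the ODD, the rider experience is identical; the level only tells you where that experience is available.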
 
Just a slight point of clarification, I agree that Elon was apparently promising L5 in 2015 but not because it would have no driver... And yes, I think Elon should have been more honest and admitted his mistakes.

Unfortunately, that’s not how capitalism works.
 
Elon's comments from 2015:
Elon Musk Says Tesla Vehicles Will Drive Themselves in Two Years

Tesla’s Elon Musk Says Autonomous Driving Not All That Hard to Achieve
To be fair to Elon, I think WSJ misquoted him. I think he actually said: "I almost view it as a solved problem."

This is where he is quoted from: the interview starts 51 minutes into the video, and the actual quote is at 58:30.

Yes, you are correct. He says "I almost view it like a solved problem". Here is the entire quote in more context:

"Highway cruise is easy. Low speed is easy. Intermediate is hard. Being able to recognize what you are seeing and make the right decisions in the suburban environment, in that 10 to 50 mph zone, is the challenging portion. This is going to sound complacent, but I almost view it like a solved problem. We know exactly what to do, and we'll be there in a few years."

He also says this at the start of the interview, around the 51-minute mark:
"I don't think we have to worry about autonomous cars because it's a sort of a narrow form of AI. It's not something I think is very difficult. To do autonomous driving that is to a degree much safer than a person, is much easier than people think."

IMO, Elon comes across as rather simplistic, definitely underestimating the challenges of autonomous driving. He seems to think autonomous driving is just a matter of using camera vision to tell the car where everything is and then telling the car what to do, but actually implementing that approach has been much harder than expected.