Welcome to Tesla Motors Club

Elon: "Feature complete for full self driving this year"

So... about those HD maps. Can we continue discussions from the agreed-upon understanding that if you need to operate on the assumption that map data could be inaccurate or out of date, you can functionally operate without it?

Waymo and Cruise are not consumer vehicles. While I can appreciate the comparisons on a technical-curiosity level, any system that reduces the rate of accidents, even if just while on the forward path to "FSD" claims, is in fact good enough.

No, HD maps don't work like that.

Many things are relatively fixed. Roads rarely move, and buildings only move relatively slowly, usually one at a time. If the car notices that a building is missing but the other features it expects are there, it can cope with that.

Things could get bad enough that the car can't cope (some kind of major terrorist incident or a large earthquake, perhaps), but in that case, as long as the car can safely stop, it's fine.
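The "missing building" coping strategy described above can be sketched as a simple match-ratio check. This is a toy illustration under assumed conditions: all names are hypothetical, and it pretends the map supplies a set of expected landmark IDs and perception a set of observed ones, whereas real localizers score geometric feature matches, not ID sets.

```python
# Toy sketch of tolerating stale map features, as described above.
# All names here are hypothetical, invented for this example.

def localization_confidence(expected: set, observed: set) -> float:
    """Fraction of map-expected landmarks actually seen by perception."""
    if not expected:
        return 0.0
    return len(expected & observed) / len(expected)

def can_continue(expected: set, observed: set, threshold: float = 0.7) -> bool:
    """One demolished building is fine as long as most landmarks still match;
    below the threshold the car should fall back to a safe stop."""
    return localization_confidence(expected, observed) >= threshold

# A demolished building drops one of ten expected landmarks:
# 9/10 = 0.9 confidence, so the car keeps driving.
print(can_continue(set(range(10)), set(range(9))))  # True
```

The 0.7 threshold is made up; the point is only that the decision degrades gracefully with one missing feature and hard-fails (safe stop) after a widespread change like the earthquake scenario above.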
 
Just an interesting note, I noticed stop lights in Redwood City have been updated with yellow borders like this:

Retroreflective Borders on Traffic Signal Backplates – A South Carolina Success Story - Safety | Federal Highway Administration

But some of them have an alternating black/yellow dashed border. These could be useful in helping autonomous vehicles detect stop lights.
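A hand-wavy way to see why a high-contrast yellow border would be cheap for a vision system to pick out: saturated yellow means high red, high green, low blue, so even a crude channel threshold separates it from a dark backplate. The thresholds and the synthetic image below are made up for illustration; production perception stacks use learned detectors, not hand-tuned thresholds.

```python
import numpy as np

# Toy illustration only: hand-picked thresholds on a synthetic image,
# not how a real traffic-light detector works.

def yellow_mask(rgb: np.ndarray) -> np.ndarray:
    """Boolean mask of saturated-yellow pixels (high R, high G, low B)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (r > 180) & (g > 150) & (b < 100)

# Synthetic 8x8 "backplate": dark interior, 1-pixel yellow border.
img = np.zeros((8, 8, 3), dtype=np.uint8)
img[0, :], img[-1, :], img[:, 0], img[:, -1] = 4 * ([255, 215, 0],)

mask = yellow_mask(img)
print(mask.sum())  # 28 border pixels on an 8x8 frame
```

The border lights up cleanly while the dark interior contributes nothing, which is roughly the contrast benefit the FHWA backplate study is exploiting for human drivers too.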

OMG!

What an excellent find!

But let me get this straight:

1. Extremely effective in reducing crashes.

2. Extremely low cost.

3. This paper was published in 2009, AND WE DON'T HAVE THESE ON EVERY FRIGGIN' SIGNAL YET?!?

Simply amazing.

Will send the link to my local City Council, County Government, and State Official TODAY.

Jaw-dropping that so little action has taken place in ELEVEN YEARS!
 
No, HD maps don't work like that.

Many things are relatively fixed. Roads rarely move, and buildings only move relatively slowly, usually one at a time. If the car notices that a building is missing but the other features it expects are there, it can cope with that.

Things could get bad enough that the car can't cope (some kind of major terrorist incident or a large earthquake, perhaps), but in that case, as long as the car can safely stop, it's fine.
I think HD maps should be secondary. First job of the car is:
1. Only drive on drivable area.
2. Don't drive on objects.
3. Don't drive into other vehicles, and avoid other vehicles' expected paths.

1 and 2 are hard enough, but solvable using sensors, cameras, and image recognition. If your map says a path is drivable and the sensors say it's not, then it's not.

If there's snow and the road markings are not visible, there's a whole lot of drivable area, and your HD maps will be useful for deciding where you should be driving. Human drivers use their mental "HD maps" for this too: if someone repainted the road markings during the winter and the snow then covered them again, most drivers would drive in the wrong place.

Also, HD maps are useful for picking a sensible path through intersections. You won't crash without them, since job 3 is to avoid other vehicles, but you need them to drive sensibly and not cause chaos in complex situations that cannot be resolved purely by vision.
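The precedence this post argues for (sensors can always veto the map; the map only helps choose among already-safe options, as in the snow example) can be sketched as a toy decision function. Everything here is hypothetical and invented for illustration; real stacks fuse probabilistic occupancy grids, not per-cell booleans.

```python
# Toy sketch of "HD maps are secondary", per the rules above.
# Hypothetical names; a real planner works on probabilistic grids.

def cell_is_drivable(sensor_says_drivable: bool, map_says_drivable: bool) -> bool:
    """Jobs 1 and 2: the map can never override a sensor veto."""
    if not sensor_says_drivable:
        return False  # map says road, sensors say obstacle: not drivable
    return True       # sensors say clear: drivable, with or without the map

def preferred_cell(cells):
    """Among sensor-drivable cells, prefer ones the map also endorses
    (e.g. the snowed-over lane the map remembers)."""
    drivable = [c for c in cells if cell_is_drivable(c["sensor"], c["map"])]
    # The map acts only as a tie-breaker among safe options,
    # never as a source of safety.
    drivable.sort(key=lambda c: c["map"], reverse=True)
    return drivable[0] if drivable else None
```

The design point is the asymmetry: a map "yes" plus a sensor "no" yields "no", while a sensor "yes" stands on its own, which is exactly the "if the sensors say it's not drivable, then it's not" rule above.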
 
City NoA isn't anywhere in sight until AP can do the simple task of staying in a lane, especially on winding roads. AP still can't do that.

Lane centering on a winding road is more a rural driving scenario than a "City NOA" scenario IMO. But there are a ton of reasons why "City NOA" is a long way off. For one, we have yet to see how reliable handling traffic lights will be. If AP can't reliably stop at a red light, "City NOA" has no chance of happening. Plus, there are a ton of other scenarios that "City NOA" will need to be able to handle. Watch any autonomous driving demo that involves city driving and you will see a ton of them. So far, all we have from Tesla is that stopping at red lights is in early access to gather more data. We have yet to see any indication from Tesla that "City NOA" can handle any other city driving scenario, like pulling over for emergency vehicles, yielding the right-of-way at an intersection with just stop signs, navigating a narrow street where you have to pass a double-parked car, etc.

Here is just a small sample of what "City NOA" would need to be able to do reliably:


I hope I am wrong, but I am a little concerned that Tesla will essentially just put out L2 "traffic light and stop sign response" and L2 "turning at intersections" and declare that they have achieved Full Self-Driving because the car can handle most commutes (with driver supervision). Tesla "fanboys" will say that since their cars are driving on city streets, stopping at red lights and stop signs, and making turns at intersections, all without them holding the wheel, Tesla has achieved autonomous driving.
 
Lane centering on a winding road is more a rural driving scenario than a "City NOA" scenario IMO.

I disagree with this. Turning at intersections is more complicated (and involves tighter turns) than simply following the lane on a winding road, and AP still can't even reliably stay within the lane on winding roads. I agree that city NoA is too big a leap right now, or even within the next year... Lol I hope I'm wrong
 
I disagree with this. Turning at intersections is more complicated (and involves tighter turns) than simply following the lane on a winding road, and AP still can't even reliably stay within the lane on winding roads. I agree that city NoA is too big a leap right now, or even within the next year... Lol I hope I'm wrong

I just see the two situations as very different problems. Lane centering on a winding road is just lane centering like AP already does, only with sharper curves. Turning at intersections is definitely more difficult. It does not involve lane centering but requires understanding the "map" of the intersection, as well as excellent perception of all the other traffic vehicles. It also requires excellent path finding. But I concede your point: if AP can't do the easier situation, how can it do the more difficult one?
 
City NoA isn't anywhere in sight until AP can do the simple task of staying in a lane, especially on winding roads. AP still can't do that.
One is not like the other.

Besides, on winding roads around here, AP has been quite good. Where it gets confused is when side roads join the winding road and it's not clear what "going straight" means. When they implement city NOA, it will actually help AP, because it will know which side of the "fork" to take.
 
Besides, on winding roads around here, AP has been quite good. Where it gets confused is when side roads join the winding road and it's not clear what "going straight" means. When they implement city NOA, it will actually help AP, because it will know which side of the "fork" to take.

I will say that AP has gotten progressively better in this scenario. When I first got my Model 3, it would "dive" towards the side road before self-correcting. Now, on HW3, it goes straight and only occasionally "nudges" the steering wheel to the side for a micron before self-correcting.
 
The lane keeping on 2020.12.5 still has not progressed much. Honestly, it doesn't seem to have progressed more than "5-10%" in the last year.

2020.12.5 is not the rewrite yet.

I agree. I've noticed some small incremental improvement in lane keeping but nothing huge. It seems to drive straighter through intersections. The phantom braking I was getting on 2020.8.2 appears to be gone as well.

But I would agree that the improvement is incremental and definitely no indication of the major rewrite yet.
 
I've never had this much cognitive dissonance about a technology and been wrong about it, but it really doesn't seem like Tesla will be able to achieve a reliable robotaxi-like service with the current hardware and technology.

The leap from the current AP system to any form of reliable FSD seems insurmountable.

The only leap that was remotely similar is SpaceX landing rockets on autonomous barges. So I guess there's some merit, but even then, we got to see SpaceX get closer and closer to achieving it. With AP, we haven't seen it get closer and closer to that level of reliability.
 
I've never had this much cognitive dissonance about a technology and been wrong about it, but it really doesn't seem like Tesla will be able to achieve a reliable robotaxi-like service with the current hardware and technology.

The leap from the current AP system to any form of reliable FSD seems insurmountable.

TL;DR Elon was not honest about FSD. He made it sound closer than it was. Teslas don't have enough sensors for proper perception and rely too much on "solving camera vision," which is not yet as advanced as it needs to be.

1) Elon's rhetoric did not match reality, which created a disconnect between expectations and outcomes. Especially back in 2016-17, Elon made it sound like FSD was right around the corner because Tesla just needed to finish some software and validate it. Many Tesla fans believed this, so they had high expectations that they would get FSD "soon" or at least see significant progress. But the reality is that Tesla has really only scratched the surface of autonomous driving. This lack of progress compared to their expectations forces people to re-evaluate when, or if, Tesla can do FSD. I used to be the biggest Tesla fanboy, totally believing that FSD was coming in a couple of years. But when I started seeing the lack of progress, especially compared to FSD leaders like Mobileye, Waymo or Cruise, I had to admit to myself that Tesla's FSD was still a long way off.

2) Tesla has inadequate sensors to solve perception. For an autonomous car, perception is the foundation. It has to be solved first, because everything else depends on the car being able to reliably "see" the world around it. If you lack the proper sensors for reliable perception, developing autonomous driving will be a serious struggle. Everybody else uses cameras, lidar and radar to give themselves reliable and redundant perception. Tesla rejected that approach in favor of a camera-centric one, because Elon believes that cameras alone give you enough information to do perception. Now, Mobileye has developed a really good FSD demo prototype with cameras only, so we might expect that Tesla could at least get some type of FSD demo with cameras only. But Mobileye uses 12 high-res cameras for their demo, compared to Tesla's 8. So Tesla did not just go with a camera-only system; they went with a minimalist camera-only system. When you have only the bare minimum of sensors, solving perception is that much harder.

3) Tesla's camera vision is not as developed as we thought. Tesla's camera-centric approach is entirely dependent on solving camera vision. With excellent camera vision, they might be able to do some type of FSD demo, similar to what Mobileye has done with their camera-only prototypes. But the truth is that Tesla's vision is not as advanced as we thought. Mobileye, Waymo and others all have better camera vision. For example, Waymo's camera vision can distinguish between regular cars and emergency vehicles. It can see when the driver-side door of a parked car is open (indicating a driver might step out in front of you). It can read street addresses. So not only is Tesla dependent on camera vision to do perception, their camera vision is not developed enough. Hence, Tesla is still slow at making the jump from L2 to autonomous driving, since they haven't finished the perception part, which is just the foundation for autonomous driving.
 
TL;DR Elon was not honest about FSD. He made it sound closer than it was. Tesla's don't have enough sensors for proper perception and rely too much on "solving camera vision" which is not yet as advanced as it needs to be.
Seems you need to drop the shtick already. Have we actually seen anything factual showing that Elon was dishonest? Failing to meet expectations or timelines is not dishonesty. Or do you speak from the perspective of knowing what was in his heart of hearts? On top of that, how many other companies or leaders in the field had similar perspectives on timelines? Quite a few, and you know that. Likewise, I have not seen, and do not understand, how you can state with any semblance of fact what "enough sensors" actually is, while conversely stating, in effect, that camera vision is enough once it is advanced enough. Absolutism, especially about something like this, is just dumb.
 
Seems you need to drop the shtick already. Have we actually seen anything factual showing that Elon was dishonest? Failing to meet expectations or timelines is not dishonesty. Or do you speak from the perspective of knowing what was in his heart of hearts? On top of that, how many other companies or leaders in the field had similar perspectives on timelines? Quite a few, and you know that. Likewise, I have not seen, and do not understand, how you can state with any semblance of fact what "enough sensors" actually is, while conversely stating, in effect, that camera vision is enough once it is advanced enough. Absolutism, especially about something like this, is just dumb.

Errr... California requires companies testing FSD systems to file with the DMV. Around November 2016, Tesla released a video showing FSD driving a long mixed route with the caption ~'Pending regulatory approval'. But the DMV logs told a different story: large numbers of errors in the tests, then testing stops for a while, then right before the video they do 3(?) tests. Then they stop testing for years.
It doesn't take a rocket surgeon to figure out what happened.

The system was not ready as intended. They needed a publicity video, fired up their best car, geofenced or learned a loop, and as soon as they got enough video, they went back to the drawing board. But they knew 'regulatory approval' wasn't the problem. Remember, at this point in time a Tesla couldn't see a stalled car on the road, so it was obviously not ready.

Not slamming them, just letting you know the history of FSD. They were deceptive in November 2016, and it was Elon who announced the video release, so he knew its true status and that regulatory approval was not the problem. Elon is a human. Some people struggle with that. I used to write software, so I completely understand why he did what he did, and I'd probably have done the same. "Gosh, just a few more bug fixes and I can release it" has gotten a lot of people in hot water.
 
Seems you need to drop the shtick already. Have we actually seen anything factual showing that Elon was dishonest? Failing to meet expectations or timelines is not dishonesty. Or do you speak from the perspective of knowing what was in his heart of hearts? On top of that, how many other companies or leaders in the field had similar perspectives on timelines? Quite a few, and you know that. Likewise, I have not seen, and do not understand, how you can state with any semblance of fact what "enough sensors" actually is, while conversely stating, in effect, that camera vision is enough once it is advanced enough. Absolutism, especially about something like this, is just dumb.

1) It's not a schtick. A schtick is a gimmick or a comedy routine. I am not doing a comedy act. I am expressing a serious opinion that I think is based on some logic and fact.

2) Tesla sold AP2 hardware in 2016 claiming it was "FSD capable". They sold an FSD package, saying that FSD only needed regulatory approval, despite the fact that they repeatedly reported ZERO autonomous miles to the CA DMV. Then Elon repeatedly promised FSD, coast-to-coast demos, FSD divergence from EAP, etc., and missed those deadlines. So yes, when you sell a product that is not ready yet and repeatedly make promises that don't pan out, I think there is a big disconnect.

The fact is that there is a big difference between missing FSD promises when you have no real FSD yet (Tesla) and companies who miss deadlines but actually have some autonomous driving (Waymo or Cruise).

3) I am basing "enough sensors" on what the leaders in autonomous driving are using, like Waymo, Cruise and Mobileye. They are the ones who actually have achieved real autonomous driving so I think that's a good standard for what sensors are needed.

When did I ever state that camera vision is good enough when it is "advanced enough"? First of all, how do you define "advanced enough"? That is very vague. I said Mobileye has a demo prototype with cameras only, but I never said it was good enough for deployment. The fact is that Mobileye is still planning to include lidar when they actually deploy their L4/L5 system to customers on public roads, because they don't think camera-only is good enough to meet the safety and reliability requirements for deployment.

But again, I am just going by what I see the leaders in autonomous driving doing. The current leaders who have real autonomous driving now have some of the best camera vision on the planet and still use lidar. That tells me that lidar is still required for safe, reliable, autonomous driving.

No offense but when it comes to FSD, who should I trust more? The companies with actual autonomous driving or the company with only L2 and camera vision that is still a work in progress?
 
The only redeeming thing about Elon's FSD predictions is that it seems Tesla will be the first to achieve a mass-market FSD system. If any other company solves the perception problem with cameras, there's little doubt that Tesla's team would quickly adapt their approach and achieve similar results with less hardware.

What I'm saying is solving vision is a requirement for FSD. Tesla will be the first to solve it, if it's ever solved with the current tech, but Elon's timeline has been wrong time and time again. Some of us have been following Elon's FSD predictions for 3-4 years... Lol