Welcome to Tesla Motors Club

Autonomous Car Progress

diplomat33

Well-Known Member
Aug 3, 2017
6,873
7,862
Terre Haute, IN USA
Is FSD a forward-only driving system? What about reversing?

It's a rainy day. You're reversing down your driveway into the street and some yahoo zips past you at 60mph. Or your driveway is on an angle so the yahoo approaches you at 60mph. If radar is a better way of detecting this, and your car doesn't have rear & side radar, then isn't that a problem?

Do we have tunnel vision and just say, well nobody would reverse onto a street where traffic is going 60mph, so that's that.

Seems like 360° radar is needed if you need it at all.

You mention some good scenarios for why 360-degree HD radar is needed. You are right that FSD is not forward-only: you need to track objects coming from the side and rear too, which is why all reliable autonomous cars have 360-degree radar. Of course, you need 360-degree camera vision too, but 360-degree radar adds redundant coverage in conditions like rain or snow where cameras might struggle. So yes, I think 360-degree HD radar is needed.
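To put rough numbers on the driveway scenario: a back-of-envelope time-to-collision sketch. The 2-second perceive-and-brake budget is an illustrative assumption, not any manufacturer's spec:

```python
# Rough time-to-collision (TTC) for the reversing-driveway scenario:
# how far away must a 60 mph cross-traffic car be detected to leave
# the reversing car time to react and stop?

MPH_TO_MS = 0.44704

def time_to_collision(range_m: float, closing_speed_mph: float) -> float:
    """Seconds until a car at `range_m` meters reaches us."""
    return range_m / (closing_speed_mph * MPH_TO_MS)

# A car doing 60 mph covers ~26.8 m/s.
speed = 60 * MPH_TO_MS

# Suppose the system needs ~2 s to perceive, decide, and brake.
reaction_budget_s = 2.0
min_detection_range = speed * reaction_budget_s

print(f"60 mph = {speed:.1f} m/s")
print(f"Detection range needed for a {reaction_budget_s:.0f} s budget: "
      f"{min_detection_range:.0f} m")
print(f"TTC at 30 m: {time_to_collision(30, 60):.2f} s")
```

The takeaway: with any realistic reaction budget, the sensor has to see cross traffic at 50+ meters, well beyond ultrasonic range.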
 
Last edited:
  • Like
Reactions: Matias and Dan D.

powertoold

Active Member
Oct 10, 2014
1,939
3,530
USA
So far, I've only seen Waymo cars go in reverse autonomously (during a three point turn). Otherwise, autopark also goes in reverse, but I don't think that counts :D

Come to think of it, not-a-smart summon goes in reverse as well.
 
  • Funny
Reactions: Dan D.

daniel

Active Member
May 7, 2009
4,738
3,562
Kihei, HI
FSD isn't about perfection or dealing with every extreme case.

True. It's about achieving a safety record that's better than human drivers. But it's hard for me to imagine how this can be accomplished without 360° sensors. And since cameras are weak at measuring distance (Tesla, I believe, uses radar for TACC) I am of the opinion that some sort of active sensors such as lidar or radar will be needed all the way around.
 
  • Like
Reactions: diplomat33

mspisars

Active Member
May 23, 2014
2,032
1,367
Charlotte, NC
Apologies if this has been addressed already or if this is the wrong thread, but I am wondering how any of these autonomous driving technologies deal with busy pedestrian areas such as school zones. Often, we humans look at body motions and gestures to understand the intent of crossing guards and people trying to cross the street. Do any of the systems work in these scenarios?
There is a good example of this in the TesLatino video on FSD; others on FSD have had examples as well.
See the post below and watch the link at the 9:10 timestamp (the guy in the center of the road).
Potentially interesting situation from TesLatino with a pedestrian walking at a crosswalk without right of way:
View attachment 627304
FSD beta didn't slow down at all, and that was the correct behavior expected by the pedestrian, who was trying to walk right behind the Tesla in between moving vehicles. Unclear if Autopilot actively understood that or just believed it had right of way, e.g., stopping for a crosswalk is ignored at a green light.
 

mspisars

Active Member
May 23, 2014
2,032
1,367
Charlotte, NC
And one more hardware upgrade they'll have to provide for people who paid for FSD
Definitely disagree with this...
The hardware suite will always be upgraded over time... or do you expect the current FSD computer to stay the same for decades?
Even during the Autonomy Day presentation, when they revealed the FSD computer, they said they were already starting on the next-gen computer.

That goes for sensors as well.
Plus this video of Autopilot vs. FSD on the same stretch of road (4... almost 5 disengagements on Autopilot, 0 - zero - on FSD)

They are not even using the full potential of the current hardware suite yet.
 
Last edited:

powertoold

Active Member
Oct 10, 2014
1,939
3,530
USA
It's about achieving a safety record that's better than human drivers. But it's hard for me to imagine how this can be accomplished without 360° sensors. And since cameras are weak at measuring distance (Tesla, I believe, uses radar for TACC) I am of the opinion that some sort of active sensors such as lidar or radar will be needed all the way around.

Possibly; time will tell. The evidence is pointing at cameras being enough. The birds-eye-view predictions are already at a superhuman level.

If the side camera distance predictions are the last thing Tesla needs to improve to achieve FSD, then they're quite close IMO. I actually don't think that's the main challenge right now. The challenge is and has been lane semantics. Tesla can't rely on predefined maps all the time to figure out lane semantics.
 
  • Like
Reactions: DanCar

mspisars

Active Member
May 23, 2014
2,032
1,367
Charlotte, NC
And since cameras are weak at measuring distance
Cameras do not measure distance. (NOTE: Tesla has two sets of sensors covering 360°: cameras and ultrasonics.)
It is the NNs that determine distance from the images the cameras feed them.
The beauty of that is that instead of having a "weak measuring sensor" you now have a perpetually improving measuring tool!
The more data that is fed through for training the NNs, and the more miles it proves accurate, the more reliable it gets.
Again, this video is a great example.
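For what it's worth, the geometry those NNs are implicitly learning can be sketched with a simple pinhole-camera model. The focal length and car height below are made-up illustrative values, not Tesla camera specs:

```python
# Monocular distance from a pinhole-camera model:
# distance = focal_length_px * real_height_m / height_in_image_px.
# A trained NN learns a far richer mapping, but this is the core geometry.

def monocular_distance(focal_px: float, real_height_m: float,
                       pixel_height: float) -> float:
    """Estimate range to an object of known real-world height."""
    return focal_px * real_height_m / pixel_height

# Illustrative numbers: a 1.5 m tall car, camera focal length 1000 px.
for px in (150, 75, 30):
    d = monocular_distance(1000, 1.5, px)
    print(f"{px:>3} px tall in the image -> ~{d:.0f} m away")
```

It also shows the weakness people worry about: at long range an error of a few pixels shifts the estimate by many meters, so monocular distance degrades with range.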
 
  • Like
Reactions: powertoold

powertoold

Active Member
Oct 10, 2014
1,939
3,530
USA
Cameras do not measure distance. (NOTE: Tesla has two sets of sensors covering 360°: cameras and ultrasonics.)
It is the NNs that determine distance from the images the cameras feed them.
The beauty of that is that instead of having a "weak measuring sensor" you now have a perpetually improving measuring tool!

Yup, that's the beauty of NNs. Deep Rain was still a joke just 6 months ago. Nowadays, I leave my wipers on auto, and it's been working well.

I don't know if you guys have noticed this, but the side cameras are positioned in a way that gives a similar angular vantage point to the front cameras. This lets Tesla train the side cameras' distance predictions on the same kind of images as the front cameras. For example, on a two-lane road with opposing traffic, the front camera predicts oncoming-car distances with its radar and camera together. The same data can then be used for cars seen by the side cameras as well (without the radar, though). Not sure if I'm explaining this properly.
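The training trick described above, keeping only camera detections where radar confirms the range and using those as labels, might look something like this sketch. All names, fields, and the filtering criterion are hypothetical, not Tesla's actual pipeline:

```python
# Sketch of radar-supervised auto-labeling: frames where radar confirms
# a target's range become (pixel_height, range) training labels for a
# camera-only distance network. Names and thresholds are hypothetical.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Detection:
    camera: str                      # "front", "left_repeater", ...
    bbox_height_px: float            # detected car's height in the image
    radar_range_m: Optional[float]   # None when no radar covers this view

def harvest_labels(detections: List[Detection],
                   max_range_m: float = 100.0) -> List[Tuple[float, float]]:
    """Keep (pixel_height, range) pairs wherever radar gives ground truth."""
    return [(d.bbox_height_px, d.radar_range_m)
            for d in detections
            if d.radar_range_m is not None and d.radar_range_m <= max_range_m]

dets = [
    Detection("front", 120.0, 12.5),          # radar-verified -> usable label
    Detection("front", 40.0, 37.0),           # radar-verified -> usable label
    Detection("left_repeater", 60.0, None),   # no radar: needs learned model
]
labels = harvest_labels(dets)
print(f"{len(labels)} radar-verified labels harvested")
```

The design point: the radar never has to cover the sides at inference time; it only has to have covered geometrically similar views at training time.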
 
  • Like
Reactions: mspisars

mspisars

Active Member
May 23, 2014
2,032
1,367
Charlotte, NC
Yup, that's the beauty of NNs. Deep Rain was still a joke just 6 months ago. Nowadays, I leave my wipers on auto, and it's been working well.
This is another great example.
The pain and gnashing of teeth on this forum about the rain NN was something to behold.

I have not touched the Auto setting on my Y since I got it! Freaking better than the independent sensor on my X... which will turn on the wipers when I pull out of the garage on a bright sunny day!
 

powertoold

Active Member
Oct 10, 2014
1,939
3,530
USA
Are there any features still missing from the FSD beta that are expected to be included? (for the US release, that is)

Not sure what you mean? Like features missing for it to be actually "FSD"? Tesla never really provided us with a list of expected FSD features. Elon mentions "feature complete" but doesn't go into detail on that either.

FSD beta has yet to demonstrate three point turns and reversing on roads (in rare gridlock type situations).
 
  • Like
Reactions: petit_bateau

Bladerskb

Senior Software Engineer
Oct 24, 2016
2,072
2,316
Michigan
Reversing is done slowly, so ultrasound may suffice.

Ultrasonics have a range of around 5 meters and are not very accurate. Is that what you are relying on for moving cars?
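For context on that ~5 m figure: ultrasonic sensors range by timing the echo of a sound pulse, which also makes them slow for fast cross traffic. The numbers below are generic ultrasonic physics, not specs for Tesla's parts:

```python
# Ultrasonic ranging: emit a ping, time the echo, halve the round trip.

SPEED_OF_SOUND_MS = 343.0   # m/s in air at ~20 degrees C

def echo_distance(round_trip_s: float) -> float:
    """Distance to an obstacle from the echo's round-trip time."""
    return SPEED_OF_SOUND_MS * round_trip_s / 2

# A target at the ~5 m limit takes ~29 ms for the echo to return...
t_5m = 2 * 5.0 / SPEED_OF_SOUND_MS
print(f"Echo time at 5 m: {t_5m * 1000:.1f} ms")

# ...while a 60 mph car (~26.8 m/s) moves most of a meter in that window.
print(f"Car travel during one ping: {26.8 * t_5m:.2f} m")
```

So even ignoring accuracy, the physics of sound makes ultrasonics a poor match for detecting fast-approaching vehicles.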

Cameras do not measure distance. (NOTE: Tesla has two sets of sensors covering 360°: cameras and ultrasonics.)
It is the NNs that determine distance from the images the cameras feed them.
The beauty of that is that instead of having a "weak measuring sensor" you now have a perpetually improving measuring tool!
The more data that is fed through for training the NNs, and the more miles it proves accurate, the more reliable it gets.
Again, this video is a great example.

There we go with the data nonsense again. It has nothing to do with a different NN or feeding more data. Both firmwares are using the exact same NNs, per verygreen. The difference is the conventional C++ control algorithm.

Possibly, time will tell. The evidence is pointing at cameras being enough. The birds-eye-view predictions are super human level already.

They are clearly not, when there are multiple safety-related disengagements per trip in Tesla evangelist videos.

The challenge is and has been lane semantics. Tesla can't rely on predefined maps all the time to figure out lane semantics.
Here you go again with false statements. A lot of the disengagements have nothing to do with lane semantics.

If the side camera distance predictions are the last thing Tesla needs to improve to achieve FSD, then they're quite close IMO.

Again, a statement based on nothing but lies and fairy tales.
Just like your prediction of Level 5 with a disengagement every 150k in 6 months, which has 3 months left.
How is that working out for you?

Yet you continue to push BS without any underlying fact or evidence-based qualifier.
 

Dan Detweiler

Active Member
Apr 21, 2016
3,005
12,474
Canton, Georgia
True. It's about achieving a safety record that's better than human drivers. But it's hard for me to imagine how this can be accomplished without 360° sensors. And since cameras are weak at measuring distance (Tesla, I believe, uses radar for TACC) I am of the opinion that some sort of active sensors such as lidar or radar will be needed all the way around.
Can the average schmuck driving a Corolla see 360 degrees around their car at all times? Can they operate the car safely? I don't see why FSD is expected to do things we would never expect a human driver to do.

Dan
 

daniel

Active Member
May 7, 2009
4,738
3,562
Kihei, HI
Definitely disagree with this...
The hardware suite will always be upgraded over time... or do you expect the current FSD computer to stay the same for decades?
Even during the Autonomy Day presentation, when they revealed the FSD computer, they said they were already starting on the next-gen computer.

That goes for sensors as well.

When I bought my Model 3, Tesla said that my car had all the necessary hardware for the car to be able to drive itself without a human in it, and would be able to pick the kids up and bring them home from school, etc. Tesla said all that was needed was for the software to be completed and given regulatory approval.

This is why people who paid for FSD at that time were given the new computer for free: Because Tesla had given them a contractual promise that their payment would get them true and genuine FSD (not the watered down "feature complete" that Elon started talking about later).

When Tesla decides that more hardware is needed to fulfill their contractual obligation, they'll have to provide that for free also. Those legacy buyers of FSD were told they would get real driverless operation without paying any additional amount. If Tesla does accomplish FSD but needs more hardware to do it, it will have to upgrade or replace those cars. If it does not accomplish FSD within the normal expected lifetime of those cars, it will be in default, as it promised "if you pay now, your car will do thus-and-such." Those are the risks of promising to deliver something that does not yet exist. If a farmer sells futures and his crop fails, he is legally obligated to pay the value of the crop he was unable to deliver. Tesla sold FSD futures.

Can the average schmuck driving a Corolla see 360 degrees around their car at all times? Can they operate the car safely? I don't see why FSD is expected to do things we would never expect a human driver to do.

Dan

When I'm driving I can see if a car is careening at me from the side or from behind. An autonomous car needs to be able to do the same. Camera data is inadequate to determine the speed of another vehicle, which is why Tesla uses radar for TACC. It needs radar (or lidar if they prefer, which Tesla doesn't) for side- and rear-approaching vehicles as well.
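The physical reason radar reads speed directly, where a camera has to infer it from frame-to-frame position changes, is the Doppler shift. A minimal sketch using a generic 77 GHz automotive radar band (illustrative numbers, not Tesla's unit):

```python
# Radar gets relative speed in a single measurement from the Doppler
# shift: v = f_doppler * c / (2 * f_carrier). A camera must instead
# difference noisy position estimates across frames.

C = 299_792_458.0    # speed of light, m/s
F_CARRIER = 77e9     # generic automotive radar band, Hz

def doppler_speed(f_doppler_hz: float) -> float:
    """Closing speed (m/s) from the measured Doppler shift."""
    return f_doppler_hz * C / (2 * F_CARRIER)

# A car closing at 26.8 m/s (60 mph) produces a ~13.8 kHz shift:
f_shift = 2 * F_CARRIER * 26.8 / C
print(f"Doppler shift at 60 mph: {f_shift / 1000:.1f} kHz")
print(f"Recovered speed: {doppler_speed(f_shift):.1f} m/s")
```

A kilohertz-scale frequency shift is trivial to measure precisely, which is why radar velocity is so much cleaner than vision-derived velocity.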
 
  • Like
  • Disagree
Reactions: Matias and mspisars

mspisars

Active Member
May 23, 2014
2,032
1,367
Charlotte, NC
When Tesla decides that more hardware is needed to fulfill their contractual obligation
Testing new hardware for future vehicle revisions does not automatically mean that it somehow applies to the legacy fleet.
Put another way, a 2021 car with a newer radar does not mean that Tesla cannot deliver FSD as promised in 2016 or today: Autopilot
And we do not know if this is more hardware; it could be an updated radar unit replacing the current one, just like the Model Y introduced a heated radar that older Model 3s did not have.
 

MP3Mike

Well-Known Member
Feb 1, 2016
14,983
31,859
Oregon
Tesla has filed with the FCC to use a millimeter wave radar for FSD:

"The equipment under test (EUT) was an Vehicle Millimeter-wave Radar Sensor operating in 60 GHz band (60-64 GHz)"

Tesla files to use new 'millimeter-wave radar' on its 'full self-driving' electric cars - Electrek

IMO, this is a good sign that Tesla will probably add imaging radar to Tesla cars soon.

I think this radar is an in-cab occupancy-detecting radar, not one to be used for driving.

Tesla applies to add short-range interactive motion-sensing device inside cars - Electrek

I'm not sure why Electrek can't remember what they reported just a few months ago. :rolleyes:
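Either way, the band in the filing itself hints at the sensor's role. A quick back-of-envelope on the 60-64 GHz numbers quoted above:

```python
# What 60-64 GHz implies physically: ~5 mm wavelength (hence
# "millimeter-wave") and, with 4 GHz of bandwidth, centimeter-level
# range resolution (c / 2B) -- fine enough to sense small motions
# like breathing, consistent with an in-cabin occupancy sensor.

C = 299_792_458.0  # speed of light, m/s

wavelength_mm = C / 60e9 * 1000
bandwidth_hz = 64e9 - 60e9
range_resolution_cm = C / (2 * bandwidth_hz) * 100

print(f"Wavelength at 60 GHz: {wavelength_mm:.2f} mm")
print(f"Range resolution with 4 GHz bandwidth: {range_resolution_cm:.2f} cm")
```

Nothing here proves the intended use either way, but the wide unlicensed 60 GHz band is a common choice for short-range interior sensing.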
 

About Us

Formed in 2006, Tesla Motors Club (TMC) was the first independent online Tesla community. Today it remains the largest and most dynamic community of Tesla enthusiasts. Learn more.
