
Autonomous Car Progress

I even gave examples of situations where part 1 would not lead to part 2... instead it might lead to a slightly narrower ODD in which it can continue to operate perfectly safely at L4.
Those were good examples, but the likelihood of Tesla getting from here to L4 is close to zero on existing cars, regardless of weather. Perhaps if the ODD is limited to the close vicinity of a parking space at a 2 mph max speed - but I doubt it. It's not their style to do "boring" things well.
 
Those were good examples, but the likelihood of Tesla getting from here to L4 is close to zero on existing cars, regardless of weather


FWIW I completely agree with this.

I just don't believe "some unknown system degradation in rain" is the reason for that, nor do I believe that if they solved the OTHER issues they wouldn't be able to still operate L4 regardless of if it's raining a bit.


BTW, XKCD had a nice one today that seems relevant to the folks working on the system

[xkcd comic: progress.png]
 
I even gave examples of situations where part 1 would not lead to part 2... instead it might lead to a slightly narrower ODD in which it can continue to operate perfectly safely at L4.

I am well aware of how L4 ODD can be limited in different ways. So, Tesla could still do L4, just where rain is outside the L4 ODD. But if that is the case, then the Tesla would pull over in rain, since it is now outside the ODD and L4 means it will stop or pull over when it is outside its ODD. I maintain that the degradation message suggests that if Tesla did do L4, it would likely need to remove rain from the ODD. So a Tesla robotaxi would pull over in rain, since it would be outside the L4 ODD.

And we saw a Waymo "stall" a while back where 5 Waymo vehicles stopped due to dense fog. They stopped because the dense fog caused enough degradation in perception that the software was no longer "confident" in driving anymore. So that is a real world example of a robotaxi stopping or pulling over due to perception degradation from inclement weather. We know that rain causes degradation in Tesla's camera vision, since we see the message on the screen, and perception degradation would cause a lack of confidence in driving. Would the loss in confidence be enough to cause a Tesla robotaxi to pull over? We don't know, since Tesla does not have any robotaxis. My opinion is that it probably would, based on the fact that Tesla Vision would be degraded by the rain, causing a loss in perception and a loss in confidence.
 
I am well aware of how L4 ODD can be limited in different ways. So, Tesla could still do L4, just where rain is outside the L4 ODD.

Or where it's still inside the ODD, but top speed is reduced.

Because, for example, the reduced visibility means going 45 mph is still safe but going 85 mph is not.

Like I already explained. 3 times now.



I maintain that the degradation message suggests that if Tesla did do L4, it would likely need to remove rain from the ODD.

I realize you maintain that.

You just don't appear to have much basis for doing so.




And we saw a Waymo "stall" a while back where 5 Waymo vehicles stopped due to dense fog. They stopped because the dense fog caused enough degradation in perception that the software was no longer "confident" in driving anymore. So that is a real world example of a robotaxi stopping or pulling over due to perception degradation from inclement weather.

Sure.

But as you point out- they still work in LESS dense fog, or rain.

Even though their perception is degraded there too.

But you insist this is UNPOSSIBLE for Tesla for some reason.
 
Or where it's still inside the ODD, but top speed is reduced.

Because, for example, the reduced visibility means going 45 mph is still safe but going 85 mph is not.

Like I already explained. 3 times now.

I realize you maintain that.

You just don't appear to have much basis for doing so.

Sure.

But as you point out- they still work in LESS dense fog, or rain.

Even though their perception is degraded there too.

But you insist this is UNPOSSIBLE for Tesla for some reason.

No, I get your position. You are saying that L4 does not have to pull over when perception is degraded; it could just slow down, reroute, etc. I know that. And we don't know how much degradation there is, since the message does not say. Maybe the degradation is minimal, and a Tesla robotaxi would just slow down and not need to pull over. So your position is that we don't know for a fact that a Tesla robotaxi would pull over just because there is a message that reliability is degraded; maybe it would just slow down instead. There is no way of knowing who is right, since Tesla does not have robotaxis. Maybe you are right, maybe I am right. I am expressing an opinion, not a fact.
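For what it's worth, both positions fit SAE J3016's framing: inside its ODD an L4 system keeps driving (possibly in a degraded mode, e.g. at reduced speed), and at the ODD boundary it must achieve a minimal risk condition such as pulling over. A toy sketch of that decision logic, with entirely hypothetical thresholds and action names (nothing Tesla has published):

```python
# Toy policy for how an L4 stack might react to perception degradation.
# Thresholds and action names are hypothetical illustrations only.

def degradation_response(degradation: float) -> str:
    """Map a perception-degradation estimate in [0, 1] to a driving action."""
    if degradation < 0.2:
        return "continue_normal"          # well within the nominal ODD
    if degradation < 0.5:
        return "continue_reduced_speed"   # still inside the ODD, lower top speed
    if degradation < 0.8:
        return "reroute_or_pull_over"     # approaching the ODD boundary
    return "minimal_risk_maneuver"        # outside the ODD: stop safely

print(degradation_response(0.3))  # light rain -> continue_reduced_speed
```

The whole disagreement is really about where rain falls on that scale for a camera-only stack, which none of us can know from a dashboard message alone.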
 
lol what happened to billions of miles of data...


Nothing?

These would likely be the guys iteratively testing very specific situations, and testing pre-public-release updates.

Distributing them to more places reduces some of the oft-reported cases where it works great in, say, the Bay Area (because that's where all their paid testers are) but not elsewhere, where it sees no testing until wide release.

Note the specific asks of:
attention to detail, and ability to work in a fast-paced dynamic environment.


So they might direct these folks: "Test this situation, in these specific conditions, 100 times in a row," or "Test this situation, in these 10 different sets of conditions, 10 times in a row each," or even "Test this situation X times, then push this developer-version button to switch software and retest X times, then retest with version C X times," etc.
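That kind of directed test matrix (every scenario/condition pair, repeated N times) can be sketched as a quick script. The scenario and condition names here are invented examples, not anything from the actual job posting:

```python
# Hypothetical sketch of a directed test plan for paid test drivers:
# every (scenario, condition) pair repeated a fixed number of times.
from itertools import product

scenarios = ["unprotected_left_turn"]
conditions = ["dry_day", "rain_day", "dry_night", "rain_night"]
repeats = 10

test_plan = [
    {"scenario": s, "condition": c, "run": r + 1}
    for s, c in product(scenarios, conditions)
    for r in range(repeats)
]

print(len(test_plan))  # 1 scenario x 4 conditions x 10 repeats = 40
```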
 
Nothing?

These would likely be the guys iteratively testing very specific situations, and testing pre-public-release updates.

Distributing them to more places reduces some of the oft-reported cases where it works great in, say, the Bay Area (because that's where all their paid testers are) but not elsewhere, where it sees no testing until wide release.

Note the specific asks of:
attention to detail, and ability to work in a fast-paced dynamic environment.


So they might direct these folks: "Test this situation, in these specific conditions, 100 times in a row," or "Test this situation, in these 10 different sets of conditions, 10 times in a row each," or even "Test this situation X times, then push this developer-version button to switch software and retest X times, then retest with version C X times," etc.
And why are people assuming these are necessarily about testing FSD? It could also be for testing other car features and getting them production ready. There is no way for existing vehicle owners to test unreleased hardware/software so there is always a need to hire test drivers.
 
Stop listening to the promises of salesmen.



So they work, but you want them to work better.



You should probably buy a Waymo or Cruise vehicle. I understand that they combine a number of different sensors.



Perhaps you should contact Tesla's FSD team and present your use and business case. I doubt they read these forums.
Agreed; one of my biggest mistakes was listening to that lying kickstarter FSD huckster.

Yes, they are windshield wipers, but the auto vision wipers are another half-baked beta product that offers no benefit to the user compared to existing, working technology. As usual, just excuses instead of engineering.

I don't care if it uses multiple sensors or one; I just expect it to Fully Self Drive my vehicle without my supervision or me taking 100% responsibility for a product that I did not write or manage. I paid for Full Self Driving, that is all I am after. I rented a Chevy Bolt in DC a couple of weeks ago. A cheap 280-mile-range EV with nice cruise control, if you want to drive the car yourself.

I will call them directly, as a control engineer; hopefully they will have engineers to speak with, instead of people like Elon, who not only claims to be an engineer but claims to be the chief engineer. Even if they don't monitor forums, I am sure their competition does.
 
I just expect it to Fully Self Drive my vehicle without my supervision or me taking 100% responsibility for a product that I did not write or manage. I paid for Full Self Driving, that is all I am after.

FWIW for over 4 years now the purchase page has been quite clear you're NOT buying something that delivers that.

Tesla right on the FSD sales description said:
The currently enabled features require active driver supervision and do not make the vehicle autonomous



I will call them directly, as a control engineer


GL with that.
 
In case anyone is interested, here is a podcast interview with Jesse Levinson, Zoox co-founder and CTO.


Chapters:
00:00:00 Jesse Levinson
00:01:08 sponsors: Index Ventures and Weights and Biases
00:02:04 Zoox mission
00:02:59 starting with city driving instead of the highway
00:08:06 where can the public see Zoox cars
00:09:18 what does the Zoox car look like
00:11:39 designing a car without a driver's seat and safety
00:16:43 why design your own car
00:24:38 braking distance
00:25:33 manufacturing plans
00:30:15 AI in self-driving
00:37:49 how to keep improving the safety
00:40:48 role of humans in robotaxi fleet
00:44:12 is there an emergency button in the vehicle
00:45:31 how did Facebook affect Zoox
00:51:54 Zoox and Amazon delivery
00:53:47 Stanford days, DARPA challenge, early trajectory
01:01:06 raising the first round
01:03:38 ways to relax
 
The part on AI is interesting IMO. Jesse feels that we are still a long way from building a "GPT-4" that can drive a car. GPT-4 can do impressive stuff but can also suddenly screw up, and that would be a big safety problem with a self-driving car. AI needs to be much more reliable. He believes we may get there in the future, but we are not there yet.

So he still sees value for now in keeping the current modular stacks that are tried and true. He does mention that they are incorporating AI advances into their stack. For example, AI helps with early sensor fusion. Also, AI is becoming quite good at planning and control, by suggesting paths. At Zoox, they are using a hybrid planning and control stack: they use both traditional coding and neural nets for planning, and the system picks whichever path is best.

He does think that end-to-end is a long way off as well. Again, the issue is reliability: it needs to work incredibly reliably before you could actually deploy it on public roads without supervision, and it is not there yet. The other issue is that it is hard to make adjustments or troubleshoot an issue. E2E is only as good as the training data, so if you need to fix an issue or adjust the driving behavior, it can be very hard to do. Also, doing end-to-end learning in simulation is incredibly computationally expensive. He says that E2E may happen in the future, he just does not think we are there yet.
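The hybrid planning approach Levinson describes (classical and neural planners each proposing paths, with the system picking the best) can be sketched roughly like this. This is a toy illustration under my own assumptions, not Zoox's actual architecture; the planner outputs and cost weights are invented:

```python
# Toy hybrid planner: collect candidate paths from a classical planner and
# a neural planner, score all of them with one shared cost function, and
# pick the cheapest. All numbers here are invented for illustration.

def classical_planner():
    # e.g. sampled/optimized trajectories: (path_name, comfort, safety_margin)
    return [("lane_keep", 0.9, 0.8), ("slow_follow", 0.7, 0.9)]

def neural_planner():
    # e.g. trajectories suggested by a learned model
    return [("nudge_left", 0.8, 0.9)]

def cost(candidate):
    _, comfort, safety = candidate
    # Safety weighted more heavily than comfort.
    return 2.0 * (1 - safety) + 1.0 * (1 - comfort)

def pick_best():
    candidates = classical_planner() + neural_planner()
    return min(candidates, key=cost)[0]

print(pick_best())  # -> nudge_left
```

The appeal of this design is that the learned planner can only ever add candidates; a shared, auditable cost function still makes the final call, which matches Jesse's point about keeping the reliability of the traditional stack while folding in AI advances.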
 
VW with Mobileye Drive is starting testing in Austin, with plans to grow the test fleet to at least 4 more US cities in coming years. VWGoA plans to launch commercial AVs (powered by Mobileye Drive) in Austin by 2026. For those who might not know, Mobileye Drive is Mobileye's autonomous driving stack designed for driverless applications like robotaxis and driverless delivery.

Herndon, Va. / Austin, Texas — Volkswagen Group of America, Inc. (VWGoA) is starting its first autonomous vehicle test program in Austin beginning in July 2023. The company will kick off its program with a batch of 10 all-electric ID. Buzz vehicles outfitted with an autonomous driving (AD) technology platform developed by the global Volkswagen Group in partnership with technology company Mobileye. Over the next three years, VWGoA plans to grow its test fleet in Austin and also progressively expand testing operations to at least four more American cities. Building upon investments throughout this initial pilot, VWGoA anticipates a commercial launch of autonomous driving vehicles in Austin by 2026.

 
I am well aware of how L4 ODD can be limited in different ways. So, Tesla could still do L4, just where rain is outside the L4 ODD. But if that is the case, then the Tesla would pull over in rain, since it is now outside the ODD and L4 means it will stop or pull over when it is outside its ODD. I maintain that the degradation message suggests that if Tesla did do L4, it would likely need to remove rain from the ODD. So a Tesla robotaxi would pull over in rain, since it would be outside the L4 ODD.

And we saw a Waymo "stall" a while back where 5 Waymo vehicles stopped due to dense fog. They stopped because the dense fog caused enough degradation in perception that the software was no longer "confident" in driving anymore. So that is a real world example of a robotaxi stopping or pulling over due to perception degradation from inclement weather. We know that rain causes degradation in Tesla's camera vision, since we see the message on the screen, and perception degradation would cause a lack of confidence in driving. Would the loss in confidence be enough to cause a Tesla robotaxi to pull over? We don't know, since Tesla does not have any robotaxis. My opinion is that it probably would, based on the fact that Tesla Vision would be degraded by the rain, causing a loss in perception and a loss in confidence.
lol. super safe. pull over on the side in rain with other vehicles flying by at 80+ mph?
 
lol what happened to billions of miles of data...
It could be to focus on challenging scenarios, or a sign that most FSD drivers aren't willing to let FSD run willy-nilly in Mad Max mode. Good for them to finally admit their current employee/shill testing approach sucks.
 