Welcome to Tesla Motors Club

FSD Beta Videos (and questions for FSD Beta drivers)

The kerfuffle at 7:10 or so is, sorry, absolutely not the fault of the car. Those cones are not set correctly; they don't block access from the driving lanes to the holes in the road (in fact they look like lane markings, which is why the car tries to drive to their right!). And the construction vehicle is operating IN THE ROAD, with no flaggers or safety personnel visible. In fact, the only construction worker I can see is the operator of the device!

I mean, it's true that at the end of the day we want automation to recognize holes in the road and swinging backhoes as "ZOMG" level obstacles and give them much wider berth than FSD did. But I'll bet anything that some car was wrecked in that street that day.
 

Video from Zeb in NC. He has a three-minute summary clip at the start of his 20+ minute video that highlights some of the stuff he thought was interesting from his drive. There was some fairly heavy rain in the latter clips, and he said the car performed pretty much the same in the rain as it did when it was clear. Have not yet had time to watch the raw footage.
 
The kerfuffle at 7:10 or so is, sorry, absolutely not the fault of the car.
And yet there is not a pile of cars in that hole, while 100% of FSD Teslas tried to drive into it.
But I'll bet anything that some car was wrecked in that street that day.
Seriously, you think 1 in 1,000 drivers will drive around those cones to the right and into that hole, and that that's anywhere near the rate of Teslas that will? The Tesla thought it could actually fit on that side of the cones, even though it would have hit the curb even if the cones were not there?

This is literally the hard part of autonomous driving: the real world is set up to work with real humans with actual common sense, and does not always align with perfect rules. This is like when Teslas liked to drive directly into barriers on the highway in Seattle and people said "but that's a confusing roadway", when 100,000 drivers a day managed it with less than one accident a year and 0% of Teslas did.
 
That construction zone isn't even an edge case. I've probably seen backhoes digging holes in the road a hundred times in my lifetime. The cone placement seems fine for a low speed residential street.
they don't block access from the driving lanes to the holes in the road
Because the holes are not in a driving lane, they're in the parking strip.
 
It doesn’t have to be able to deal with crazy construction zones haphazardly set up in order for it to be useful....
So what is the usefulness of a "Full Self Driving" package that will drive into cones and holes in the road, or try and drive around cars that are stopped waiting to make a left turn, or drive right into concrete pillars, or cut in front of oncoming traffic at left turns?

As a driver, how would this reduce your workload on city streets, given that the car requires such intense monitoring?

It's a neat trick, and it does show Tesla is working seriously on actual FSD, but it also shows how far away they are, and how the car doesn't even have a simple understanding of "don't drive into solid objects" yet, which seems like a baseline for a useful ADAS system.
 
It doesn’t have to be able to deal with crazy construction zones haphazardly set up in order for it to be useful....

If v9 was able to clear this, I think that would be truly amazing and mean it was way ahead of Waymo....
It's true that a Waymo operating driverless would definitely phone home if it saw that situation. In a sense Waymos are Level 3 vehicles (except that the remote operator is not directly controlling the car).
There are definitely two camps with regards to the usefulness of city FSD. I think it will only be useful once it is autonomous but many people think it would be useful as a driver assist feature. If Tesla could make FSD a Level 3 system so it drove autonomously and only required driver attention when it encountered situations like that construction zone that would be awesome.
 
So what is the usefulness of a "Full Self Driving" package that will drive into cones and holes in the road, or try and drive around cars that are stopped waiting to make a left turn, or drive right into concrete pillars, or cut in front of oncoming traffic at left turns?

Without addressing any specific situations… FSD will be very useful to me before it can handle 100% of my trips with no interventions. I’m totally ok with having to take over every time certain situations arise in the next release I receive. Of course I would prefer if “make a left turn” isn’t one of those, but I’m ok if it is, with the understanding that it will continue to improve.

I think we all see some low-hanging fruit for improvement in the current beta, and I’m sure there are things they want to fix or restrict (require confirmation or intervention for) before a wide release. I hope and expect to see some regular improvements rolling out. I’m eager to get it, well before it approaches robotaxi quality.
 
It's true that a Waymo operating driverless would definitely phone home if it saw that situation.
Which, FWIW, is exactly what I meant. An irresponsible and dangerous (and almost certainly illegal, though I don't have the MI statutes at my fingertips) construction zone is the edgiest of edge cases. And I'll repeat: I'll bet anything that at least one car was damaged there that day. Again the backhoe is driving around in traffic lanes! And you can drive straight past the cones into the holes!

You don't engineer for that. You just don't. Tesla or Waymo could spend decades trying to crack that nut and never solve it, because it's just not a safe situation for reasons that have nothing to do with the driver or automation. The solution here can't be in logic space, it just needs to see that there are holes in the road (something that is a known shortcoming of FSD) and either stop or evade.
 
That's not a good faith interpretation of what I wrote and you know it.
An irresponsible and dangerous (and almost certainly illegal, though I don't have the MI statutes at my fingertips) construction zone is the edgiest of edge cases. And I'll repeat: I'll bet anything that at least one car was damaged there that day.

So, in the couple hours of Dirty Tesla V9 videos, he happens to find something that is the edgiest of edge cases, something so bad that you are 100% sure that 1 in 1,000 humans was unable to avoid it?

You don't engineer for that. You just don't.
Yes you do. This happens in the real world, and humans do a pretty good job of avoiding it. Your system must avoid it as well as humans, or it is not a system which is better than humans.

All the car had to do here was not drive into objects it could see (cones). You can see the route preview directly route into them!
 
Yes you do. This happens in the real world, and humans do a pretty good job of avoiding it. Your system must avoid it as well as humans, or it is not a system which is better than humans.
I agree you have to for Level 5 to work, but not for Level 3, and not for whatever Tesla is currently trying to achieve with Beta 9.

You're trashing them for no reason.
 
Which, FWIW, is exactly what I meant. An irresponsible and dangerous (and almost certainly illegal, though I don't have the MI statutes at my fingertips) construction zone is the edgiest of edge cases. And I'll repeat: I'll bet anything that at least one car was damaged there that day. Again the backhoe is driving around in traffic lanes! And you can drive straight past the cones into the holes!

You don't engineer for that. You just don't. Tesla or Waymo could spend decades trying to crack that nut and never solve it, because it's just not a safe situation for reasons that have nothing to do with the driver or automation. The solution here can't be in logic space, it just needs to see that there are holes in the road (something that is a known shortcoming of FSD) and either stop or evade.
Yeah, self-driving is super hard. That looks like a perfectly normal residential construction zone to me. There are a lot of places you could drive that you shouldn't, I'm sure I could navigate between cones at most construction sites to get to somewhere I'm not supposed to be...
Maybe we have a different definition of edge case. To me it means things I probably won't see once in my lifetime (less than once per million miles?).
 
Maybe we have a different definition of edge case. To me it means things I probably won't see once in my lifetime (less than once per million miles?).

Yea, I would consider edge cases to be a lot more common than that. E.g., you likely won't ever see that same exact layout of cones and construction more than once in several billion miles. That makes it an edge case, which is distinct from roads that have clearly marked lines, edges, etc...
Eh I’m not sure if I’m explaining myself right.
 
Yea, I would consider edge cases to be a lot more common than that. E.g., you likely won't ever see that same exact layout of cones and construction more than once in several billion miles. That makes it an edge case, which is distinct from roads that have clearly marked lines, edges, etc...
Eh I’m not sure if I’m explaining myself right.
By that definition everything is an edge case!
As Elon said, nothing has more degrees of freedom than reality.
 
Maybe we have a different definition of edge case. To me it means things I probably won't see once in my lifetime (less than once per million miles?).
I think that's part of it. Would you consider other driver behavior an edge case? Would you demand that autonomy be able to avoid another car running a red light into the intersection, or merging directly into the side of the autonomous vehicle? I mean, we'd both agree that we'd want the vehicle to be able to do that. But I think we'd also both agree that this is just out of scope: there are some things in the universe that just can't be avoided.

And I view hostile environments like that construction zone as on that list. They need about 3x as many cones, and flaggers to block traffic for the tracked vehicle in the road. Frankly if FSD was allowed to drive unbidden into that hole, I bet the state would have been on the hook and that guy in the backhoe would have been fired.
 
Which, FWIW, is exactly what I meant. An irresponsible and dangerous (and almost certainly illegal, though I don't have the MI statutes at my fingertips) construction zone is the edgiest of edge cases. And I'll repeat: I'll bet anything that at least one car was damaged there that day. Again the backhoe is driving around in traffic lanes! And you can drive straight past the cones into the holes!

You don't engineer for that. You just don't. Tesla or Waymo could spend decades trying to crack that nut and never solve it, because it's just not a safe situation for reasons that have nothing to do with the driver or automation. The solution here can't be in logic space, it just needs to see that there are holes in the road (something that is a known shortcoming of FSD) and either stop or evade.
Random Google search results about detecting potholes:
  • Pothole detection system using 2D LiDAR and camera

  • 3D LiDAR based Drivable Road Region Detection for Autonomous Vehicles
Some people are studying this very thing, so there's no reason to say that FSD can't figure out what a hole is. Or a fence, pole, white truck, underground parkade wall, or all the other things we see FSD Beta head towards. Or maybe it does see them but the path planning sucks. Either way the end result is the same: FSD seems to head directly at objects. Do Waymo, Mobileye, Zoox, etc. also head for objects? From the videos I've seen, they actually don't.
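For anyone curious what those papers are getting at: "negative obstacle" (hole/trench) detection from LiDAR usually boils down to binning points into a ground-height grid and flagging cells that sit well below the local road plane. Here's a minimal sketch of that idea; the function name, grid representation, and 15 cm threshold are my own illustrative assumptions, not anything from the linked papers or from Tesla/Waymo stacks.

```python
# Hypothetical sketch of negative-obstacle (hole) detection from a LiDAR
# height grid. Assumes LiDAR returns have already been binned into a 2D
# grid of ground heights (meters), with the road plane near 0.
import numpy as np

def find_negative_obstacles(height_grid, drop_threshold=0.15):
    """Return a boolean mask of cells that sit well below the road surface.

    height_grid    -- 2D array of ground heights in meters (road ~ 0)
    drop_threshold -- how far below the road plane a cell must be to count
    """
    # Estimate the local road level with the median, which stays robust
    # even if a few cells are holes, curbs, or debris.
    road_level = np.median(height_grid)
    return (road_level - height_grid) > drop_threshold

# Usage: a 5x5 patch of flat road with a 30 cm deep hole in the middle.
patch = np.zeros((5, 5))
patch[2, 2] = -0.30
mask = find_negative_obstacles(patch)
print(mask[2, 2])   # the hole cell is flagged
print(int(mask.sum()))  # exactly one cell flagged
```

Real systems are far more involved (plane fitting per region, occlusion reasoning, temporal fusion), but the core signal, a patch of ground measurably below its surroundings, is this simple, which is why "FSD can't see holes" reads as a perception/planning gap rather than an impossibility.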
 