FSD Beta Videos (and questions for FSD Beta drivers)

If perception still is not there at a level even approximating human vision, I am not sure what can really be superhuman at this point in time.

I think the bird's-eye view predictions are superhuman, at least in terms of speed. I have no idea how Tesla has achieved them.

... Longer post got cut short by TMC connection issues.

Right now, Tesla is doing a poor job on the visualizations. It's difficult to figure out whether V9 got lucky with respect to certain decisions or whether it was able to predict object trajectories. I think Tesla is focused on improving the fundamentals right now, so they're less focused on UX-type features.
 

A number of downgrades from V9. Discouraging.

Edit - he tags 4:00 as a good left turn, but the car did poorly here. Nothing overly dangerous, but he notes that he had to hit the go pedal to get the car to make the turn, which means the car didn't make a good left turn at all.

Whatever logic Tesla is using to determine when to try to go around a stopped car is completely useless, because it's stupidly dangerous right now. Whoever is hand-coding it needs to rewrite it.

Horrible drive by the car.
 
I think the videos of V9.1 show nice improvements and forward progress, BUT I also think we still have a looonnnnggggggg way to go and a lot of little steps forward until wide release of full self driving. If I had to bet I'd say middle to late 2022 for FSD to go live with full capabilities while being safe.

We're closer than ever, but there's still a long way to go yet.
I think we could get a wide release this year, but a highly gimped version that is super cautious and requires user confirmation or takeover for any complex situation. Unprotected turns/straights and going around stopped cars would be at the top of that list. Whatever logic Tesla uses to determine whether a car is stopped in the road and needs to be driven around is completely useless, since it's so dangerous right now: it tries to initiate passes around cars that are stopped for traffic or traffic signals/signs, either into the sidewalk or into opposing lanes, sometimes with oncoming traffic.
 
at least in terms of speed.
It seems subhuman if it frequently loses track of vehicles, as shown in the capture above. Humans are able to judge vehicle position and approximate speed. Humans do screw this up occasionally to ill effect, certainly, but FSD will need to be substantially better than that, of course.

Anyway, to me it currently appears to be subhuman, with much greater consistency (even on “simple” perception - not even discussing policy here) needed to be superhuman.

As discussed already, any system implemented with something close to the current perception, operating in the background (L0 I guess? Not sure how active safety features are classified) has the potential to provide superhuman capabilities on an accelerated schedule, but that is a separate topic and not the apparent objective of Tesla or (apparently) what is desired by the market right now.
 
4:15 - missed a right turn for no apparent reason. The good news is it didn't try to make a dangerous maneuver to fix it and instead re-routed smoothly.

9:20 - the car goes for an unprotected left with a car coming, and the driver is forced to disengage to avoid a possible collision.

12:35 - still has trouble with the double stop signs, so the car is stopping twice: once correctly at the stop line, and again in the middle of the road, which is incorrect and dangerous.
 
#FSDBeta 9.1 - 2021.4.18.13 - 2020 Model Y (Radar Installed) - Testing if there is Radar Usage

10:40 - damn, you're a braver man than me. I would have intervened immediately when it was moving to change lanes around the stopped car. That looked damn close to scraping the car in front of it. I thought it might just be the camera angle of the vid and there was more space than I thought, but you said it was damn close too. Whew.
 
If perception still is not there at a level even approximating human vision, I am not sure what can really be superhuman at this point in time.

The current system works great at times to clean up certain human errors - a good use case for incomplete perception. That, I guess, is technically superhuman, but not what people are thinking of in this context.

With a fully alert human performing the DDT, it is true that it is a superhuman system, although imperfect, with false alarms. That has been the case for several years though.
The Hype is superhuman, for sure.
 
Well it sure doesn't act like it sees it either! Also, even after the truck is displayed it flickers out of existence briefly. Why would it display the truck and then not display the truck?
[Attachments 690557 and 690556]

I can see it quite clearly in a YouTube video. Isn't it close enough to the center that it would be seen by the long-range camera? I'm sure it would be easy to see on the built-in dash cam.
I don't really know anything about how they label all this video to train the NN. My understanding is that it's all done by hand so maybe they just haven't labeled cars more than 200ft away? I have no idea.
The flickering in and out could be many things: maybe the NN not detecting the object as it changes angles, or the object moving out of view of one camera and into view of another. Computer vision tends to be very flaky like that.

For what it's worth, human vision is probably "flaky" in the same way, but we have additional layers of processing and logic that smooth out such things. With computer vision, you're really only seeing the raw output of object recognition.

What needs to happen is building some sort of object permanence into the system, where it can place objects correctly into the scene as the raw perception fades the object in or out.
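That kind of object permanence can be sketched as a small detection-persistence tracker: when the raw detector drops an object for a frame or two, the track "coasts" at its last known position instead of flickering out, and is only deleted after several consecutive misses. Everything here (class names, thresholds, the greedy nearest-neighbour match) is an illustrative assumption, not anything known about Tesla's actual implementation.

```python
# Illustrative sketch only: simple track persistence to smooth out a
# flickering detector. All names and thresholds are invented.
from dataclasses import dataclass


@dataclass
class Track:
    x: float          # last known position (ego frame)
    y: float
    missed: int = 0   # consecutive frames with no matching detection


class PermanenceTracker:
    def __init__(self, max_missed=5, match_radius=3.0):
        self.max_missed = max_missed      # frames to coast before dropping
        self.match_radius = match_radius  # match distance for the same object
        self.tracks = []

    def update(self, detections):
        """detections: list of (x, y) positions from the raw detector."""
        unmatched = list(detections)
        for t in self.tracks:
            # greedily match each track to the nearest remaining detection
            best = min(unmatched,
                       key=lambda d: (d[0] - t.x) ** 2 + (d[1] - t.y) ** 2,
                       default=None)
            if best is not None and \
               (best[0] - t.x) ** 2 + (best[1] - t.y) ** 2 <= self.match_radius ** 2:
                t.x, t.y = best
                t.missed = 0
                unmatched.remove(best)
            else:
                t.missed += 1  # coast: keep the object in the scene
        # drop a track only after several consecutive misses
        self.tracks = [t for t in self.tracks if t.missed <= self.max_missed]
        # unmatched detections start new tracks
        self.tracks.extend(Track(x, y) for x, y in unmatched)
        return [(t.x, t.y) for t in self.tracks]
```

With `max_missed=2`, a truck the detector loses for one or two frames would still be rendered at its last position rather than vanishing and reappearing, which is exactly the flicker described above.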

I *think* (hope) this is something covered by Elon's latest post about the hyperbolic "superhuman" prediction. Some nice human-level, or even subhuman, prediction might have kept the car from losing track of this truck.
 
Actually, that's what I'm afraid of: Tesla saying "ok" when it's clearly not ready yet.

The hardware does not exist (that we know of), since it's not functional enough to do Level 3, let alone 4 or 5.

It would be irresponsible for them to release what they have now and let the unwashed masses have at it.

It's the fastest way to get more restrictive laws made, too. Show enough bravado, release dangerous stuff early, and we'll find out. Tesla might actually end up ruining things for EVERYONE if they push too hard, too fast.

The fact that Tesla will take your $10k does not mean that I want to be on the same roads as a bunch of yahoos running around with alpha-quality software controlling their several-thousand-pound killing machines.
Well, you are already on the road with yahoos who have had a six-pack after work, are texting to break up with their boyfriend, or are just tired - really tired. In a few months I think I would rather have that Tesla FSD than many humans. Of course I am not saying it is good enough yet, but I would say that progress seems to be accelerating. Tesla seems well aware of the many issues, and the progress you can see from release to release is heartening.
 
Interesting behavior here. Tesla must have some NNs that identify emergency vehicles. I wonder why it didn't ask for the driver to take over, especially with a big cross over double yellows:

We’ve seen in other videos that it will try to pass other vehicles that it thinks are stopped in the road, when they are actually just obeying traffic signals and shouldn’t be passed at all, so there’s nothing to really glean from this vid, imo. The logic that Tesla is using to determine when to go around a vehicle is pretty broken right now and has produced the most dangerous fails requiring intervention behind unprotected turns.
 
We’ve seen in other videos that it will try to pass other vehicles that it thinks are stopped in the road, when they are actually just obeying traffic signals and shouldn’t be passed at all, so there’s nothing to really glean from this vid, imo. The logic that Tesla is using to determine when to go around a vehicle is pretty broken right now and has produced the most dangerous fails requiring intervention behind unprotected turns.

It doesn't always do it, especially on two-lane roads with double yellows. It's kinda intriguing how and when it decides to pass. I don't think it's hand-coded. There's likely a "should pass" NN.
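A decision like that could, in principle, be framed as a classifier scoring scene features and comparing against a threshold. The sketch below is purely speculative: every feature, weight, and threshold is invented for illustration, and nothing here is known about Tesla's actual implementation. It mainly illustrates the failure mode above: if the "queued at a signal" signal is missed, a car stopped for a red light scores like a parked car.

```python
# Purely illustrative "should pass" scoring; all features and weights
# are invented, not Tesla's.
import math


def should_pass_score(seconds_stopped: float,
                      hazard_lights_on: bool,
                      near_traffic_signal: bool,
                      oncoming_gap_s: float) -> float:
    """Return a 0..1 score; higher means 'more likely safe to pass'."""
    z = (
        0.4 * min(seconds_stopped, 30.0) / 30.0   # long-stopped cars look parked
        + 1.5 * hazard_lights_on                  # hazards suggest a disabled car
        - 2.0 * near_traffic_signal               # queued at a light: do NOT pass
        + 0.8 * min(oncoming_gap_s, 10.0) / 10.0  # need a gap in oncoming traffic
        - 1.0                                     # bias toward staying in lane
    )
    return 1.0 / (1.0 + math.exp(-z))             # squash to a probability
```

In this toy framing, a car stopped for a minute with hazards on and a clear oncoming gap scores well above 0.5, while a car briefly stopped near a traffic signal scores well below it; a perception miss on `near_traffic_signal` flips the second case into the dangerous pass attempts described in the thread.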
 

8:23 - interesting moment. Car has arrived at Costco after a fairly uneventful drive. It is turning in the parking lot but sees a line of cars filling the lane it was going to use and then decides to abort the turn mid-turn and go straight instead and then take the next turn in the parking lot to get to the destination marker. Actually pretty impressive but I’m not sure how safe it would be if there had been a car behind him that was going to go straight.
 
Aside: I use the gear icon and change the speed to x1.75 which still makes them very watchable and the commentary is perfectly understandable.
---
Navigating around Manteca, CA in FSD Beta 9.1
Aug 2, 2021 -- AI DRIVR -- 31.9K subscribers

The roof-mounted camera for the 180-degree top ribbon is aligned with the B-pillar cameras.
I absolutely love those views because they let you see what AP has available to it to make decisions.

 