Welcome to Tesla Motors Club

FSD Beta Videos (and questions for FSD Beta drivers)

I think we have reason to dissect Elon's tweets nowadays. "Looks promising" and "about 2 weeks later" are not solid commitments.

The wording of "public opt in request button" could also raise an eyebrow: does that suggest instant access to the software, or is this a button to request access with that request being subsequently approved or rejected?
 

(https://media1.giphy.com/media/uUIFcDYRbvJTtxaFNa/giphy.gif)
 
The inside-camera thing has already been live on mass-release vehicles (though I think only on vision-only ones, per Green's posts).
Teslafi shows about 125 vehicles getting 2021.32.5 over the last few days, so it appears Tesla is starting a slow roll-out to vehicles with radar. It finally feels like all the parts and pieces are coming together. The optimist would say better late than never, but having purchased FSD twice without anything to show for it, I'll wait before getting too excited.
 

Indeed. I think I'll keep FSD on my MS LR order due around EOQ. I might be an idiot.
 
The more I watch these videos, the more ridiculously difficult the problem seems.

In some cases where that appears to be true, there are just a (very?) few key elements being ignored or handled incorrectly that make the puzzle seem unsolvable.

…discredit the progress that has been made in FSD Beta.

Progress can be regarded as simply working on a problem, if the assumption is that with the passage of time a solution is inevitable. As a passive, external observer I find it hard to determine what is overall progress versus long- or short-term regression.

Is banging your head on the wall still progress if you are inadvertently missing a key component in your model?

…the hybrid of man and machine 'thinking' on the same road, that's the worst and the hardest to program for.

I agree, and the assumption that a solution to FSD must include coexistence with human-driven cars may be flawed until every human capability is recreated in AI.

…not to build a car that can reason its way out of any situation, but to provide sufficient rules so that the car knows when to shout for help, yet not so often that it's more bother for the human than driving manually.

Yes. A reasoned approach (or belief in a rationalised, computed set of actions) is very much based on human capability and competency. Following a finite ruleset feels quite different from being able to reliably infer every required element and facet of a detected (changing, moving) image. And that still ignores the huge question of perspectives, including the difference between statistically knowing there will be road deaths and predetermined actions leading to death, based on previous data and algorithms.

Surely there must be a finite set of rules to identify relevant objects?

I was in London a few days ago and ran across 6 lanes of busy traffic without waiting for the crossing lights. It struck me when I safely reached the far side just what a staggering thing had just happened, ranging from a fairly rational, risk averse guy apparently risking life and limb way beyond anything he would normally do and for no obvious reason, to the complex multi path projection, object detection, tracking and analysis that I did without being consciously aware of it.

I take the view that at the moment Tesla FSD Beta is probably barking up many wrong trees but that this is OK and necessary.

The ability of humans to move their head to force a change of perspective and process the way the perceived view changes could be a key part of how we process our vision. The brute force approach of processing every vision element with equal significance would seem to be at risk of throwing up many false danger detections as well as taking processing power away from essential tasks. I certainly wouldn't have responded to my phone ringing while I was running across that busy road in London.

Every human driver is unique. They perceive, learn and control in a potentially unique way to achieve a standardized end result. But they habitually break rules, make individual judgements and apply different strategies. FSD is trying to apply one system to all situations. The newbie misconception that their car is learning, turn by turn, to engage better with its environment is obviously incorrect, yet that is how every human goes about their daily tasks.
 
On what do you base that view?
Probably based on the same as many views expressed here. Reading, watching, driving, comparing and in my case a little messing with very simple autonomous robot design.

It would be great to find a complete and perfect oracle, and hope they got a job at Tesla.

It is quite common (based on first-hand life experience) that when you approach a challenge (something new to you, but especially something new to humanity) you don't get it right the first time. There are also always constraints, and those constraints can change (e.g. long-/short-term component supply and end-of-life), just as new options open up. Engineering teams change, as do their competencies and approaches.

FSD has multiple kinds of moving targets to deal with, not all on the highway.
 
Here's a video of FSD Beta 8(? old visualizations) in Kiev, Ukraine:

Are there any others from outside the US? Although maybe it's related to a software version accidentally allowing Factory Gateway mode to then enable FSD Beta?
kiev factory gateway.jpg
 
After hearing about 5 Tesla accidents at the same location in Yosemite in just one month, we now get a video saying "with ease", but I don't think the driver visited that same crash site.

The title seems too enthusiastic, because the fork to the right scared me: the Autosteer seemed to go crazy and turned all the way to the left, as if it wanted to swerve into the concrete divider. But the driver said, "It handled that like a champ".

I would like to know which of these testers own Tesla stock.
 
It's been a while since I've seen FSD Beta incorrectly predict an intersection with a clear view. I believe this was caused by the adjacent truck (illegally?) switching from the right lane to the left lane through a shifted intersection. @DirtyT3sla, did you go through this intersection (University Dr & Pine St) multiple times without someone cutting into your lane? Did it perform better then, given that the path prediction was good until the truck switched from the right to the left lane?

Truck in right lane (good path prediction):
truck right.jpg


Truck switched to left (bad path prediction):
truck left.jpg


11:33 from
 
Wow. Nice catch.
I would say the car almost crashed; the driver was not prepared (complacent, with wrong hand placement) and made a 270-degree wheel turn to save it, with awkward positioning. And this was at jogging speed. "Like a champ". So much for TOSV personal integrity. Like Comical Ali.
We are not far off the first FSD beta complacency accidents.