If you watch wholemars' videos closely, his car isn't magical, nor does he purposely pick demo routes to show its good side. The guy just never intervenes unless he finds the car stuck or about to hit something. It makes mistakes compared to human driving all the time and does miss turns. He just lets it reroute, and the car usually recovers. So people claiming that the car requires fewer and fewer safety-related disengagements are just saying it no longer wants to smash into things, the way unprotected turns on early v10 were really a crapshoot.

You are looking for perfect lane selection and using that as your measurement for improvements. That's just one thing out of a bunch of things....
The measure of improvement spans everything from very minor smoothness/comfort gains through each accident avoided or minimized, just as with a human learning to drive.

That said, perhaps in my desire not to fill the Internet with negativity re: FSDb, I am actually minimizing what happens too much. Speaking strictly of the safety of lane changes, in last night's drive there were at least 5 times that I can clearly remember where the lane change (or attempted lane change) came at a spot where it would be illegal. Given that it was night here and traffic was very light / nonexistent on some roads, I gave the car more leeway than I would have otherwise. In one instance it dove *across* a separator / bike lane that sits between the turn lane and the forward lane; in another, almost very unfortunate instance, it attempted to change lanes into an *occupied* lane before I took control (the adjacent turn lane fans out to be extra wide to ease traffic onto the next road, so the last 10' or so of the adjacent lane looks far wider than a normal lane, and perhaps it thought it had room despite showing the other cars properly rendered on the screen). Based on prior experiences at these same intersections, if I do not intercede it does slam on the brakes in time to avoid actual collisions, but I have learned to intercede before that is necessary. 4 of the 5 were reported to Tesla with voice notes...the first one I let finish the maneuver to see what it would do, then looped back so I could repeat it, disengaged at the reasonably-should-disengage point, and provided the voice note.
 
The issue is - "safety" is not a done once and forgotten "feature".

I'll give an example from morning drive to drop off kids. Flashing yellow arrow - huge gaps. FSD refuses to move people honking from the back. Now - is that "safe" - you probably think it is ... it is "safe" unless someone already under great deal of stress and getting late for morning work snaps and takes out a gun.

Lets say Tesla goes down the priority and addresses the issue. They become more aggressive (or rather, less passive) and try to take the turn quickly like humans do. Guess what, that has not increased the risk of collision and FSD is now "less safe". I can give quite a few such examples ...

That is what makes this whole thing slow and painful. 11.x has several regressions for me - even though it is better in many respects. Each feature brings new safety concerns, regressions etc. I mean Tesla is so far behind the feature list - they haven't even bothered to stop for school buses !
 
@Singuy - question for you:

Driving past an elementary school (kids roughly ages 5-11) this morning, I see the 15mph speed limit signs placed in the center of the roadway, as they always are from shortly before school starts to shortly after school ends on school days such as today. These are full-fledged speed limit signs (not the yellow 'recommended' speed signs often seen on curves), and they override the normal 25mph limit on this road for the morning. I give the car every opportunity to see the signs, re-confirm via a second drive that it does not recognize them from either direction, etc., and on my final drive through that area I hit the brake as it enters the 15mph zone and leave Tesla a voice note on the scenario. On that final pass there was no traffic (no cars, no pedestrians), so if I had let it speed through at 25mph it wouldn't have crashed. However, allowing it to go 10mph over the limit across the mid-street (not at a corner) crosswalk young kids use to get to school seems like a horrible idea / huge safety issue. But since it wouldn't have "smashed into things", would you say this was not a "safety related disengagement"?
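For a bit of perspective on why that 10mph matters, here's a rough back-of-the-envelope stopping-distance comparison. The reaction time and braking numbers are just ballpark assumptions I picked for illustration, not anything measured from the car:

```python
# Rough stopping-distance comparison: 15 mph school zone vs. the normal 25 mph.
# The 1.5 s reaction time and 0.7 g deceleration are ballpark assumptions,
# not measured FSDb values.

MPH_TO_MS = 0.44704           # miles per hour -> meters per second
M_TO_FT = 3.28084             # meters -> feet
REACTION_TIME_S = 1.5         # assumed perception + reaction time
DECEL_MS2 = 0.7 * 9.81        # assumed braking deceleration (~0.7 g)

def stopping_distance_ft(speed_mph: float) -> float:
    """Distance covered during reaction plus hard braking, in feet."""
    v = speed_mph * MPH_TO_MS
    reaction = v * REACTION_TIME_S
    braking = v * v / (2 * DECEL_MS2)
    return (reaction + braking) * M_TO_FT

for mph in (15, 25):
    print(f"{mph} mph: ~{stopping_distance_ft(mph):.0f} ft to come to a stop")

# 15 mph: ~44 ft to come to a stop
# 25 mph: ~85 ft to come to a stop (nearly double, right at a mid-block crosswalk)
```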
 
FSD Beta does illegal things all the time. Right now Tesla is just nearing the end of building its foundation before it starts crossing t's and dotting i's. The primary directive right now is to not crash.

If this were a building, we'd be just finishing the foundation while you want the cabinets installed. In fact, until vision goes to a full NN, Tesla is still setting the foundation.
 
The issue is - "safety" is not a done once and forgotten "feature".

I'll give an example from morning drive to drop off kids. Flashing yellow arrow - huge gaps. FSD refuses to move people honking from the back. Now - is that "safe" - you probably think it is ... it is "safe" unless someone already under great deal of stress and getting late for morning work snaps and takes out a gun.

Lets say Tesla goes down the priority and addresses the issue. They become more aggressive (or rather, less passive) and try to take the turn quickly like humans do. Guess what, that has not increased the risk of collision and FSD is now "less safe". I can give quite a few such examples ...

That is what makes this whole thing slow and painful. 11.x has several regressions for me - even though it is better in many respects. Each feature brings new safety concerns, regressions etc. I mean Tesla is so far behind the feature list - they haven't even bothered to stop for school buses !
What can I say, localization using vision only is a hard problem. There's a reason every other company out there just skipped this step and went straight to the nitty-gritty.
 
I appreciate the response, but I didn't see an answer...do you, @Singuy, in your own personal opinion, consider the disengagement report I sent in to Tesla this morning to be a "safety related disengagement" (your phrase)? Or not, since it wouldn't have "smashed into things"?
 
Based on the primary directive, it is not safety related if the car can stop.

Going the speed limit doesn't mean the car can stop prior to hitting someone. Going above the speed limit doesn't mean the car will hit someone.

People who follow traffic laws can still get into accidents. Then you have my friend who has never gotten into an accident, but man, he breaks five laws on every drive: going 25 mph above the speed limit, making left and right turns while not in the turning lane, and actively cutting across traffic. The guy should be banned for reckless driving, but as of today he has yet to hit anything or cause harm to himself after 25 years of driving.
 

I vote no. There was no immediate harm to yourself or anyone else. If a kid had been present, different story.
 

Well, based on these perspectives, I can happily say that I have very, very few "safety related disengagements" if this is the bar, as I know from repeated experience that it won't actually "smash into things". Even last night's incident, where the car attempted to merge into the next lane even though it clearly saw (and showed on the display) that there were already cars in that adjacent lane, wouldn't count, since in the repeated prior times when I didn't disengage, the car hit the brakes hard and stopped ~3 feet from the other cars (stopped halfway across each lane, since it could not complete the lane change maneuver).

IMHO, however, safety encompasses many things that are precautionary. Wearing safety goggles in a chemistry lab, wearing gloves in a biomedical lab, observing a lower speed limit at a mid-street crosswalk in front of a kids' school while school is in session...safety isn't just about waiting until the really bad thing is about to happen and then doing something heroic to stop it. But now that I understand your perspective, I can see why you perceive such low and rapidly decreasing "safety related disengagements", so thanks for clarifying your position.
 
Yes, it's a hard problem.

But the basic disagreement on this board is where exactly Tesla is in "solving" the problem. My position is that they are only now seriously getting started - 11.x is what I thought we'd get when we first got FSD! There are so many basic features still missing (roundabouts, school zones, stopping for school buses, handling emergency vehicles, flaggers ...). Then there is a looong tail of smaller issues - and issues yet to be discovered!

One big overall technical issue is that the whole thing seems to be *slow*. That's why FSD stops before a small roundabout trying to figure out the path around (it takes a couple of seconds), brakes hard and late in a number of situations, is slow to move from a stop, etc. When you still need to add a lot of features but the current NN is already eating up most of the compute time ... optimization on HW3 may yet be the biggest challenge.
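To put the "slow" point in concrete terms, here's a toy latency-budget sketch. Every number in it (camera rate, per-task milliseconds) is invented purely for illustration - none of them are real HW3 or FSD figures:

```python
# Toy per-frame compute budget. All numbers are invented for illustration;
# they are NOT real HW3 or FSD figures.

CAMERA_FPS = 36                       # hypothetical camera frame rate
FRAME_BUDGET_MS = 1000 / CAMERA_FPS   # ~28 ms of compute available per frame

# Hypothetical costs for tasks the stack already runs on every frame...
existing_tasks_ms = {
    "object detection": 10.0,
    "lane / road-edge geometry": 6.0,
    "occupancy / drivable space": 7.0,
    "planning": 4.0,
}

# ...and for features people are still waiting on.
new_features_ms = {
    "school-zone sign handling": 1.5,
    "school bus logic": 1.5,
    "emergency vehicle handling": 2.0,
}

used = sum(existing_tasks_ms.values())
print(f"frame budget {FRAME_BUDGET_MS:.1f} ms, already using {used:.1f} ms")

for name, cost in new_features_ms.items():
    used += cost
    status = "fits" if used <= FRAME_BUDGET_MS else "over budget -> optimize or drop frame rate"
    print(f"+ {name}: {used:.1f} ms ({status})")
```

The point being: once the existing networks nearly fill the frame budget, every new feature either has to be optimized in or everything gets slower.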
 
FSDb stops at some small roundabouts and blows past others. I am still not 100% sure what the rhyme or reason is, but it's a thing. I think we are all waiting for this gigantic overhaul of how FSDb treats map data; it is really the thing that kills FSDb right now. My car will pick random places in the middle of the road where it wants to come to a full stop, which never happened before, and of course there are all sorts of wonky speed limit problems on highways.
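Conceptually, the map-vs-vision arbitration people are waiting for isn't complicated. Here's a purely hypothetical sketch (none of this reflects how Tesla actually does it), just to illustrate the idea that a confidently seen sign should override stale map data:

```python
from typing import Optional

# Purely hypothetical sketch of map-vs-vision speed-limit arbitration.
# It does not reflect Tesla's actual stack; it only illustrates the idea.

FALLBACK_MPH = 25.0          # assumed default for an unmapped residential road
SIGN_CONF_THRESHOLD = 0.8    # arbitrary detection-confidence cutoff
SIGN_MAX_AGE_S = 60.0        # how long a seen sign stays authoritative

def resolve_speed_limit_mph(
    map_limit_mph: Optional[float],
    sign_limit_mph: Optional[float],
    sign_confidence: float,
    seconds_since_sign: float,
) -> float:
    """Prefer a confidently detected, recently seen sign over the map value."""
    sign_is_fresh = (
        sign_limit_mph is not None
        and sign_confidence >= SIGN_CONF_THRESHOLD
        and seconds_since_sign <= SIGN_MAX_AGE_S
    )
    if sign_is_fresh:
        return sign_limit_mph
    if map_limit_mph is not None:
        return map_limit_mph
    return FALLBACK_MPH

# The school-zone case from earlier in the thread: map says 25, a temporary
# 15 mph sign was seen 10 s ago with high confidence -> target 15.
print(resolve_speed_limit_mph(25.0, 15.0, 0.95, 10.0))   # 15.0
```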
 

As a Tesla driver, FSD progress is somewhat frustrating.

As a TSLA investor, I've come to the realization that I LOVE how hard autonomous driving is. Why? Because when Tesla solves it, they are going to have an insurmountable advantage over everyone else.
 
Don't be so sure. Few here seem to understand the progress Chinese OEMs are making, or, for that matter, Mobileye.
 
I like how Tesla has PROVEN that people are willing to pay $15k for FSD Beta, with zero regulatory hurdles/consequences so far from letting the masses have the tech drive them around. Yet not one of these companies claiming robotaxis next month is letting its own customers have it for a fee. I like how it's all hidden behind a curtain with all sorts of claims of being better than any Tesla...and yet no one is monetizing it like Tesla.

Also, I like how Tesla, with the best AI engineers in self-driving, is not trying the methods of all these so-called competitors who claim to have found the panacea for true FSD.
 
Also, can someone explain to me why all of these cars that use Mobileye, like the Ford Mach-E, or all the lidars they put on the EQS (L3, they claim), the Lucid Air, or whatever, can't seem to beat a Tesla at active safety braking when NCAP actually puts them through a standardized test? So Mobileye is hiding their best stuff behind closed doors while letting the Mach-E hit bicyclists right now?

So far, all the stuff in actual customers' hands is trash, while the parent companies brag about FSD coming everywhere next month. Only Tesla, to my knowledge, has actually released real FSD to customers and is charging for it. Everyone else seems to be hiding something.