Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla.com - "Transitioning to Tesla Vision"

Saves it in your car. If you bring it to the service center they'll be able to pull it up in the logs.

And chances are they will not understand anything. They will just see that it braked. But in response to what? Does it save minutes' worth of video footage along with it? I don't think so! Without video it would be useless.
It could maybe help resolve things like wrong speed limits.

Tesla should be more transparent about how this data should be reported and how that process works.

To hear that Tesla Vision cars also phantom brake breaks my heart, LOL. They should really have all hands on deck fixing the existing issues that those of us who paid for it experience every day, instead of focusing 100% on something that many of us who bought FSD may not experience for another 3-5 years.
 
Or at least give people the option to also use 'dumb' cruise (like the older Model 3s, where not getting AP was an option). They have the code for this; they just need to add it to the AP options on the screen. But no way they will, because Tesla.
 
Still no firm timetable for when the FSD Beta button will widen to public release, or for when Vision will be available at all on radar cars.

Seems like it’s going to be at least the rest of the year before things get sorted out and aligned between features and cars.
Excuse me for being sarcastic, but as someone who fully paid for FSD in 12/2016, this “later this year” excuse has no juice left, IMO.
 
So I just got the 2021.4.21.3 update and decided to take a quick cruise to see if "minor bugs and fixes" helped with TACC at all. Sad to say no.
In 20 miles I had 5 unintended/phantom brake occurrences. One was a full regen slowdown from 60 to 25 mph while climbing a hill, with no other car in sight.
Two more phantom brake events: one while cresting a hill, and another only explainable by a tree shadow on a flat, straight section of road.
The last 2 unintended brake events were triggered by cars travelling in the opposite, oncoming lane, both on relatively flat stretches.

Sigh, I'll keep waiting for a fix I guess. So frustrating. If they can't figure this out, I need a dumb cruise option, or a different vehicle. I travel at least 120 miles a day. Decent cruise control is a must. I've only had the car since June and I'm already tired of throttle hovering. The whole experience feels like the antithesis of Tesla's vision. I've been doing the same commute with dumb cruise for 12 years now, so this feels like a step back in that regard.

I know in general it's better to be on the safer side of things and have it overreact, but this is too far on the safe side, to the point of being an inconvenience compared to my dumb old Toyota Camry. Tesla shouldn't feel on the hook for liability to inattentive drivers; let us maintain some sort of manual control. At least offer the option for now.
I don't know, do I call up Tesla service to voice my issue? I don't think anything can be done, but at least it would be on record.

Long winded, but my $0.02 on what I've been dealing with the last month or so.
 
Regarding this popular issue (new to me) about slowing down when approaching the crest of a hill:
In theory, any hill crest can be a "blind hill" if you know nothing about the topography of the road beyond it: there could always be a sharp drop into a local valley hiding a stopped or oncoming car, a big boulder, or some other hazard.

Obviously such blind hills do exist. However, they're rare, well understood to be dangerous, and typically mitigated with a warning sign, light, speed bump, or other measure.

In actuality, the sight horizon over a hill crest is part of standard road engineering practice and is based on standard geometry assumptions. For a human looking over a dashboard, the eye height above the road surface is usually taken to be at least 4 ft ≈ 1.2 m, perhaps a bit less for especially low cars. The height of hazardous debris is sometimes taken as 6" ≈ 15 cm; stopped or oncoming vehicles are of course much taller. These assumptions, along with (real) traffic speed, conservative reaction-plus-stopping distance values, and the possibility of oncoming passing-maneuver traffic in the driving lane, govern the acceptable crest profile needed to avoid a "blind hill" hazard.
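To put numbers on that geometry: the standard parabolic crest-curve relation ties available sight distance S to curve length L and the grade change A (percent) via L = A·S² / (200·(√h_eye + √h_obj)²) when S ≤ L. A minimal sketch follows; the function name and defaults are mine (I default to the 4 ft / 6 in figures mentioned above, while highway design typically uses 3.5 ft eye height and a 2.0 ft object), so treat it as an illustration, not anything Tesla uses.

```python
import math

def crest_sight_distance(curve_len_ft, grade_diff_pct, h_eye=4.0, h_obj=0.5):
    """Available sight distance S (ft) over a parabolic crest vertical curve.

    Uses the standard relation L = A*S^2 / (200*(sqrt(h_eye)+sqrt(h_obj))^2)
    when S <= L, and S = (L + C/A)/2 when the sight line extends beyond the
    curve. curve_len_ft = L; grade_diff_pct = A = |g1 - g2| in percent.
    Heights default to the post's figures (4 ft eye, 6 in object)."""
    c = 200.0 * (math.sqrt(h_eye) + math.sqrt(h_obj)) ** 2
    s = math.sqrt(c * curve_len_ft / grade_diff_pct)     # assume S <= L first
    if s <= curve_len_ft:
        return s
    return (curve_len_ft + c / grade_diff_pct) / 2.0     # S > L case

# Sanity check with highway-design heights (3.5 ft eye, 2.0 ft object):
# a crest curve with K = L/A = 151 gives roughly the 570 ft stopping sight
# distance used for 60 mph design speed.
print(round(crest_sight_distance(1510, 10, h_eye=3.5, h_obj=2.0)))  # ≈ 571
```

The point of the sketch is that sight distance over a crest is fully determined by the curve profile and two height assumptions, which is why a planner with elevation data could predict it in advance.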

So, if Tesla AP is braking abnormally when approaching a crest in the road, it suggests one of two things:
  • Tesla's driving decision policy is being quite overcautious in assuming a very unlikely road-design failure. As we know, such over-caution, leading to unexpected braking, is itself a traffic hazard.
    • This concern, BTW, could be mitigated by using the elevation data in established navigation maps. I don't know whether maps are typically marked with dangerous blind-crest locations, but that elevation topography data could be used to create such markers.
  • Tesla is not being purposefully over-cautious; the problem is that it cannot confidently determine the distance to the crest, and it interprets this as a possible rare blind-hill scenario for traffic coming the other way over the crest.
Given all of the above, my conclusion is that this is a very solvable issue as long as Tesla Vision can basically see. Unlike the pillar-camera situation, I don't see a fundamental geometry problem with the existing perception suite here; in fact, the forward-looking cameras sit higher than the human driver's eyes.
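The map-based mitigation floated above could in principle be prototyped from an elevation profile alone: march along sampled road elevations and check whether a straight sight line from driver eye height reaches a low object within the required stopping distance. The following is purely a toy sketch; `blind_crest_indices`, the thresholds, and the sample profiles are all invented for illustration, and a real implementation would need curved-earth and lateral-geometry corrections.

```python
def blind_crest_indices(dist_m, elev_m, eye_h=1.2, obj_h=0.15, need_m=160.0):
    """Return indices where the sight line from eye_h above the road cannot
    reach an obj_h-tall object within need_m metres ahead, treating the
    sampled (dist_m, elev_m) profile as the terrain. Brute-force O(n^3)."""
    flagged = []
    n = len(dist_m)
    for i in range(n):
        if dist_m[-1] - dist_m[i] < need_m:
            break  # not enough road left in the profile to judge
        eye_y = elev_m[i] + eye_h
        visible = 0.0
        for j in range(i + 1, n):
            dx = dist_m[j] - dist_m[i]
            if dx > need_m:
                break
            tgt_y = elev_m[j] + obj_h
            # the straight ray eye -> target must clear all terrain between
            clear = True
            for k in range(i + 1, j):
                t = (dist_m[k] - dist_m[i]) / dx
                if elev_m[k] > eye_y + t * (tgt_y - eye_y):
                    clear = False
                    break
            if not clear:
                break
            visible = dx
        if visible < need_m:
            flagged.append(i)
    return flagged

# Toy profiles sampled every 10 m from 0 to 290 m:
d = [x * 10.0 for x in range(30)]
flat = [0.0] * 30                                  # flat road
tent = [7.5 - 0.05 * abs(x - 150.0) for x in d]    # 5% up/down crest at 150 m

print(blind_crest_indices(d, flat))            # [] -- nothing flagged
print(0 in blind_crest_indices(d, tent))       # True -- the climb is blind
```

Precomputing flags like these offline and shipping them with the map would let the planner coast early only where the geometry genuinely warrants it, instead of guessing at every crest.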
 
  • Like
Reactions: Dan D.
Regarding this popular issue (new to me) about slowing down when approaching the crest of a hill:
In theory any hill-crest can be a "blind hill" if you know nothing about the topography of the road that lies beyond it. It always could be a sharp drop into a local valley that could hide a stopped or oncoming car, big boulder or other hazard.

Obviously such blind hills do exist. However they're rare, it's well-understood to be dangerous and it merits a warning sign, light, speed-bump or other hazard mitigation.

In actuality, the sight horizon over a hill-crest is part of standard road engineering practice and is based on standard geometry assumptions. For any human looking over a dashboard, the eye height above the road surface is usually taken to be at least 4ft ≈ 1.2m, perhaps a bit less for especially low cars. The height of hazardous debris is sometimes taken as 6" ≈ 15cm; of course stopped or oncoming vehicles are much taller. These assumptions along with (real) traffic speed and cconservative reaction+stopping distance values, and the possibility of oncoming passing-maneuver traffic in the driving lane, all govern the acceptable crest profile to avoid a "blind hill" hazard.

So. if Tesla AP is braking abnormally upon approaching a crest in the road, it suggests one of two things:
  • Tesla driving decision policy is being quite overcautious in assuming a very unlikely road-design failure. As we know such over-caution, leading to unexpected braking, is itself a traffic hazard.
    • This concern BTW can be mitigated by using the elevation data in established navigation maps. I don't know whether maps are typically marked with dangerous blind- locations, but said elevation topography data could be used to create such markers.
  • Tesla is not being purposefully over-cautious, but the problem is it cannot confidently determine the distance to the crest, and it interprets this as a possible rare blind-hill scenario for traffic coming the other way over the crest.
Given all of the above, my conclusion is that this is a very solvable issue as long as Tesla Vision can basically see. Unlike the pillar-camera situation, here I'm not seeing a fundamental geometry problem with the existing perception suite. In fact the forward-looking cameras are sitting higher than the human driver.
Great thoughts on the blind-hill braking analysis. I haven't experienced this myself, so I'll just assume that what's being reported is actually happening to some people.

Perhaps if Tesla drove in a more human way, the effect would be less startling. For example, I might coast a bit before a blind hill crest, anticipating a situation I can see well ahead of time. Assuming Tesla is actually braking for the blind crest as a cautious decision policy, then it just comes down to its reaction window being too short. Say we start at 60 mph and slow to 40 mph at the lowest: I would have coasted sooner and maybe braked more gently. Something that doesn't coast and then brakes late just feels aggressive, even though the end speed is the same.
 
That certainly is a good theory and could explain what it's doing. The problem is I live in a river valley with lots of rolling hills and valleys, so it happens frequently on every drive, and the inconsistency is baffling. Sometimes it brakes on very shallow hills where you can see most, if not all, of the road ahead, and sometimes it won't brake or slow on very sharp crests where you could assume the road just ends at the top.

However, it occurs more often than not with oncoming traffic; that's what consistently triggers these events. It's like the system can't vector the images properly without seeing the road lines and just assumes the other vehicle is coming straight at me. If road lines are visible next to the oncoming vehicle it doesn't happen often, although interestingly enough it did yesterday, twice.

As others have mentioned, I wonder if a temporary loss of the GPS or LTE data connection causes the car to freak out, since it doesn't always know how to negotiate the images it's seeing. Where I live the connection is spotty.
 
Anyway, tweets like that really make you question Musk's competence regarding FSD, or his integrity. It is hard to explain how he could say that.

I would say even the way TACC/AP handles on the released version puts into question whether anyone high up at Tesla actually DRIVES.

Like one of the first things I'd have the engineers do is have the car recognize turn signals from other cars that want to get into your lane, so it can slow down to let them in, or speed up (if a-hole mode is enabled).

Then there is the constant need to re-center anytime a merge point occurs.
 
I'd like to see a progress report showing how, on a radar-equipped car (with dual computers; it would have to be, since you can't run one computer in 'dual mode'; physics isn't like that, lol), the vision-only scheme produces fewer false positives and a better overall drive experience.

Is there such a thing? Internally, I would have to assume so; but do we get to see the A/B parallel data, and how it SHOULD get better and better over time for the vision-only car?
 

Karpathy claimed that they ran vision in shadow mode and compared it to radar until it was good enough.

But one phantom brake is one too many.
 
People of science trust numbers, not word of mouth.

And again, shadow mode is BS; what I am describing is true parallel mode: one codebase on one CPU and one on another, plus a dual array of sensors, so that you truly have two cars on one chassis (computer-wise). You run them both and you log data from both.

THAT is what I'm talking about. Until we see that, there is zero proof for the user community. They should be proud to show us proof if this is actually real.
 
You are assuming they have any need to "show proof" (especially with the rigor you are suggesting). Karpathy's talks are not for a general audience in the first place. If you buy a Tesla Model 3/Y today, you have no choice in the matter, and yet they are selling like hotcakes. They really don't have any incentive to show anything. All they really need to do is get things back to "parity" in terms of the more explicit limitations (I believe the only two remaining are top speed and the follow-distance setting, which recently went down to 2).
 
I'm speaking as an owner, but also as someone technical enough in the field to know progress from BS. I do want to believe they have made progress, but it's a higher bar to actually SHOW it.

I think enough people here are technical enough to appreciate a dual-stack drive-thru demo series.

Tesla DOES have to prove this; if we are to trust it with our lives, YES, they DO have to prove it, and keep proving it for a few years, before that seemingly asymptotic Level 5 can be reached.
 