Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Absolutely.

Not everybody has the intestinal fortitude to trust their own skills enough to push the envelope to the extent necessary to gain confidence in the system. Doing so may require, on the tester's part, a deeper understanding of the physics and the design, and the ability to predict how far to go before "ejecting"; that is what separates test pilots from recreational pilots. It is a Beta TEST, emphasis on the "test" part. People not accustomed to risk-taking may not perform well in such an environment.

Sounds like this guy is unable to execute the task of letting the system work in order to see how it does on its own. At least not as effectively as many others have demonstrated.

If a Beta participant is unable to channel their inner Chuck Yeager as well as many others have in the real-world test of FSD, the system may end up being blamed in instances where FSD would have successfully completed the task had the driver not chosen to take over.

This illustrates the difference in comfort level between those able to effectively test FSD Beta and those who are the target cases for FSD to exist.
This is 100% my experience driving FSD Beta for over 6 months ... many of the situations where I take control are as follows:

I am not comfortable with the way FSD Beta is handling a given situation, so I take over.
It may handle a majority of these situations in an acceptable manner; I just don't want to risk an accident or damage that will likely take months to fix ...
I am an unpaid tester, not an idiot :p
 
This attitude will eventually lead to major problems. If you trusted my car on FSD, you would be injured or dead right now.
 
Sounds like you're just making excuses and don't know anything about "this guy". Maybe do some research first. I'll help:


I honestly don't understand his rant. He is acting like he doesn't understand FSDb. People who have used it a while understand where it will fail and where it will succeed. Failure areas are usually places incompatible with the map data, or spots where the car chooses the wrong lane and tanks its probability of success. This is rather consistent, and with every update we usually test the same areas to see if there are any improvements.

With that said, there are also many routes where FSD will give you intervention-free drives, and this set is slowly expanding wider and wider. The car also performs unprotected turns WAY better than before. His 10% unprotected-left figure is BS; I'm at an 80-90% success rate. It used to be random luck.

His "I need to intervene 10 times in a mile" is just completely BS after he acknowledges that lane keeping is solid. 99% of anyone's mileage is going in a straight line. Take a length of road to a destination, calculate how much of it is turns and how much is just going straight. He pretty much agrees that for 99% of the drive he doesn't need to do anything, even IF he intervenes at every turn.
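The back-of-the-envelope argument above can be put in code: even if every single turn needed an intervention, turns are a tiny fraction of total route distance. All the segment lengths below are made up purely for illustration.

```python
# Rough illustration of the point above: even if every turn required an
# intervention, turns are a tiny fraction of total route distance.
# All segment lengths are hypothetical.

route_segments = [
    ("straight", 1600.0),  # metres of plain lane-keeping
    ("turn",       15.0),  # an unprotected left
    ("straight", 2400.0),
    ("turn",       12.0),  # a right turn
    ("straight",  900.0),
]

total = sum(length for _, length in route_segments)
turning = sum(length for kind, length in route_segments if kind == "turn")

print(f"turning fraction: {turning / total:.1%}")  # well under 1% of the route
```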
 
Jason’s experience reflects mine. FSD is worse than useless on my car; it’s highly dangerous. Car-to-car variance is apparently very high. For those who don’t know, Jason probably knows more about Teslas than anyone else outside Tesla. They tried to hire him, but he won’t leave Hickory. Calling him a whiner is unfortunate at the least. He calls ’em like he sees ’em. I see the same.

Tesla sent FSD team to Florida to work on Chuck Cook's left turn. Maybe they can send some people to North Carolina to work on wk057's car "trying to kill him".

Honestly I have no problem with people saying FSDb doesn't work on their car or in their area the way they like. I just don't subscribe to the line of thought that the car is sentient and is trying to kill someone on purpose. And I don't think it's accurate to use that language if you just think it's not sentient and isn't up to the task of driving safely.

"almost killed me" when X happened is fine, "tried to kill me" when X happened is not. And either one should be reserved for the worst possible scenarios. I don't believe you if you mention a minor issue that "almost killed you".
 
He has a long history of both having engineering rigor and having very strong opinions.

With that being said, this video is highly inconclusive and I hope he posts more videos. Looks like someone posted a comment that breaks down his issues ;)

Would be nice to see the build #, a drive at the speed limit, video captured directly from the MCU, and a better view of the road. In general, FSD is currently overly cautious; the beta has been extended to more than a quarter million cars, so safety is paramount. Yes, some turns are going to seem to take forever and there will be issues. I get frustrated as well from time to time, so I get it.

0:40 NP (Neural Planner) plans a path close to the stop sign. Passes the poles fine, however. Due to the takeover, not sure if this would have resulted in a good or bad turn. To the left is occluded by a chain-link fence: Google Maps
1:11 Lanes net issue; a detected turn-only lane should be fed into the inference stack through sparse map data as a hint to lanes net.
3:27 NP incorrectly plans a path to the left of a waiting car. DE (Data Engine) incorrectly classifies the waiting car as parked, and LN (Lanes Net) incorrectly.
3:28 The visualization can't be seen at all due to an edit, so I can't tell what was happening afterwards.
3:40 After the edit, it appears that after the takeover FSD switches to AP. You can see the lane lines that AP sees vs. the FSD visualizations. At 3:51 it switches back. I've not seen this occur without a UI warning popping up, so I wonder what build is being used in this video and whether it has somehow been modified.
4:10 FSD seems to do the right thing, as the NP decides not to go (line turns grey) when the takeover occurs. Notice that it again switches over to AP visualizations without a warning.
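The suggestion at 1:11, feeding sparse map data into the inference stack as a hint to a lane-classification net, can be sketched generically. To be clear, nothing here reflects Tesla's actual stack; the function, names, and numbers below are all invented for illustration.

```python
# Hypothetical sketch of the "sparse map data as a hint" idea: blend a
# vision model's per-lane-type scores with a prior derived from map data.
# None of this reflects Tesla's actual stack; names and numbers are invented.

def fuse_lane_scores(vision_scores, map_prior, map_weight=0.3):
    """Blend vision scores with a map-derived prior, then renormalize."""
    fused = {
        lane: (1 - map_weight) * vision_scores[lane] + map_weight * map_prior[lane]
        for lane in vision_scores
    }
    norm = sum(fused.values())
    return {lane: score / norm for lane, score in fused.items()}

# Vision alone slightly favours "through", but the map says this is a
# turn-only lane; the fused estimate flips the decision.
vision = {"through": 0.55, "turn_only": 0.45}
prior = {"through": 0.05, "turn_only": 0.95}
fused = fuse_lane_scores(vision, prior)
print(max(fused, key=fused.get))  # turn_only
```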
 
I'm embarrassed to say that I said exactly that to the poor tech guy at Tesla about 4 years ago, after AutoPilot made an unexpected turn on a 2-lane road. I got excited in a not-good way, but I wanted to make sure they saw exactly what happened and took appropriate driver-assistance programming action.
 

FSD is making life safer for those around the Tesla...

 
Is the FSD approach even possible in those tunnels? The side repeater cams see only white walls, and the front cameras see only a small black triangle ("black road going near a point") on a white background.
The car is surrounded by white surfaces and lacks sufficient information to recreate a 3D space.
A different stack, or maybe a 2D line-guide system (Autopilot v1), should already be enough to successfully drive through the tunnels.
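For illustration, a "2D line guide" controller of the sort described is essentially classic lane centering: steer proportionally against lateral offset from a painted line and against heading error. This is a toy sketch; the gains, sign convention, and limits are all invented.

```python
# Toy "2D line guide" steering of the kind AP1-style systems use: steer
# back toward the centreline based on lateral offset and heading error.
# Gains, units, and the sign convention are arbitrary and illustrative.

def steer_command(lateral_offset_m, heading_error_rad,
                  k_offset=0.5, k_heading=1.0, limit=0.3):
    """Proportional steering from offset and heading error, clamped to limit."""
    cmd = -k_offset * lateral_offset_m - k_heading * heading_error_rad
    return max(-limit, min(limit, cmd))

# Car 0.4 m right of the line, pointed slightly further right:
print(steer_command(0.4, 0.05))  # negative => steer left, back to the line
```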
If it can't work in a tunnel, it's pretty terrible. There are lots of tunnels in the real world, and FSD should find the Boring Company tunnels extremely easy: no pedestrians, and all the vehicles are the same size and type.
I'd be very worried about FSD if the reason it's not in the Vegas Loop is that that's beyond its abilities!
 
A tunnel is possibly the easiest place to run FSD. OccNet is on vacation, just need lanes net.

In all seriousness, it is actually very easy for OccNet to map this space.
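To illustrate why a uniform tunnel is such an easy case to map: the drivable space is just a band between two walls. A minimal hand-rolled occupancy-grid sketch follows; the grid size and tunnel width are invented numbers, not anything from Tesla's OccNet.

```python
# Minimal occupancy grid for a straight tunnel cross-section: two walls,
# free space in between. Grid size and tunnel width are invented numbers.

WIDTH_CELLS = 9          # cells across the tunnel, 0.5 m each
TUNNEL_HALF_WIDTH = 2.0  # metres from centreline to each wall

def tunnel_row():
    """One row of the grid: 1 = occupied (wall), 0 = drivable."""
    row = []
    for i in range(WIDTH_CELLS):
        x = (i - WIDTH_CELLS // 2) * 0.5   # lateral position in metres
        row.append(1 if abs(x) >= TUNNEL_HALF_WIDTH else 0)
    return row

grid = [tunnel_row() for _ in range(20)]   # 20 rows ahead of the car
print(grid[0])  # walls at both edges, free space in the middle
```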
 
It's probably more to do with the stopping and starting and to'ing and fro'ing at either end picking up passengers than driving in the tunnel. I mean, how am I going to drive through the Mersey Tunnel otherwise?
 
Precisely my point in calling him a whiner. When you provide a report, you need to provide a full picture of where it works fine and where it fails. Looking at umpteen videos, it appears all the vanilla drives (simple UPLs, right turns) are largely intervention-free. I just can't believe it when he says it fails 75% of right turns.
 
He has a long history of both having engineering rigor and having very strong opinions.
Here's one of those strong opinions inbound. Your "diagnosis" is mostly nonsense.

Would be nice to see the build #
It's been in the description since I posted the video.
drive at the speed limit
lol. That would probably be the least safe thing someone could do around here, or in most places.
capture the video directly from the MCU
Feel free to shoot me something I can use to do this. Until then a GoPro aimed at the screen will have to suffice.
and have a better view of the road.
I can see the road fine... 🤷‍♂️
[...] 0:40 NP (Neural Planner) plans a path close to the stop sign. Passes the poles fine however. Due to takeover, not sure if this would have resulted in a good or bad turn.
The issue is that it was driving off the road into the adjacent parking lot. I've no idea if it would have hit the stop sign or whatever because I wanted to stay on the road.

To the left is occluded by a chainlink fence: Google Maps
Chainlink fences are not opaque last I checked.

1:11 Lanes net issue, detecting a turn only lane should be fed into the inference stack through sparse map data as a hint to lanes net.
👍
3:27 NP incorrectly plans a path to the left of a waiting car. DE (Data Engine) incorrectly classifies the waiting car as parked and LN (Lanes Net) incorrectly. 3:28 The visualization can't be seen at all due to an edit so I can't tell what was happening afterwards.
The visualization is front and center throughout the entire video. The only edits made are to speed up the video, rewind and replay sections, etc. Saying I edited and hid the visualization somehow is ludicrous. It's right there the entire time. Where I rewind at 3:28, it just replays that entire section again slowly, again uncut. So don't lie.

3:40 After the edit, it appears that after the takeover, FSD switches to AP. You can see the lane lines that AP sees vs the FSD visualizations.
At 3:51 it switches back. I've not seen this occur without a UI warning popping up so I wonder what build is being used in this video and if it has somehow been modified.
Again, there was no edit.

Any time I disengage by turning the wheel it switches back to the AP visualization for a moment. I've no idea why, but it does this on every vehicle I've used the beta in. Reengaging gets it back, or just drive a moment and it'll come back.

Nothing's been modified. In fact, this is my wife's 3 and is completely stock.
4:10 FSD seems to do the right thing as the NP decides to not go (line turns grey) when the takeover occurs. Notice that it again switches over to AP visualizations without a warning.

Nope!

The little blue creep wall in the visualization moved backwards, so the car decided it was no longer bound by it, fully let off the brake, and started accelerating. Since I could clearly see the car to the left, my only real option was to nail it. In hindsight, I should have jumped to the other lane entirely.

Again, the AP visualization thing happens on every car I've used this on when disengaging.

---
 
Precisely my point in calling him a whiner. When you provide a report, you need to provide a full picture of where it works fine and where it fails. Looking at umpteen videos, it appears all the vanilla drives (simple UPLs, right turns) are largely intervention-free. I just can't believe it when he says it fails 75% of right turns.
Mine fails 100% of right turns.
 
Feel free to post another video and we'll see what changes you make :)

Chris, Chuck and Omar have fabulous videos!
 
Mine fails 100% of right turns.

Mine refuses to leave its current route 100% of the time…
 
Now that all the black ice is off the roads here and we can use FSD (latest version) again, I gotta say it's definitely an improvement, and this time really with no regressions. It's super smooth in so many situations now where it was clunky before. It still fails the same two intersections (lane choice) and still puts the signal light on for no reason, but overall it's beginning to act very human. I would never trust it without constant supervision yet, but it's definitely less "dramatic" than 3 months ago. 😊

Jmho.
 

Mercedes is the first automaker to receive such approval in the U.S., beating out names like Tesla, General Motors, Ford, and even Honda. The auto giant holds the title as the first automaker to put out a production vehicle with Level 3 autonomy in its Japan-only Honda Legend sedan.
Terrible misleading article.

It’s self-certified. Nevada doesn’t certify individual OEMs.

L3 is limited to 40 mph on *freeways*.
 
Mine fails 100% of right turns.
I find this to be 100% suspect... as in, you do not have FSD Beta enabled. In my vehicle, FSD Beta handles the majority of RH turns, even on one-way streets with cars parked right up to the corners of the intersection.

I can't remember if FSD Beta will make a right turn just by signaling... you may have to be navigating for it to execute. Will need to check this one.
 
Mine fails 100% of right turns.
Yes, we know that your non-Tesla car fails 100% of right turns. It also fails to lane-keep 50% of the time, couldn't engage most of the time, and doesn't stop for the car in front 20% of the time. This is with over 50k miles of trying to use these other "motion drive pilot assist" systems, or whatever the F they try to brand them.