Welcome to Tesla Motors Club

FSD rewrite will go out on Oct 20 to limited beta

Not an expert on anything, but I'd like to add a comment. I'm 80 years old and just helped an 80-year-old friend drive her Mercedes 900 miles over two days. We were both exhausted, which surprised me, as I had just driven my Tesla on Autopilot 720 miles by myself in one day and finished the drive not tired at all. Autopilot, right now, is a huge boon to me. It is very relaxing to just sit there watching for the occasional problem down the road, or to set the car to follow a truck through a busy city. I would happily take what they have now if they would give it to me.
There should be a senior discount, especially if FSD can save lives,
like for someone with a heart condition and a risk of seizure while driving.
 
This might be the ultimate torture test:

FSD Beta on Lombard St:


Someone needs to try FSD in NYC. That street is nothing. An NYC driver will do that in reverse while drinking a coffee.

I'm skeptical FSD would even work in NYC. We don't really have lanes when there is traffic, and we pretty much play chicken all day with each other trying to merge. The car would have to learn to be aggressive enough to get within a few inches of another car and then beep its horn in anger.
 
Pretty sure some of the owners here would faint at the thought that parallel parking there often includes lightly bumping into parked cars.
I drove in NY once or twice. Almost died. One time, on the freeway on Long Island, some guy in an SUV already on the off-ramp decided to take an almost 90-degree turn to the left back onto the freeway and almost plowed into us. I caught up to him, and while passing I looked over; he had a smile on his face like 'what's the big deal??'.
 
But, people have failed time and time again to supervise things.

They get bored, and their mind wanders.

What's really worrisome about FSD is not that it won't get better, but that it will quickly get substantially better without crossing the 99% mark. With the current state of FSD (the wide release version) it constantly reminds me to be vigilant.

With the FSD re-write (or whatever one wishes to call it) I expect some pretty substantial improvements.

What I don't know, and can't predict, is how close it will get to lulling me into a bored state of mind where I stare into the empty void with my mind completely elsewhere.

Yet I'm someone who strongly believes that the current HW3 sensor suite doesn't have enough redundancy for L3/L4/L5 autonomous driving. So I have every reason to stay vigilant, yet even I have limits on what I can supervise.

2021 should be a pretty exciting year for FSD, because it's the year that either L2 everywhere (basically an improved version of the FSD beta) gets released and we go down the path of proving it's safe enough for autonomous driving without human supervision, or regulators and insurers shut the whole thing down and put limits on what L2 vehicles can do, as European regulators did with AP.
Yes. The problem arises when FSD is almost good enough. If you give people the option of not monitoring the car, they will take it.

I'm probably on the bad side of the Gaussian distribution, because even with AP1 I sometimes read my phone and type messages. With almost-good-enough FSD I would do it much, much more. I hope Tesla implements camera-based driver monitoring ASAP.
 
NYC really isn't that bad because the speeds are pretty slow. Training a computer to be aggressive enough seems like an extraordinarily hard problem, though. It seems like it would just instantly get stuck once everyone realizes it can be bullied.

Yes, it happened to me like you said. The car stopped on the expressway and started letting everyone on the ramp go. It was pretty funny for us. Then it couldn't lane-change multiple times. It's something they will need to figure out. The car will need to make judgment calls on the chances of that guy not letting you in, or of him backing off before hitting you. We all do this without thinking about it, but it's really extreme in NY.

Story: last year my girlfriend was driving on the service road when a Ford Explorer got off the exit. He merged into us and the cars got stuck together temporarily. They saw each other and both made a bad call. The thing is, he didn't care about his car; it wasn't even his, and it was older.

How much risk should it take? Should we be able to control that risk? I think so.
 
In my opinion, beta city FSD is like using an autopilot with 1% of human skill to fly a Blue Angels show. It probably wouldn't make things easier even if it had 50% of human skill. I don't think it's analogous to an airplane autopilot the way Autosteer and TACC are (and even that analogy has limitations, since there are far more things to run into on the ground than in the air).


I don't know how you can make that comparison unless you have personal experience flying in a Blue Angels aircraft and also have experience using beta city FSD; this is purely conjecture on your part. That's fine, you are entitled to an opinion, but without real-world experience in both, it's nothing more than an opinion.

It's true there is more stuff to run into on the ground, but you are moving far, far more slowly and have a lot more time to react to most situations. Even at 150 kts, a "fast" small-aircraft speed, it's very easy for the inexperienced to "get behind" an aircraft, much less when you are going 300 or 400 kts. Lots of things are happening very fast, especially after takeoff and during approach and landing, and you have to keep up with many things, including changing weather, traffic, checklists, ATC communications, changes in course and altitude, etc. "Getting behind the aircraft," as it is referred to, kills a number of non-professional pilots every year. Being properly trained and experienced in the use of autopilots, and having one available, increases pilot safety by a good deal.

It's much more challenging for a human to learn to fly than to drive a car, especially in bad weather, and it takes a great deal of training and even more experience. City driving is a piece of cake by comparison.

Tesla Autopilot is analogous to an aircraft autopilot because autopilots in aircraft are workload-reduction devices, and they are exactly the same thing in cars. Too many people who own Teslas do not understand this very simple fact; they treat them as fail-safe, "set it and forget it" systems, and then complain when they don't work the way they expect.

I treat the autopilot as a workload reduction feature, and that’s exactly what it does. I am far less tired driving my Tesla over distance than any other car I have ever driven. Most of that is from autopilot use.

I cannot speak to beta FSD city driving, as I have zero experience with it. Perhaps you are right, beta city FSD is more demanding than simply driving the car, but highway FSD is far less demanding, and I anticipate an improved city FSD by the time it reaches my car.

Remember, this is all very new, so all of us will have a learning curve in using it, and at first that may make it seem more difficult than simply driving the car. But once you learn the "quirks" of city FSD, such as where and when it's likely to fail, the experience will become more familiar and you will be more comfortable with it.
 
How reliable would an autopilot system need to be for you to use it for takeoff and landing?
It's true that I have not used city FSD or flown an aircraft. I'm just going from watching the videos and the many situations I see where an error by the car would require an extremely quick correction. I agree that highway Autopilot is a great workload reducer and is safe as long as most people are not abusing it. The problem with trying to learn the "quirks" of the system is that it might do something correctly 100 times in a row and screw up on the 101st. Becoming comfortable with it is the problem. I guess we'll see what happens.
 
Interesting video from a drone. It shows a bad miss at 1:25, and also good driving at 6:00, where it purposely crosses the line because of parked cars.


He seems like a little too much of a drama queen, and I could tell he isn't familiar enough with his car to know when it's too close to an object. At around 1:56, he's nowhere near close enough to that truck to react the way he did. And it took him about two days to figure out that the in-cabin camera position was poor. It's unfortunate so many are referencing his videos. Can't wait to actually test this myself.
 

1:56 is the car slamming on the brakes, not him. That was phantom braking, I think, maybe from the car's shadow. It's pretty bad.

You can tell because he is still in FSD even after the car starts moving again; it doesn't disengage until he pulls over afterward. You can hear it. Also, if you watch the screen, he gets that "be prepared" alert a half second prior.

Also, I don't think it needed to cross the lines. I guess it's an option, though. Personally I wouldn't have; there was enough room.
 

Thanks. If you look later in the video, at around 6:04 and other parts, you'll see that it brakes for parked cars. At 1:56 it braked harder than normal, and it did so precisely when the truck box icon changed from green to red. It was overly cautious (maybe it thought it was too close). There are lots of shadows throughout parts of the route that it didn't brake for, so I doubt it was that. Still, rather than trying to understand what happened, he reacted like a teenage girl just learning to drive. As a beta tester, he should have turned around and made a second pass to see if it would react the same way, possibly figuring out what prompted it.
 
I have to respectfully disagree with your second sentence. Monitoring a system is not anywhere near as much work as doing it all yourself.

Agreed. The workload reduction lets you put more of your mental capacity into situational awareness.


Clearly, Tesla will have to implement "New Jersey" mode... "Hey, I'm drivin' here!" This will probably require a hardware upgrade so the car can give other drivers the finger.
 

Maybe it was the truck's brake lights turning on.