Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

FSD Beta - Hands & human drivers being unsafe?

In most videos I see, the human driver has their hands on their legs or lap.

I think this is very unsafe. It dangerously increases driver reaction time in a situation where the human driver already has very little time left to prevent an accident.

Here is one example of what I think is a good, safe driver for FSD Beta:
First drive with Tesla Full Self Driving (FSD) Beta code in North Carolina
Zeb Hallock
Oct 25, 2020


What do you think? Should the human drivers of FSD Beta always have their hands on the steering wheel?
 
I have watched a lot of FSD Beta videos, and the video I posted earlier was the first I have seen where the driver had hands on the wheel for 95% or more of the trip.

Maybe people can post links here to other examples of good FSD Beta safety drivers who have their hands on the wheel most of the time the car is operating in FSD Beta mode.
 
From the videos I’ve seen, it looks like the nags are still there, so they don’t have their hands off the wheel for too long.

I do not think the old nag prompt is enough. The original Autopilot was used mostly on divided highways where all surrounding traffic was going the same direction, with no turning at intersections and few pedestrians or bikes. That is essentially smart cruise control, which has been around for years and is much easier than what FSD Beta's new functions are trying to do.

The issue is reaction time. The critical maneuvers do not leave a lot of leeway. You need to give FSD Beta some time to do the right thing, which leaves the human driver very little time to realize the car is not going to do the right thing and to actually correct the car's movement.

This FSD Beta needs much more attention from the human safety drivers than the previous style of Tesla FSD.

The human driver should give full attention (as if actually driving) and keep hands on the wheel at all times.
 
I agree, but it is a much more difficult problem to solve. For example, should the driver keep their hands on the wheel while the car is making a 90 degree turn? If so, how do you detect it? How do you prevent that detection from interfering with the actual turn?

Obviously, the solution is a more robust detection method, perhaps using the inside camera instead of (or in addition to) the current nags. Of course, the better the autonomous driving becomes, the harder it becomes to catch the errors.

I think the best solution is to go right to level 4, and remove the driver from the whole mess. :)
 
I agree, but it is a much more difficult problem to solve. For example, should the driver keep their hands on the wheel while the car is making a 90 degree turn?

Absolutely!!!! Yes, both hands on the wheel as the car makes a 90° turn. Let the wheel slide through your hands, with hands in driver control positions (9 & 3 o'clock or similar).
See the first video; that is what he does.

Most 90° turns are at intersections or are left-hand turns in front of oncoming traffic. This is where most non-highway severe and fatal accidents occur.

At all times, hands should be in driver control positions, NOT one hand or both hands resting at the bottom of the steering wheel.
 
Absolutely!!!! Yes, both hands on the wheel as the car makes a 90° turn. Let the wheel slide through your hands, with hands in driver control positions (9 & 3 o'clock or similar).
Ok. How do you propose the car sense the driver's hands on the wheel during a turn?
 
I think it depends on the situation. If there's going to be a failure of FSD (or an incorrect decision) and intervention is needed quickly, then one should escalate one's readiness to match. In other scenarios, there'd be a lot more time before an intervention is needed and with lower stakes (no pedestrians will be mowed over). I think Zeb did a great job (watched most of his video) along with James Locke in Santa Clarita.
 
This is NOT because of hands not being on the steering wheel, but 99.99% driver distraction or tailgating other vehicles.

I do not think 99.99% are distractions or tailgating.

A lot of it is that there is just much less room for error at intersections.
You might have traffic coming from three directions at once, with each oncoming vehicle at a different speed and distance.
Each vehicle can be a different size and color, so estimates of speed and distance can vary.

When stopped and then making a left turn into a standard narrow two-way street, there are a lot of hazards and a lot of precision needed:

How fast and how far away are the oncoming vehicles from the left, from the right, and from across the street toward you?
Navigate precisely so that you do not steer into the oncoming traffic lane on the left.
Between the two traffic directions on the left there might even be a thin, low divider.

If the left side has straight-on oncoming traffic and also its own left-turn lane (with a car waiting to turn), then when you turn left you can pass very close to the waiting car. If the Tesla makes a mistake and turns too sharply, you have hardly any time to correct it and prevent the Tesla from crashing into the front of the waiting car or the median.

When time periods are this short, the whole safety-driver concept is flawed. The safety driver needs time just to recognize that the car is starting to do something wrong. Driving the whole maneuver manually has a lower reaction time than FSD plus a safety driver.
Most of the time there is still enough time either way, but sometimes the combined reaction time of FSD plus a safety driver will simply be too long. This is real-world physics.
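To put rough numbers on this, here is a quick back-of-the-envelope sketch. All the figures in it (the grace period you give FSD, the human recognition time, the intersection speed) are illustrative assumptions on my part, not measured Tesla data:

```python
# Back-of-the-envelope sketch of the combined FSD + safety-driver delay.
# All numbers are illustrative assumptions, not measured Tesla data.

def distance_traveled(speed_mph: float, seconds: float) -> float:
    """Distance in meters covered at a constant speed."""
    meters_per_second = speed_mph * 0.44704  # 1 mph = 0.44704 m/s
    return meters_per_second * seconds

# Assumed time budget before a human correction takes effect:
grace_period = 1.0   # seconds you give FSD to "do the right thing" (assumed)
recognition = 1.5    # typical human hazard-recognition/reaction time (assumed)
total_delay = grace_period + recognition

# At an assumed 30 mph intersection approach speed:
d = distance_traveled(30, total_delay)
print(f"Car travels {d:.1f} m before the human correction even begins")
```

With those assumed numbers the car covers over 30 meters before the safety driver's correction even starts, which is more than the width of many intersections.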
 
It was inevitable. They should be using trained drivers with cameras to monitor their attention. Instead they have untrained members of the public, some of whom are trying to make a quick buck from hands-off YouTube videos.

Just waiting for the first victim now.
 
I do not understand what problem you are anticipating with hand detection.

How short is the no-driver-detected timer?

If the driver's hands are properly on the wheel before the turn starts, why should detection be an issue during the short time the turn is being made?
I'm not sure whether your concern is that drivers in the current crop of videos generally are not keeping their hands on the wheel as much as they should, or whether you are worried that putting this out to the general public (or even a larger set of testers) will result in more accidents due to not enough hands on the wheel.

If your concern is the first situation, then, yes, I agree that Tesla should perhaps communicate to that small group that things are not as good as they seem and they should be on a hair trigger to take over at any moment. I do see the urge to do a "look, ma, no hands" when making videos so that it is clear the car is doing the driving. But this is a small group and both needs can be met with sufficient care.

On the other hand, if your concern is the second situation, then there really is no amount of training or disclaimer screens or phone calls from Tesla that can prevent the larger public from going hands off as much as possible. Therefore, you have to design in a system that verifies a certain level of interaction of the driver with the environment to give the driver and car a reasonable chance at preventing any dangerous unforeseen circumstances.

In this case, Tesla has limited options in the current car models. There is the steering wheel torque sensor we are all aware of. The problem with it is that it is an indirect way to monitor driver engagement, as shown by the number of people working to defeat the sensor by hanging oranges and weights off the steering wheel. This sensor becomes even less effective when the car is operating on city streets, where large turns of the steering wheel are frequent. Sure, the driver can let the wheel slide through their hands while the turn is being made, but using the torque sensor during that maneuver would be quite difficult: if the driver puts too much pressure on the steering wheel, it will disengage in the middle of a turn; too little, and the car will have to break off the turn mid-maneuver due to a non-detected driver. Many other circumstances can be imagined that cause similar problems with that sensor. Of course, you could limit the nags to just when the car is going straight, but that is pretty much the condition we have right now, so that clearly doesn't solve your "hands off wheel" concerns.
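The torque-sensor dilemma described above can be sketched as a toy decision function. The threshold values and the function itself are hypothetical, purely for illustration; they do not reflect Tesla's actual firmware logic:

```python
# Toy sketch of why a torque-only sensor is ambiguous during turns.
# Thresholds and names are hypothetical illustrations, not Tesla's firmware.

DISENGAGE_TORQUE = 2.5  # Nm: input strong enough to count as a takeover (assumed)
DETECT_TORQUE = 0.3     # Nm: minimum resistance that proves hands-on (assumed)

def assess_driver(driver_torque_nm: float, autopilot_turning: bool) -> str:
    """Classify driver engagement using only a torque reading."""
    if driver_torque_nm >= DISENGAGE_TORQUE:
        # Strong input always reads as a takeover -- even if the driver was
        # just gripping firmly while the wheel spun through a 90° turn.
        return "disengage"
    if driver_torque_nm >= DETECT_TORQUE:
        return "hands detected"
    # Below the detection floor the sensor sees nothing, even if hands are
    # resting lightly on the rim and letting the wheel slide through.
    if autopilot_turning:
        return "abort turn (no driver detected)"
    return "nag (no driver detected)"
```

Either failure mode is bad mid-turn: pressing hard enough to register risks a disengagement, and holding lightly enough not to fight the wheel looks identical to no hands at all.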

The only other sensor available is the interior camera, which currently exists only on the Model 3 and Model Y. That could potentially solve the problem, but there is some doubt whether the camera is in the right place for that role and/or whether it is sensitive enough for that type of monitoring. In any case, that sensor is not currently active and in use, so unless Tesla activates it for this purpose, it is not available.

This is why I am currently in the camp that says Level 3 autonomous driving is a fool's errand, and it is better to go right to Level 4, where you don't need to count on the driver for anything. That said, I would personally be happy with a Level 3 car and would put up with the inconvenience until it was no longer necessary. I am not convinced, however, that the effort required to get a properly working Level 3 is worth the investment on the way to Level 4 (i.e., Level 3 is not a required step in the progression from Level 2 to Level 4). All the driver monitoring required for "high-level" Level 2 and "basic" Level 3 is a pile of work that is completely unnecessary if you can get a reliable Level 4 vehicle.
 
Here is one example of what I think is a good, safe driver for FSD Beta:
First drive with Tesla Full Self Driving (FSD) Beta code in North Carolina
Zeb Hallock
Oct 25, 2020

That left turn at 12:10... *gulp*
 
Good FSD Beta driver example. Both hands on the wheel (7 & 3 o'clock).

Tesla FSD Beta - 17 minute drive with ZERO disengagements!
Route through Edna Valley, driving entirely on FSD Beta!
FSD handled stop signs, intersections, cyclists, light rain, traffic lights and more!
Software version 2020.40.8.11.
Sofiaan Fraval
Oct 26, 2020
 
What on earth is Tesla thinking, letting random people drive FSD 2.0 beta?!
Dr. Know-it-all Knows it all
Oct 26, 2020


Dr. Know-it-all Knows it all
Hi, actually I have an MS in Artificial Intelligence and my thesis was on neural networks. I absolutely agree that you can't train a corner case by itself: it requires retraining the entire network, which, as you say, is quite unpredictable. You can end up increasing rather than decreasing your network's loss. Still, Tesla has some really smart folks working for it (much smarter than me, I'm sure!), and they seem to be using some sort of bootstrapping method to focus training on these edge cases without (hopefully) mangling the general cases. It's a seriously hard tightrope to walk--I know from experience! It does appear they're managing this balance, though.
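The risk I mean can be shown with a deliberately tiny toy model: fine-tune only on an outlier "corner case" and the loss on the general data blows up. One scalar parameter and plain gradient descent; this is a simplified sketch of the general phenomenon, not anything resembling Tesla's training setup:

```python
# Toy illustration of the corner-case retraining risk: fitting a model to a
# single edge case raises its loss on the general data. Everything here is
# a simplified sketch, not Tesla's actual training pipeline.

general_data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # general trend: y = 2x
edge_case = [(1.0, 10.0)]                             # an outlier "corner case"

def loss(w, data):
    """Mean squared error of the one-parameter model y = w * x."""
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def train(w, data, lr=0.05, steps=200):
    """Plain gradient descent on the MSE loss."""
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

w = train(0.0, general_data)      # fit the general distribution: w -> ~2
before = loss(w, general_data)    # near zero
w = train(w, edge_case)           # now "fix" only the corner case: w -> ~10
after = loss(w, general_data)     # general-case loss has blown up
print(f"general loss before: {before:.4f}, after edge-only tuning: {after:.4f}")
```

In a real network the parameters are shared across millions of cases rather than one, but the same tension applies: pulling the weights toward the edge case drags performance on everything else, which is why the retraining has to be balanced so carefully.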
 
Good FSD Beta driver example. Both hands on the wheel (7 & 3 o'clock).

Tesla FSD Beta - 17 minute drive with ZERO disengagements!
Sofiaan Fraval
Oct 26, 2020

Videos like this just make the situation more dangerous. The more people are convinced that it's safe, the less attention they will pay to it.
 
I think the beta members are, all in all, very responsible, and the beta build itself seems very safe, almost too timid in many situations. I have seen no unexpected failure modes other than a construction-sign reflection in a puddle that was wrongly identified as a cone/sign and surprised the driver with a small maneuver.

If you compare with the famous Uber case, these people are 10 times as alert.