
Hold Steering Wheel every 20-25 seconds?

Defensive driving courses teach that both hands should be on the wheel at all times (apart from brief moments to use indicators, controls etc.)
A sudden event such as a blown tire, pothole, or Autopilot glitch requires a firm grip on the wheel to save the situation.
One hand at the bottom of the wheel = dangerous driving


Yes - now I remember why I stopped posting here. No matter what you say, someone's always ready to bash you for it... Bye again TMC.
 
I could buy 1 to 2 sec but 10 sec is an eternity and not a realistic number. You would have to be in a sound state of sleep to take that long. Count out 1001, 1002, ... to 1010 while driving and you will see that it is way too long a time. I could see people overreacting, i.e. hitting the brakes in a panic on an alert, but that can happen in normal driving without EAP when an unexpected situation arises.

Level 3 autonomy, the first true autonomous stage, must give the user at least 5 seconds for handover of control. AP can instantly disengage and hand over control because it's an L2 semi-autonomous system.

However, the long-range camera can resolve 250m, which is enough for over 5 seconds of look-ahead at most reasonable speeds, and longer than 10 at the speeds most people travel, especially in suburban or urban areas. The AP2.5 radar also resolves to 250m vs. 160m in prior AP.
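
As a rough sanity check on those numbers, here's a quick back-of-the-envelope script (the 250m and 160m figures are just the ones quoted above, not official specs):

```python
# Back-of-the-envelope look-ahead time for a fixed sensor range at various speeds.
# The 250 m camera and 160 m radar figures are just the numbers quoted above,
# not official specs.

SENSOR_RANGE_M = {"camera": 250.0, "AP2.5 radar": 250.0, "older radar": 160.0}

def lookahead_seconds(range_m: float, speed_mph: float) -> float:
    """Seconds until the car covers the full sensor range at a constant speed."""
    speed_mps = speed_mph * 0.44704  # mph -> m/s
    return range_m / speed_mps

for speed_mph in (30, 45, 65, 85):
    row = ", ".join(f"{name}: {lookahead_seconds(rng, speed_mph):4.1f}s"
                    for name, rng in SENSOR_RANGE_M.items())
    print(f"{speed_mph:3d} mph -> {row}")
```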

I think Tesla is finally getting to building a great system thanks to AK. I do think it's going to be rough until that's fully built out and trained. Look at how the wipers still aren't great at night or in tunnels. Apparently they are actively trying to fix those issues as well, but it's a working system that people said was impossible.
 
Wait a minute, the AP1/AP2 radar only has a 160m range? Is that why cars that appear on the “horizon” in the IC display are so close in real life? That “horizon” is only 160m away?

Mid-range radar sensor (MRR)

The vision is capable of 250m. Radar, ideally, should only be used for confirmation. I'm not sure what the primary sensor is in AP2, but it is likely still radar. I am doubtful it's purely radar because my car now consistently recognizes stopped vehicles at speeds in excess of 50mph, which was a big failure point for AP1. I am becoming convinced vision is now providing additional sensor range, which is why Elon is confident full autonomy is possible with vision alone (and therefore AP2 is FSD hardware-enabled).
 
Yes - now I remember why I stopped posting here. No matter what you say, someone's always ready to bash you for it... Bye again TMC.

You are putting your own life and that of other road users at risk by engaging in what is indisputably dangerous driving.
You are also encouraging other Tesla owners to follow your example.
But what would I know.
I'm just a basher.
Right?
 
This has not been an issue for me - I rest one hand on the bottom of the steering wheel and never see an alert. Honestly, after the latest updates, the auto-steer has been fabulous in my AP1 Model S.

Maybe you should read some of the complaints before responding to them. Everybody admits that one hand on the wheel is about the best way to avoid the nags. The entire point is that two hands on the wheel is safer, but the system cannot detect two hands on the wheel (a balanced grip applies almost no net torque, so the sensor sees nothing). I like to keep my hands at 9 and 3, and it nags with my hands in that position, even though it's much safer than the one-hand alternative.
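
For anyone wondering why the torque sensor can't tell the difference, here's a toy illustration (the threshold and torque numbers are invented; Tesla's actual detection logic isn't public):

```python
# Toy model of a torque-only hands-on-wheel check. The threshold and numbers are
# invented for illustration; Tesla's actual detection logic is not public.

HANDS_ON_THRESHOLD_NM = 0.3  # hypothetical minimum net torque that counts as "hands on"

def hands_detected(left_torque_nm: float, right_torque_nm: float) -> bool:
    """The column sensor only sees the net torque, not how many hands are on the rim."""
    return abs(left_torque_nm + right_torque_nm) >= HANDS_ON_THRESHOLD_NM

# One hand draped at the bottom of the wheel: its weight gives a steady torque bias.
print(hands_detected(0.0, -0.5))    # True  -> no nag

# Two hands gripping firmly at 9 and 3 without steering: the torques cancel out.
print(hands_detected(+0.4, -0.4))   # False -> nag, despite the safer grip
```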

I have tried rebooting as some have suggested, but it doesn't help. To make it worse, I have a really hard time seeing the visual nags now -- they just don't register in my peripheral vision usually, so then I get a beep. And after 3 beeps I think autosteer disables for the rest of the drive, even if you respond immediately to the beeps.

If one more person tells me to just drive with one hand or wiggle the wheel gently every 20 seconds I give up on humanity.
 
To make it worse, I have a really hard time seeing the visual nags now -- they just don't register in my peripheral vision usually
That is likely intentional, making it hard for people who are not paying much attention to react to the visual nag in time. The idea may be to force people to pay a lot closer attention, but it is misguided, as we should be paying attention to the road outside and not the IC. This nag change was just bad all around.
 
Level 3 autonomy, the first true autonomous stage, must give the user at least 5 seconds for handover of control. AP can instantly disengage and hand over control because it's an L2 semi-autonomous system.

I still argue that 5s is not always enough for an inattentive driver to become attentive and gain situational awareness. 95% of the time it is but not 100%. Remember that presumably if this happens something unexpected is going on that the car can't deal with itself -- it's some kind of complex situation. And you've had your nose in a book, eyes off the road, maybe you've even fallen asleep because an L3 system doesn't have steering wheel nags. Or maybe your eyes are on the road but you're under "road hypnosis". The point being you have no idea what's happening outside the car. Maybe you've had a heart attack even.

However, the long-range camera can resolve 250m, which is enough for over 5 seconds of look-ahead at most reasonable speeds, and longer than 10 at the speeds most people travel, especially in suburban or urban areas. The AP2.5 radar also resolves to 250m vs. 160m in prior AP.

What if you're cresting a hill or going around a bend? None of the sensors on board can see around corners or over hills. The car will have to react to whatever it sees when it crests the hill or rounds the bend, and it may have less than a second to do it. Not enough time to alert the driver. They can't even see up a hill if you're headed down into a valley, and it doesn't need to be steep for this to completely blind all of the sensors: a very gentle downslope followed by a very gentle upslope will put the road after the valley out of range of both the cameras and the radar. It's just as bad as a crest, in fact, for the car, but not for humans -- we can swivel our cameras.
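
To put rough numbers on the crest case, here's a quick sketch that treats the crest as a circular arc and guesses a camera height of about 1.4m (both assumptions, not measured values):

```python
# Rough sight distance to the road surface over a crest, modeling the crest as a
# circular arc of radius R in the vertical plane. The ~1.4 m camera height and the
# crest radii are guesses for illustration, not measured values.
from math import sqrt

CAMERA_HEIGHT_M = 1.4  # assumed height of the windshield-mounted camera

def sight_distance_m(crest_radius_m: float, eye_height_m: float = CAMERA_HEIGHT_M) -> float:
    """Distance at which the line of sight grazes the crest and the road surface disappears."""
    return sqrt(2.0 * crest_radius_m * eye_height_m)

for radius_m in (2000, 5000, 10000):  # sharper to gentler crests
    print(f"crest radius {radius_m:>5} m -> road visible out to ~{sight_distance_m(radius_m):4.0f} m")
```

Even a pretty gentle crest hides the road surface well inside that 250m camera range.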

L3 systems are hard. I would argue that even if the L3 system sees something it can't handle at the edge of its sensing range (like Tesla's long-range forward camera, where in principle it has 5s for the driver to become attentive and take over), it basically needs to handle that situation by immediately starting to pull over and come to a stop, because once you allow the driver to be inattentive you can't rely on them becoming attentive. The system must be able to come to a safe stop itself. And it might need to start executing that right away. That's not going to be a pleasant experience for the occupants. But also, things can happen much closer than the boundaries of your sensing range; they can happen right in front of you. The system needs to be able to react to that autonomously because there will be no time to alert the driver. This level of performance is possibly within reach for restricted-access highway driving in the near future, but local roads? Forget it. (And Elon has implied that "FSD" will handle local roads, without even a driver in the seat, never mind attentive or not.)
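
Here's a toy sketch of what I mean by handling it autonomously: alert the driver, but start the fallback right away (the states, timings, and deceleration numbers are made up for illustration, not anyone's real logic):

```python
# Hypothetical L3 handover logic: alert the driver, but start the minimal-risk
# maneuver (slowing toward a safe stop) at the same time rather than cruising on
# for the full takeover budget. States, timings, and decelerations are illustrative.
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()
    HANDOVER_WITH_FALLBACK = auto()  # alerting the driver *and* already slowing
    DRIVER_IN_CONTROL = auto()
    STOPPED_SAFE = auto()

TAKEOVER_BUDGET_S = 5.0    # the oft-cited handover window
GENTLE_DECEL_MPS2 = 2.0    # ease off while the driver still has time to respond
FIRM_DECEL_MPS2 = 4.0      # stop with more urgency once the budget has expired

@dataclass
class State:
    mode: Mode
    speed_mps: float
    seconds_since_alert: float = 0.0

def step(s: State, dt: float, event_detected: bool, driver_took_over: bool) -> State:
    if s.mode is Mode.AUTONOMOUS and event_detected:
        # Don't just alert and hope: begin degrading speed immediately.
        return State(Mode.HANDOVER_WITH_FALLBACK, s.speed_mps)
    if s.mode is Mode.HANDOVER_WITH_FALLBACK:
        if driver_took_over:
            return State(Mode.DRIVER_IN_CONTROL, s.speed_mps)
        elapsed = s.seconds_since_alert + dt
        decel = GENTLE_DECEL_MPS2 if elapsed < TAKEOVER_BUDGET_S else FIRM_DECEL_MPS2
        speed = max(0.0, s.speed_mps - decel * dt)
        mode = Mode.STOPPED_SAFE if speed == 0.0 else Mode.HANDOVER_WITH_FALLBACK
        return State(mode, speed, elapsed)
    return s
```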

L3 doesn't make sense. By the time we have an L3 system I would trust with my life, we will have L4 systems.

Now, on the other hand, a very good L2 system that actually enforces in a reasonable way that the driver is attentive (ahem) can save a lot of lives while we wait for L4. I think autopilot is already preventing a lot of accidents, as an L2 system. When people start treating it as an L3 system is when it starts killing people.
 
It doesn't matter how far ahead in distance you can see; it's only a really rough proxy for seeing ahead in time, i.e. what happens in 1 second's time may be different from what the image shows. AP also does nothing with respect to positioning the car on the road to get a better view further down the road.

Until you are allowed to trust the car to deal with all conditions on a given road, you need to be able to react within a fraction of a second.
 
You can easily overcome most of the issues cited with HD maps. The car is already using SLAM. I can't imagine they aren't also working on HD maps. I hate that Tesla is too cheap to just buy HD map data from TomTom and get a head start...but this is not on topic.

Really? You can easily overcome these issues with maps? Issues like another vehicle or pedestrian doing something unexpected in front of you? Why are you not yet a billionaire? (I'm just assuming here that you're not, maybe I'm wrong... if you are a billionaire and you think this is easy I have an AV startup I'd like to pitch to you... PM me.)

What evidence do we have that they're using SLAM to localize on a map in real time -- I mean at production scale, outside of a couple of experiments here and there and a demo video (which is suspected to not actually be real-time, and/or to be using very different compute hardware)?

As soon as they have HD maps and camera-based centimeter-precision localization on those maps in all lighting and weather conditions, and the maps are updated in real time (like millisecond latency) with the positions, velocities, and (reasonably reliably) predicted trajectories of all vehicles, cyclists, pedestrians, animals, construction pylons, emergency vehicles, falling trees, small road debris, etc, then yeah, it's totally easy to just consult the map and figure out what to do. Point conceded.
 
This thread reminds me of back when I was working on trying to get computers to understand spoken or written English (i.e. the meaning, not just the word sounds). People expect computers to do with 100% accuracy what humans cannot do with 100% accuracy. We extract meaning from many different cues and yet, far too often, we misunderstand what someone meant.

The simple truth is that some accidents will occur and the computer will not be able to avoid all of them. The best we can hope for is that the computer does not behave in a way that causes an accident. Things like crossing the median into an oncoming car. Not stopping when the vehicle in front has stopped or there is an obstacle detected. Not steering around an obstacle when there is a clear path. Stopping when it shouldn't and there is a vehicle behind you. Failing to stop for a stop sign or red light. Failing to yield when it should, etc. There are around 37 scenarios identified by the NHTSA as causes of accidents. But accidents will still happen. A pedestrian steps out from between two vehicles and is not visible until there is insufficient time to detect and react. An oncoming vehicle veers from the other lane at the last second and there is insufficient time to react, etc. No autonomous or human system can avoid every accident. A wheel comes off an oncoming vehicle, takes a bounce, comes at you at 80mph, and crashes down into your windshield. You think there is any computer or human system that can avoid that accident?
 
People expect computers to do with 100% accuracy what humans cannot do with 100% accuracy. ... No autonomous or human system can avoid every accident.

On this I agree completely. But L3 is still a very significant step change from L2 which cannot be taken lightly.

And practical, safe L4 on local roads given the hardware Tesla has to work with is a pipe dream. They may get something working, but it will crawl along at 20mph and slam on the brakes constantly at every little provocation, and that only if the weather is nice (but no direct sun in the cameras, so not early morning or evening), roads are dry, and you have been careful to clean all the cameras before starting your drive.

But to get back to the subject of this thread -- the question is, given that EAP is and ever will be an L2 system, and given that L2 systems require driver attentiveness, and given that Tesla has only a crude steering wheel torque sensor to measure driver attentiveness, and given that the long arm of the law and of public opinion is coming crashing down on Elon's playground after some high-profile crashes, what will become of our EAP cars? How will they solve the attentiveness problem in a way that makes Autosteer still worth using? Or will they somehow win the liability and public opinion battle? Or will they buy back all our cars and give us HW3 cars? (har har har)

Maybe I'm just a pessimist, but I fear this is not going to turn out well for those of us that bought into EAP.
 
You can't fix stupid and you can't force people to be responsible. What you can do is protect yourself (here, I mean Tesla can protect itself). The solution would cut both ways. The Model 3 has a camera that can monitor the driver, and it could be retrofitted to the S & X. The camera could run on a 60-second loop where it overwrites the footage every 60 seconds (or longer if it is determined that longer is needed), recording the driver's actions. In the event of an accident, the loop containing the driver's actions before the accident would be locked and could be reviewed. If the driver was asleep, playing with their phone, or in some other way disengaged, it would be evident that the driver was not paying attention to the road. Tesla and the system would be absolved of blame if the driver could have prevented the accident had he/she been attentive. On the other hand, if the video showed the driver was attentive but the car misbehaved in some unpredictable fashion, then Tesla would bear the responsibility for its system's action in causing the accident.

Blocking the camera or making the image unusable (smearing vaseline on the lens, taping or painting over it, etc.) would automatically lock out EAP. The onus is then on the driver to pay attention or risk bearing the liability in an accident while using EAP.
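
That loop idea is simple enough to sketch (the 60-second window and the crash trigger are just the hypothetical parameters from above):

```python
# Sketch of the proposed driver-facing camera loop: keep only the last 60 seconds
# of frames, overwriting the oldest automatically, and freeze the buffer when a
# crash event fires so the pre-crash footage can be reviewed.
from collections import deque

FPS = 30
LOOP_SECONDS = 60  # the window suggested above; could be longer if needed

class LoopRecorder:
    def __init__(self) -> None:
        self.frames = deque(maxlen=FPS * LOOP_SECONDS)  # old frames fall off the back
        self.locked = False

    def add_frame(self, frame: bytes) -> None:
        if not self.locked:          # a locked loop is preserved as evidence
            self.frames.append(frame)

    def on_crash_event(self) -> list:
        """Lock the loop and hand back the last minute of driver footage."""
        self.locked = True
        return list(self.frames)
```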
 
Can you name a computer-controlled situation anywhere on the planet where a single human death is tolerated? The nearest we have is suicide under an automated train. Pharmaceuticals come close too, but even they are significantly different.
 
You are putting your own life and that of other road users at risk by engaging in what is indisputably dangerous driving.
You are also encouraging other Tesla owners to follow your example.
But what would I know.
I'm just a basher.
Right?
A bit overboard? Are you saying that everyone on the road is using 2 hands firmly on the wheel at ALL TIMES? I have no statistics to back up my position but I would say that most people do not drive with both hands on the wheel at ALL TIMES.

And further, the person you are responding to did not respond to any one person. He simply responded to the thread, indicating what he was doing and that AP 2018.21.9 was working better than it did before.

For me personally.... I drive mostly with 1 hand on the wheel when it is safe (which is most of the time for me). If I see a situation that I feel may not be safe I will use two hands at that time and also be more alert. But no way do I drive with two hands on the wheel at ALL TIMES.