Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

FSD rewrite will go out on Oct 20 to limited beta

Why not let FSD do a road test to get a driver's licence like everyone else? If it passes, it is good enough.
That ain't saying much in the US. Also, ironically, FSD would get better with time while humans get worse and don't have ANY follow-up testing. So we humans pass an EASY and minimal test at 16 and are still considered 100% qualified 90 years later.
 
The truth is that we don't really test for safety when we give a person a driver's license. We just test for basic competencies and knowledge of the rules of the road. That makes sense for people since we can't have a person drive like 10M miles to see if they are safe enough. But the result is that we give driver's licenses to a lot of people who should not get one. There are a lot of drivers on the road today that might have basic competencies but are not safe drivers.

If we applied your idea to AVs, we'd get the same problem: AVs on the road that demonstrate basic competencies but are not necessarily safe at all. So we'd get a lot of avoidable accidents with this approach.

And your approach is not necessary for AVs, since we can collect millions of real-world and simulated miles to know statistically whether an AV is safe enough.
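To put rough numbers on the statistical argument (a back-of-the-envelope sketch; the crash-rate figure and the function below are my own assumptions, not from the thread), the "rule of three" gives the miles of failure-free driving needed to bound a crash rate with 95% confidence:

```python
# Rough sketch: how many failure-free miles are needed to claim,
# with 95% confidence, that an AV's crash rate is below a target?
# Uses the "rule of three": zero events observed in N trials gives
# a 95% upper confidence bound on the event rate of about 3/N.

def miles_needed(target_rate_per_mile: float) -> float:
    """Miles with zero crashes needed so the 95% upper confidence
    bound on the crash rate falls below target_rate_per_mile."""
    return 3.0 / target_rate_per_mile

# Assumed human baseline: ~1 police-reported crash per 500,000 miles
# (an illustrative figure only, not an official statistic).
human_rate = 1 / 500_000

print(f"{miles_needed(human_rate):,.0f} mi to match the human rate")
print(f"{miles_needed(human_rate / 2):,.0f} mi to show 2x better")
```

The point being: a road test samples a few miles, but bounding a crash rate below the human baseline requires on the order of millions of miles, which is exactly what fleet data and simulation provide.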

To add an anecdote showing that drivers don't always have even basic competency behind the wheel: just yesterday I walked past a woman and her friend freaking out as she tried to back out of a parking space where someone had parked too close to her passenger side. Somehow, while backing up with her friend's attempted guidance from outside the car, she had gotten the front end within a couple of inches of the car on her right and couldn't for the life of her figure out how to angle the wheels to back up without hitting it. Her friend wasn't doing a good job of explaining, and they were in that panic phase where they just repeated the same thing at each other in increasingly louder tones instead of calming down and explaining it differently and more calmly. I had to stop and tell her to turn her wheels to the right before backing up slowly to get more space, then straighten the wheel to back up straight, then turn the wheel right again to complete the maneuver and shift back into drive once she had cleared the space. Both thanked me profusely, and as I walked back to my car I just marveled that she had gotten a license to operate a heavy, fast-moving projectile in the first place. I made sure to see her exit the parking lot before I left my parking space.
 
This highlights one of the difficulties with self-driving cars. Humans can get out of the car and survey the situation from another vantage point. Self-driving cars need to have enough sensors to see everything around the car.
The current deployed or soon to be deployed self driving systems (Waymo and Cruise) ask remote humans for advice quite frequently.
 

To be clear, it wasn't really necessary for anyone to have been guiding her in the first place. Some basic knowledge of how to back up would have been plenty. I've been in situations where I have welcomed help from people (generally parking attendants at those parking lots in downtown LA where they pack in cars like sardines to maximize how much money they make), but this was nowhere close to that.
 
Theoretically, if we replaced only the worst drivers with AVs, the performance threshold would be lower, but I doubt there is any practical way to do that. Practically, they really need to be a couple of times better than average; that's Tesla's goal.
 
Scary stuff.

Once it gets down to one intervention a month, how much attention do you think the crash test dummies, I mean beta testers, will pay? One of them is going straight into the back of a truck at 70 MPH.

And updates every week? So they get used to it safely navigating a section of road, stop paying attention at that point, and then one day the updated code decides to drive into oncoming traffic. Better hope it's not you coming the other way.
Agree. Complacency. Remember that Uber accident where the safety driver had stopped paying attention.

These British people have some interesting points. The background is a law change on the Isles, where Thatcham says today's tech is only suitable for assisted driving, not automated. The FSD beta is obviously "assisted driving" at the moment, but is that distinction really helpful?
 
Why not let FSD do a road test to get a driver's licence like everyone else? If it passes, it is good enough.

Fascinating idea! An FSD Turing Test. Then would liability be assumed by Tesla? Or by the owner, since the car gets permission to drive from the owner?

I assume Waymo is liable in Phoenix for its self-driving taxis.

So it has to pass the written test, then the vision test, and finally the driving test to get a driver's license. At the moment the written test, different in most states, would be a failure. It doesn't understand school buses or school and hospital speed limits, and doesn't recognize emergency vehicles. I think Tesla FSD today could pass the vision test on signs, speed limits, and construction cones.

Humorously, it doesn't parallel park well, nor does it find a parking spot in a DMV parking lot. But FSD could pass the driving test much of the time today...

So a likely "No" on passing a DMV driving test at the moment...
 
Agree. Complacency. Remember that Uber accident where the safety driver had stopped paying attention.

These British people have some interesting points. The background is a law change on the Isles, where Thatcham says today's tech is only suitable for assisted driving, not automated. The FSD beta is obviously "assisted driving" at the moment, but is that distinction really helpful?

Out of the points raised in the video, responding to stationary traffic and to debris are the two issues I would reinforce. Normally traffic tends to slow down before coming to a standstill, but it is not uncommon to suddenly come up against stopped traffic.

In the US, there are many long-distance routes where I can see automated systems covering many more miles safely than human drivers. Move to more congested roads with more diverse interchanges, 'smart' road signs, a variety of conventions for speed signs, on/off ramp warnings, etc., and the requirements for safe driving over large distances change.

I sincerely hope that the UK Government DOES allow pilot schemes and trials of proven automation technologies, but at the same time I hope they are not railroaded into implementing prematurely.

I regard myself as a relatively cautious driver and often feel as though AP/TACC etc. are not developing nearly enough reasonable caution, like lifting off on the brow of a hill or when approaching complex freeway interchanges.
 
I don't think it requires a map to work. However, as a safeguard to consumers, it's a good idea to only allow activation in a mapped area. Otherwise, what's to stop it from being activated in a random field or off-road? FSD is intended to be a robotaxi in mapped areas. I wouldn't consider this a geofence, however.
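A minimal sketch of the map-area safeguard described above, assuming a hypothetical activation gate that only permits engagement inside a mapped polygon (the function names and coordinates are illustrative, not Tesla's actual implementation):

```python
# Hypothetical sketch: gate feature activation to a mapped service
# area using a ray-casting point-in-polygon test on (lat, lon) pairs.

def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: is (lat, lon) inside the polygon given as
    a list of (lat, lon) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        la1, lo1 = polygon[i]
        la2, lo2 = polygon[(i + 1) % n]
        if (lo1 > lon) != (lo2 > lon):
            # Latitude at which this edge crosses the point's longitude
            t = (lon - lo1) / (lo2 - lo1)
            if lat < la1 + t * (la2 - la1):
                inside = not inside
    return inside

def may_activate(lat, lon, mapped_areas):
    """Allow activation only inside at least one mapped polygon."""
    return any(point_in_polygon(lat, lon, poly) for poly in mapped_areas)

# Toy mapped area (a unit square); coordinates are illustrative only.
area = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]
print(may_activate(0.5, 0.5, [area]))  # True: inside the mapped area
print(may_activate(2.0, 2.0, [area]))  # False: random field, blocked
```

In practice a real system would use proper geodesic geometry and map tiles rather than raw polygons, but the safeguard reduces to exactly this kind of containment check.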

I added my driveway to Open Street Map specifically to check when/if my M3 would start taking advantage of that info. :)

In the current beta we are seeing a lot of driver disengagement. Why? At first glance, you might say "because the car made a mistake". But actually, if you watch, many of the disengagements are "I thought the car was going to make a mistake". This is hardly surprising: it's a beta, and the drivers have been warned to be vigilant (and are obviously anxious not to get into a crash, etc.). So in any dubious situation they quickly shut off FSD. I know I would do the same. But do we know in how many of these cases the car would actually have caused a crash?

This is a really important point. FSD doesn't just have to drive safely, it has to also drive in a way that does not upset the meatbags inside the car, which means driving like a really careful and vigilant human. It will need to do really subtle things, like shifting slightly away from oncoming traffic when it can (which signals to them that it sees them), and splitting the lane lines when driving past parked cars if possible, just in case someone opens a door or a kid runs out into the street.

I have a whole checklist of places in my town where the current AP drives safely but inhumanly (inhumanely?). It will be interesting to see how FSD handles them when we hoi polloi get access to it.
 
It will need to do really subtle things, like shifting slightly away from oncoming traffic when it can (which signals to them that it sees them), and splitting the lane lines when driving past parked cars if possible, just in case someone opens a door or a kid runs out into the street.

The beta has been doing a lot of these subtle things. It still needs a lot of polishing. If we look at it from a standard software development POV, it's about 1-2 years away from being widely deployed. But if they're using "software 2.0", with NNs doing some of the driving policy and path planning, it may be quicker.
 
That was pretty bad. If there was someone coming that would have been a collision.
Obviously that guy's beta privileges should be revoked. Keep your hands on the wheel!
What's odd is that the perception looks fine; I wonder what went wrong?
[Attached screenshot: Screen Shot 2020-11-01 at 7.16.34 PM.png]

Everyone who posts videos with bad camera angles like this guy should also have their beta privileges revoked.
 
That was pretty bad. If there was someone coming that would have been a collision.

Agree it was a significant fail, but if there had been a car in the other lane I rather doubt the car would have steered into it! I'd love to see a map of that intersection because it looked like an unusual edge case with an unmarked lane shift.
 
Sure? The release notes literally say "It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road."
Here's the intersection:
[Attached screenshot: Screen Shot 2020-11-01 at 7.34.03 PM.png]

Google Maps
 
My point was about your assertion that the number is impractical. Do you know what that number is? Or how Tesla triages them?
This is a really important point. FSD doesn't just have to drive safely, it has to also drive in a way that does not upset the meatbags inside the car, which means driving like a really careful and vigilant human. It will need to do really subtle things, like shifting slightly away from oncoming traffic when it can (which signals to them that it sees them), and splitting the lane lines when driving past parked cars if possible, just in case someone opens a door or a kid runs out into the street.

I have a whole checklist of places in my town where the current AP drives safely but inhumanly (inhumanely?). It will be interesting to see how FSD handles them when we hoi polloi get access to it.

Exactly... a self-driving car can easily plot a course that leaves 10 cm clearance past another car and do it safely, but it will freak out the passengers and scare the other car's driver to death. We are in a curious world where we need to make self-driving cars drive WORSE to reassure the human occupants.
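A toy illustration of that trade-off, assuming a hypothetical planner cost function in which a tight but technically collision-free clearance is penalized for comfort (all thresholds and weights below are made-up values, not anyone's real planner):

```python
# Toy sketch: a path with 0.1 m clearance is physically feasible,
# but a comfort cost term makes the planner prefer a wider margin,
# i.e. it deliberately drives "worse" (longer path) to feel better.

SAFETY_MIN = 0.1      # meters: below this, the path risks collision
COMFORT_TARGET = 0.8  # meters: clearance passengers are comfortable with

def path_cost(clearance_m: float, path_length_m: float) -> float:
    """Lower is better. Infeasible below the safety minimum;
    quadratic comfort penalty for clearance below the target."""
    if clearance_m < SAFETY_MIN:
        return float("inf")
    comfort_penalty = max(0.0, COMFORT_TARGET - clearance_m) ** 2
    return path_length_m + 50.0 * comfort_penalty  # weight is a tunable assumption

# The tight squeeze is "safe" but the slightly longer, wider line wins:
print(path_cost(0.12, 20.0) > path_cost(0.9, 21.0))  # True
```

The comfort weight is exactly the knob being debated in the thread: set it to zero and the car shaves clearances like a confident valet; set it high and it behaves like the careful, vigilant human the passengers expect.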
 
What's odd is that the perception looks fine; I wonder what went wrong?

You fail to appreciate that they are building these failures into the beta, which are designed to only occur in safe situations, to help keep the beta testers on their toes. Why else would they have included the "It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road" message, for the beta? They designed it that way. You are not giving Tesla enough credit here.
 
That's actually not the worst idea. That guy definitely failed the test and should have his beta revoked! haha.
 