Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
It’s possible that having no anticipation ability is not mutually exclusive with a safety level 10x that of a human driver. I’m really not sure; it’s actually hard to say, because we don’t know how many accidents are avoided due to this capability. Clearly FSD can calculate trajectories and anticipate in that sense, which is good, but further “next level” anticipation might be harder, and I don’t know what additional level of safety it provides.
So here’s a hypothetical question - there are many laws that are bent or followed loosely in everyday driving. Coming to a full stop before a stop sign is a good example. If Tesla strictly follows the laws (which I assume it must) and gets into an accident not because it did anything wrong, but because it followed a law that no one else follows, how should that be classified?
 
That's great advice, but as has been said often, you need to be cautious when you end AP.

If you drive 100 miles on AP and then exit the highway, you don't want to jam on your brakes as you run up on a slow vehicle on the ramp. Assuming it happens after the 3-second buffer, you will get a bad score for those two events, which will result in a bad score for your whole 100+ mile day. Since your overall rolling 30-day score is based on scores for each day weighted by the miles driven, it will have a very negative impact on your cumulative score. On the other hand, if you really focus on the driving before and after AP, you should easily have a 100 score for a high-mileage day.
In my experience, even if you get a bad score on a particular metric due to a short non-AP component of the drive, the overall daily score will still calibrate based on the total distance driven. On a recent day I had aggressive turning and unsafe following both severely in the red, but the overall daily score (which was nearly all on autopilot) was 98%. That said, I try to keep all my scores green as much as possible! 99% cumulative after several hundred miles on my new Model Y, just waiting for Elon to expand the beta!
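The mileage weighting described above can be sketched roughly as follows. All numbers here are made up, and Tesla's actual Safety Score formula is more involved; this just shows why a bad low-mileage day barely dents a high-mileage average:

```python
# Sketch of a mileage-weighted rolling score, assuming (as described above)
# that each day's score is weighted by that day's miles driven.
# The daily scores below are hypothetical examples.

def rolling_score(days):
    """days: list of (miles_driven, daily_score) tuples for the window."""
    total_miles = sum(miles for miles, _ in days)
    if total_miles == 0:
        return None  # no driving in the window
    return sum(miles * score for miles, score in days) / total_miles

# One bad low-mileage day barely moves a high-mileage average:
window = [(100, 98), (5, 60)]   # 100 highway miles at 98, 5 bad miles at 60
print(round(rolling_score(window), 1))  # → 96.2
```

Note how the 5 bad miles pull the cumulative score down less than two points, consistent with the experience described above.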
 
So here’s a hypothetical question - there are many laws that are bent or followed loosely in everyday driving. Coming to a full stop before a stop sign is a good example. If Tesla strictly follows the laws (which I assume it must) and gets into an accident not because it did anything wrong, but because it followed a law that no one else follows, how should that be classified?
IANAL, but in general, my understanding is that pedestrian safety supersedes ordinary traffic laws. I.e. if the only way to avoid hitting a pedestrian is to break a traffic law, then that's ok. (Assuming the foreseeable consequences of breaking the traffic law are not worse.) Driver or passenger safety can also supersede; e.g. if you're driving a gunshot wound victim to the hospital, speeding is probably ok. There may also be superseding laws that say that if the normal set of traffic laws are technically impossible to obey in a given instance, then it's ok to break them enough to proceed safely. E.g. if you're in a "Minimum speed 45" zone and the car in front of you is doing 40, then you obviously can't be faulted for doing 40 as well. Or if you're on a two-lane road with a double-yellow line, and your lane is completely obstructed by e.g. a dumpster, then it may be ok to carefully cross the double yellow line and pass. (Again, caveat; I am not a traffic lawyer.)

Following traffic laws to the letter is rarely actively unsafe, but it can be terribly annoying. It's weird that people get so upset about FSD being settable to roll slowly through stop signs, but never bat an eye at Autopilot being settable to 15 mph or more above the speed limit. I remember seeing a statistic that for every traffic ticket you receive, you've probably technically broken the law about 3,000 times (based on about 1/3,000th of your driving being actively observed by law enforcement). But in any case, I think we are at least several years away from this issue comprising a significant fraction of FSD mistakes; right now FSD incorrectly breaks the law orders of magnitude more often than it incorrectly observes the law. As L4 approaches reality, it will be interesting to see how this plays out.
 
...But in any case, I think we are at least several years away from this issue comprising a significant fraction of FSD mistakes; right now FSD incorrectly breaks the law orders of magnitude more often than it incorrectly observes the law. As L4 approaches reality, it will be interesting to see how this plays out.
Not to mention that 99% of these decisions are made in civil trials, where a jury will likely apportion liability between Tesla and the driver based on which lawyer makes the most compelling argument, and each case will be decided separately on its own unique fact pattern. We don't have to wait to see how this approach will play out; this is how it's always been done since cars were invented.
 
It’s possible that having no anticipation ability is not mutually exclusive with a safety level 10x that of a human driver. I’m really not sure; it’s actually hard to say, because we don’t know how many accidents are avoided due to this capability. Clearly FSD can calculate trajectories and anticipate in that sense, which is good, but further “next level” anticipation might be harder, and I don’t know what additional level of safety it provides.

I think we can get distracted in this area of the forums by the intelligence aspect of manual and autonomous driving. When you look at human-caused accidents, a lot of them are due to inattention. Even before the days of cell phones, people not looking both directions and pulling out was very common. These kinds of accidents have nothing to do with reaction time, how far we can see with our eyes, how intelligent we are, or how good our intuition is.

Then there are all the testosterone or alcohol influenced mistakes that people make.

The other thing we're sometimes blinded to is current FSD mistakes that would likely NOT lead to an accident. For example, in Rob Maurer's recent video, his car ignores a "do not enter" sign and is effectively traveling the wrong direction on a one-way street. I see humans making this mistake somewhat often downtown, and most people will honk at the offending driver, but it's generally easy to prevent an accident. However, the car suddenly swerving across the double yellow is definitely something far likelier to cause an accident. When FSD makes a very hesitant and jerky turn, we may criticize it as being out of control or unsafe, but IMO this is a low-accident-risk maneuver.

As for cameras not having enough visibility at distance, we humans also have this problem, particularly with poor weather. The universal solution here is to slow down to give yourself more time to react. The car should do the same. If the current hardware suite has a resolution limitation, then to remain safe, it should have a max speed limitation. This is the same rationale for nerfing vision-only AP max speeds. If the car can't see as far as it used to with radar fusion, then it should go slower. This isn't a blocker for autonomy; just maybe not as desirable max speeds.
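As a rough back-of-the-envelope illustration of that "see less, go slower" point (all numbers here are my own assumptions, not Tesla's), the maximum safe speed for a given sight distance follows from requiring reaction distance plus braking distance to fit inside it:

```python
import math

# Stopping distance = reaction distance + braking distance:
#   d = v * t_react + v**2 / (2 * a)
# Solving the quadratic for the largest v with stopping distance <= d:
#   v = a * (-t + sqrt(t**2 + 2*d/a))
# t_react and decel values below are illustrative assumptions.

def max_safe_speed(detection_range_m, t_react_s=0.5, decel_mps2=7.0):
    t, a, d = t_react_s, decel_mps2, detection_range_m
    return a * (-t + math.sqrt(t * t + 2 * d / a))  # m/s

for rng in (250, 150, 80):  # hypothetical detection ranges in meters
    v = max_safe_speed(rng)
    print(f"{rng:>3} m visibility -> max ~{v * 2.237:.0f} mph")
    # → roughly 125, 95, and 67 mph respectively
```

The point is just the shape of the relationship: halving the reliable detection range doesn't halve the safe speed, but it does force a substantial cap, which is the same logic behind the vision-only AP speed limits mentioned above.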

Clearly FSD has a lot of room for improvement, but I think our intuition is off when it comes to estimating how accident-prone it is. Discomfort and annoying other drivers can skew that intuition. We might be closer to matching a human's safety level than we think.

I define safety in terms of collisions, as those are what can impact your health and safety. If we can get all the low-hanging-fruit accidents off the road (inattention, testosterone, DUI, etc.), that's going to make a big difference in overall safety.
 
Not sure what topic people are discussing while we wait for the release of 10.12, but I'm guessing I won't have an update for this weekend's road trip...

 
I think we can get distracted in this area of the forums by the intelligence aspect of manual and autonomous driving. When you look at human-caused accidents, a lot of them are due to inattention. ... If we can get all the low-hanging fruit accidents off the road (inattention, testosterone, DUI, etc), that's going to make a big difference in overall safety.
Some people in this forum will be calling you fanboi for sure. 😂
 
Then there are all the testosterone or alcohol influenced mistakes that people make.
It is certainly true that many accidents and fatalities are caused by alcohol, excessive speed, and similar. The number of deaths could be far lower.

However, I was just wondering what the accident and fatality rates would look like if we weren't human, with human capabilities. To me, it seems hard to say how many accidents (and fatalities) are prevented each year by those capabilities. Based on some results from actual autonomous vehicles, it does seem the safety level can be pretty high and rival that capability and safety level, but I wonder how well this will generalize. Maybe it's not necessary to have all the abilities a human has (the loss could potentially be offset by other improvements in sensing and reaction time)? Eventually we'll find out!

In any case it's important to know how much this cleans things up, since if there is any such hidden capability, we'll lose it (potentially - maybe we'll figure out how to emulate it) at the same time as we eliminate the preventable accidents due to alcohol & excessive speed and testosterone, etc.
 
It is certainly true that many accidents and fatalities are caused by alcohol, excessive speed, and similar. The number of deaths could be far lower. ... In any case it's important to know how much this cleans things up, since if there is any such hidden capability, we'll lose it (potentially - maybe we'll figure out how to emulate it) at the same time as we eliminate the preventable accidents due to alcohol & excessive speed and testosterone, etc.
I was thinking about this too. One thing I thought was that computers don't have emotion which may get in the way of driving. They don't get road rage, nor do they get fear or panic when an accident happens. I was recently watching Chuck's latest video regarding his accident, and in that video he praises Tesla's AP for keeping it together after he was hit. The car shook just a little, but maintained its lane perfectly, and he credited it for preventing further problems.
 
I do wonder often about handing driving off to the cars, they have so much to learn, not just the stuff we are concentrating on now with roundabouts and unprotected left turns. I still cannot imagine how the cars will handle police chases and garbage trucks.

Last night I was driving home and got to an intersection where another car was waiting; he wanted me to go, so he flashed his brights. The only problem was that the projector lenses on his car kept me from seeing any change while observing him. Out of my peripheral vision I saw something move, and he had to flash them many times before I was looking where his high beams were projecting and realized what was happening.
 
I do wonder often about handing driving off to the cars, they have so much to learn, not just the stuff we are concentrating on now with roundabouts and unprotected left turns. I still cannot imagine how the cars will handle police chases and garbage trucks.

Last night I was driving home, got to an intersection and another car was there, he wanted me to go so he flashed his brights. Only problem is the projector lenses on his car kept me from seeing any change while observing him. Out of my peripheral vision I saw something move and he had to flash them many times before I was looking where his high beams were projecting to realize what was happening.
It's scary, for sure. But they'll get there someday. We put 15-year-olds behind the wheel, which scares me, and they likely won't handle edge cases well either until they have significant driving experience. But they learn, and so will computers.
 
So here’s a hypothetical question - there are many laws that are bent or followed loosely in everyday driving. Coming to a full stop before a stop sign is a good example. If Tesla strictly follows the laws (which I assume it must) and gets into an accident not because it did anything wrong, but because it followed a law that no one else follows, how should that be classified?

There are plenty of human drivers who follow the law by the book.

They're annoying as all hell, but if an accident happens, it's not their fault for following the law.

What I predict will happen is that autonomous cars will be forced to follow the law, with the intention of using them to get human drivers to behave themselves. In fact, some group is studying the impact autonomous vehicles will have on how closely humans follow rules like the speed limit.

What I've seen when I drive is that people tend to adapt to what they see. I think I'm actually in the minority in that I know exactly what speed I'm going to go on the freeway: it's 10 over, and only less if I need to conserve.
 