Autonomous Car Progress

I think supervised FSD is a terrible idea and is going to fall on its face, so I agree with you on that one.

I don't doubt that some version of FSD (or a competitor's version) will be safer than the average human driver, but I don't think that's a good comparison.

The entire comparison to the average driver should be thrown out.

The reason is that most accidents aren't really accidents at all, so why are we comparing FSD driving against a rate inflated by them?

I'm not a drunk so why should my driving score be reduced because of drunks?
I don't drive while distracted so why should my driving score be reduced by those who do?
I'm not an inexperienced driver so my score shouldn't be driven down by the inexperienced.
I'm not a million years old so my score shouldn't be driven down by those who are.
I don't race on the roads so my score shouldn't be driven down by them.
I don't road rage so my score shouldn't be driven down by those who do.

Once you throw out all the trash drivers the human score is a LOT better.

I also don't want an autonomous system's "score" to be driven down by all the human nutjobs running into it.
If you don't count the human nut jobs then you'd end up deploying systems that decrease road safety. I wouldn't want to ride in a system that's worse than the average human at avoiding nut jobs.
 
Waymo hit a pedestrian in San Francisco yesterday. It was an operator manually driving, according to Waymo. Haven't been following the stats (maybe someone who has can chime in), but this may be the first accident a Waymo vehicle has been involved in where it was at fault?
Waymo Self-Driving Test Vehicle Hits Pedestrian But Waymo Says A Human Was Driving

Someone in the comments also posted about a recent crash of a self-driving bus in Canada that hit a tree.
Self-driving bus in Whitby collides with tree; operator suffers critical injuries
 
That might make me seem like I'm anti-Tesla or anti-FSD, but it's actually because I'm anti-Human. :)

That is, the human is prone to failing and can't be relied on.
Sorry, can't help you there. Not a therapist ;)

I don't drive while distracted so why should my driving score be reduced by those who do?
Then you are not even human ;)
 
If you don't count the human nut jobs then you'd end up deploying systems that decrease road safety. I wouldn't want to ride in a system that's worse than the average human at avoiding nut jobs.

That is just speculation.

ps: I can't figure out how people just accept a mind-numbing 800k deaths while spending so much time on a forum about possible future accidents.
 
That is just speculation.
I suppose it's based on my personal driving experience. I feel that I've avoided many collisions that would have been the other driver's fault. Also looking at Waymo's safety report many of the collisions avoided by the safety driver (but not by the system in simulation) appear to be caused by "nut jobs."
What would hard data for this question look like?
 
Cue the "humans are bad drivers" comments, to which I ask "compared to what?"

We have to compare autonomous driving to human drivers as human drivers are the ones handing over the keys.

But, we need confidence that the autonomous driving is better than us.

Including things like "single-car accident due to high speed or driving while under the influence" in the human data doesn't make any sense to me. Humans aren't really bad drivers, but a small percentage of us are horrible drivers who bring everyone else down.

In Germany and other places with reasonably good drivers, it wouldn't faze me too much to include all the data in the comparison. But in the US, that's just crazy.

Now, long term this won't really matter much. Once autonomous cars have their own data, they'll simply have to improve year over year, or at least that would be the expectation. The goal shouldn't be that autonomous drivers are way safer out of the gate, but that over time they'll be substantially better.
 
Autonomous progress is still quite a challenge, as shown by another report of an autonomous vehicle accident in Whitby, near Toronto, Canada, that landed the backup driver in the hospital in critical condition:

It's an L3+ Olli Autonomous Shuttle. The website says it relies on "Computer Vision and Analytics," but the picture seems to show an electronic (non-mechanically-rotating) LIDAR on the front top and another above the front grille. It's operated by the Whitby Autonomous Vehicle Electric Shuttle Project.

[Photo of the Olli shuttle, from localmotors.com]


It happened in clear weather at 4 PM, and the car is software-limited to 20 km/h (about 12 mph).

[Photos of the crashed shuttle, from Brian Connolly]
 
Including things like "single-car accident due to high speed or driving while under the influence" in the human data doesn't make any sense to me.
So tell me, what is the correct number?

This is what Cruise came up with for comparison - based on Virginia Tech numbers, with rural & highway crashes removed.

[Chart: Cruise's crash-rate comparison derived from the Virginia Tech data]



ps:

Using the data set VTTI puts out, you can do your own analysis to filter out whatever you want and come up with your own numbers (a rough sketch of that kind of filtering follows the excerpt below). Note there is no driver impairment category called "nutcase" ;)



Data Dictionary

VTTI also produced a series of data dictionaries that are located on the InSight website. These dictionaries encompass every variable in the database, including the driver assessment data, time series data, and vehicle-related data. The dictionaries are also "living" documents that will be updated as more variables are added to the SHRP 2 NDS and refinements are made to existing variables. When a change is made from one version of the dictionary to the next, the information will be highlighted on a revision page that notes what was changed and how it was changed.

Researchers must review these dictionaries before doing any analysis to make sure the operational definitions meet the requirements necessary to answer their questions. Based on the name of a variable alone, they could easily misinterpret the data. For example, the SHRP 2 database contains a driver impairment variable, which encompasses the following categories:

• None apparent
• Drowsy, sleepy, asleep, fatigued
• Ill, blackout
• Angry
• Other emotional state
• Drugs, medication
• Drugs, alcohol
• Other illicit drugs
• Restricted to wheelchair
• Impaired due to previous injury
• Deaf
• Other
• Unknown

If a researcher wanted to conduct an analysis on driving under the influence of alcohol or drugs only, some of these classifications should be excluded. Without reviewing the data dictionaries, a researcher may incorrectly assume certain information is included or excluded.
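
To make the "filter out whatever you want" point concrete, here's a minimal pandas sketch of that kind of filtering. It's only an illustration: the file name and the driver_impairment / event_severity column names are hypothetical placeholders, not the actual SHRP 2 variable names, which is exactly why VTTI tells you to read the data dictionaries first.

```python
import pandas as pd

# Impairment categories to drop, taken from the dictionary excerpt above.
EXCLUDED_IMPAIRMENTS = {
    "Drowsy, sleepy, asleep, fatigued",
    "Drugs, medication",
    "Drugs, alcohol",
    "Other illicit drugs",
}

# Hypothetical event-level export; the real SHRP 2 data lives behind InSight.
events = pd.read_csv("shrp2_events.csv")

# Keep only events where the driver impairment is not in the excluded set.
baseline = events[~events["driver_impairment"].isin(EXCLUDED_IMPAIRMENTS)]

# Crude before/after comparison of crash counts.
print("All events:     ", len(events),
      "crashes:", (events["event_severity"] == "Crash").sum())
print("Filtered events:", len(baseline),
      "crashes:", (baseline["event_severity"] == "Crash").sum())
```

Whether the filtered number is the "correct" baseline is exactly the argument in this thread; the sketch just shows how easy it is to produce a very different rate depending on which categories you choose to exclude.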
 
Waymo hit a pedestrian in San Francisco yesterday. It was an operator manually driving, according to Waymo. Haven't been following the stats (maybe someone who has can chime in), but this may be the first accident a Waymo vehicle has been involved in where it was at fault?
Waymo Self-Driving Test Vehicle Hits Pedestrian But Waymo Says A Human Was Driving
They hit someone on a scooter not long ago, also under manual control according to Waymo.

Those operators can be quite aggressive; a Cruise almost hit me by failing to yield at a crosswalk where I was almost halfway across, I bet also under manual control. A Zoox yielded to me at the same crosswalk another time and I wasn’t even crossing yet :rolleyes:
 
Waymo hit a pedestrian in San Francisco yesterday. It was an operator manually driving, according to Waymo. Haven't been following the stats (maybe someone who has can chime in), but this may be the first accident a Waymo vehicle has been involved in where it was at fault?
Waymo Self-Driving Test Vehicle Hits Pedestrian But Waymo Says A Human Was Driving
Besides the scooter incident in June, Waymo pulled into a bus's path years ago. They went into the median when a safety driver disengaged autonomous mode in his sleep. They were rear-ended by a cop in Phoenix but judged to be at least partially at fault for stopping suddenly without reason (or maybe the cop was just pissed off lol).

I'm sure there are others - they've probably got close to 50 million miles with a safety driver or in manual mode by now.
 
Waymo has collisions all the time. Just clicking randomly, they all seem to be in conventional mode, though they appear to be the other driver's fault (except for one where the Waymo driver backed into a parked car; they need to use that LIDAR for parking sensors!). I guess they're just driving around gathering data for simulations and haven't actually done much autonomous driving in SF?


The Cruise collisions seem to be mostly in autonomous mode.
 
Besides the scooter incident in June, Waymo pulled into a bus's path years ago. They went into the median when a safety driver disengaged autonomous mode in his sleep. They were rear-ended by a cop in Phoenix but judged to be at least partially at fault for stopping suddenly without reason (or maybe the cop was just pissed off lol).

I'm sure there are others - they've probably got close to 50 million miles with a safety driver or in manual mode by now.
This report has details of every Waymo accident - but only from 2010 to 2015.

 
I suppose it's based on my personal driving experience. I feel that I've avoided many collisions that would have been the other driver's fault. Also looking at Waymo's safety report many of the collisions avoided by the safety driver (but not by the system in simulation) appear to be caused by "nut jobs."
What would hard data for this question look like?



[Two attached charts breaking crash involvement down by driver impairment and driver behavior]


 
[Quoted charts from the post above]

Weird that substance-related impairment is so small when it's involved in 28% of deaths.
Anyway, I'm sure I've avoided collisions with people exhibiting all those impairments and behaviors.
ps: I can't figure out how people just accept a mind-numbing 800k deaths while spending so much time on a forum about possible future accidents.
I just want a system that reduces the amount of death and injury. An AV that can't avoid collisions caused by those behaviors about as well as the average human won't do that. Is there something I'm missing?
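
To spell out the arithmetic behind that worry, here's a back-of-envelope sketch. Every number below is made up purely for illustration; none of it comes from any real crash dataset.

```python
# Hypothetical crashes per million miles an attentive human driver is exposed to.
caused_by_me     = 0.5   # crashes this driver would be at fault for
caused_by_others = 2.0   # crashes initiated by other road users ("nut jobs")

human_avoid_rate = 0.5   # fraction of others' mistakes the average human manages to avoid
av_avoid_rate    = 0.1   # an AV that is worse at defensive driving (hypothetical)

human_total = caused_by_me + caused_by_others * (1 - human_avoid_rate)  # 1.5
av_total    = 0.0          + caused_by_others * (1 - av_avoid_rate)     # 1.8, AV never at fault

print(f"Human: {human_total:.1f} crashes per million miles")
print(f"AV:    {av_total:.1f} crashes per million miles")
```

Under those invented numbers, even an AV that is never at fault ends up in more collisions than the attentive human, which is the point: avoiding other people's mistakes matters as much as not causing crashes yourself.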
 
I'm not a drunk so why should my driving score be reduced because of drunks?
I don't drive while distracted so why should my driving score be reduced by those who do?
I'm not an inexperienced driver so my score shouldn't be driven down by the inexperienced.
I'm not a million years old so my score shouldn't be driven down by those who are.
I don't race on the roads so my score shouldn't be driven down by them.
I don't road rage so my score shouldn't be driven down by those who do.
The bar is lower if we replace these drivers with FSD. Should FSD that is worse than the average human be deployed for the above? Can we add people who drive tired?
 
They hit someone on a scooter not long ago, also under manual control according to Waymo.

Those operators can be quite aggressive; a Cruise almost hit me by failing to yield at a crosswalk where I was almost halfway across, I bet also under manual control. A Zoox yielded to me at the same crosswalk another time and I wasn’t even crossing yet :rolleyes:
The reddit thread has "insiders" who say this may have to do with how some of these companies design their rest-break system, which discourages you from honestly reporting your fatigue levels because you would get suspended without pay. It's an interesting point. Some people (including in the comments in that article) make a big deal about having trained drivers versus Tesla just crowdsourcing it to owners, but are trained drivers who do this as a full-time job really better in terms of safety (taking into account fatigue vs. training)?