Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autonomous Car Progress

You have a lot to say and little makes sense or stands up to any sort of superficial scrutiny.

Waymo is L4 (or whatever) so it has some implied and claimed (no disengagements in xxxx miles) safety stats, right? Even one video/instance out of a few dozen is not a good sign and raises valid concerns.

It is laughable that you believe that. Waymo is literally an L4 autonomous car that can do all the driving tasks. There are dozens, if not hundreds, of capabilities that Waymo has that Tesla lacks. For example, Waymo cars can pull over automatically for emergency vehicles, which Tesla's FSD can't do.

You just choose to see what you want to see.

Yes, I saw the video. Waymo messed up. And the Remote Assistance made the Waymo screw up. But you can't judge all of Waymo based on one video. Even you should know that! You have to look at the total picture. You can't just cherry pick one video. There are other videos of Waymo handling construction zones just fine. And there are lots of videos of Waymo driving just great.
 
Only look at the good and ignore the bad -- Isn't that the very definition of cherry-picking? And you call others fanboys... :)

Dozens of Waymo videos -- there are ~2,000 people using FSD Beta and probably dozens of FSD Beta videos posted weekly.

No, I look at both the good and the bad. I am not ignoring the bad Waymo videos. It's just that they are very rare.

Yes, there are a lot of FSD Beta videos. There are some really good FSD Beta videos and some not so good. I am not ignoring the good FSD Beta videos.

I've watched Waymo vehicles drive around Mountain View for years.
I've had my Tesla for ~11 months, drive nearly daily, and use FSD regularly. That's why I have confidence in Tesla's AP and FSD, even though I doubt I'll use city autosteer except when I'm traveling.

If I could, I would gladly ride in a Waymo instead of using Tesla's FSD. I've had my Model 3 for longer than you. As a driver assist, AP is fine. But I don't have confidence in Tesla's current FSD. I know FSD Beta is a new version. So we will see how FSD Beta performs. Sadly, I don't have FSD Beta yet so I can't comment on how it performs for me.
 
You have a lot to say and little makes sense or stands up to any sort of superficial scrutiny.

Waymo is L4 (or whatever) so it has some implied and claimed (no disengagements in xxxx miles) safety stats, right? Even one video/instance out of a few dozen is not a good sign and raises valid concerns.

L4 has nothing to do with disengagements per mile. But yes, Waymo has provided disengagement data that shows an average of 1 safety disengagement per 30,000 miles over about 600,000 total autonomous miles.

Second, you don't seem to understand statistics and how small samples work. One video of Waymo having a problem out of a few dozen does not mean that Waymo has a poor disengagement rate.
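The small-sample point can be made concrete. With only a few dozen posted rides, one observed failure pins down almost nothing about the underlying failure rate; a Wilson score interval shows how wide the plausible range is. A quick sketch (the 36-video count is purely illustrative, not an actual tally):

```python
import math

def binomial_ci(failures: int, trials: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score interval for a failure rate."""
    p = failures / trials
    denom = 1 + z**2 / trials
    center = (p + z**2 / (2 * trials)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))
    return (max(0.0, center - margin), min(1.0, center + margin))

# 1 problematic ride seen in 36 posted videos (illustrative figures only)
low, high = binomial_ci(1, 36)
print(f"true failure rate plausibly anywhere from {low:.1%} to {high:.1%}")
```

With these numbers the interval runs from roughly 0.5% up to about 14%, i.e. the one bad video is consistent with both a very rare problem and a fairly common one.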
 
L4 has nothing to do with disengagements per mile. But yes, Waymo has provided disengagement data that shows an average of 1 safety disengagement per 30,000 miles over about 600,000 total autonomous miles.

Second, you don't seem to understand statistics and how small samples work. One video of Waymo having a problem out of a few dozen does not mean that Waymo has a poor disengagement rate.

And as Bladerskb said this wouldn't be classified as a safety disengagement, so it wouldn't show up in those reports. And I think just about everyone here would disagree with that classification and think that it should be included. What would the statistics be like if incidents like this were included in the safety disengagement statistics?
 
And as Bladerskb said this wouldn't be classified as a safety disengagement, so it wouldn't show up in those reports. And I think just about everyone here would disagree with that classification and think that it should be included. What would the statistics be like if incidents like this were included in the safety disengagement statistics?

Safety disengagement = accident

So no they shouldn't be included.
Safety disengagement is used to determine how safe the system is, not whether it will get stuck or not, or how often.

A car stopping in front of a concrete barrier because it doesn't know how to proceed and a car ramming into a concrete barrier are two completely different things. Therefore the stats should be kept separate and not mixed together.
 
And as Bladerskb said this wouldn't be classified as a safety disengagement, so it wouldn't show up in those reports. And I think just about everyone here would disagree with that classification and think that it should be included. What would the statistics be like if incidents like this were included in the safety disengagement statistics?

Obviously, the disengagement rate would be less than 1 per 30,000 miles if we included all kinds of non-safety interventions. But the reason we tend to focus on safety disengagements is because those are the ones that matter the most. When trying to gauge the safety of an autonomous car, clearly safety disengagements are a key metric to look at. And there will be some interventions that are so minor that they don't really matter. So there will be a lot of non-safety interventions that don't need to be included.
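The effect of folding non-safety interventions into the headline number is easy to sketch using the figures quoted above (600,000 miles, 1 safety disengagement per 30,000 miles) plus a purely assumed count of "got stuck" events:

```python
# Quoted figures: ~600,000 autonomous miles, 1 safety disengagement per 30,000 miles.
miles = 600_000
safety_disengagements = miles // 30_000          # = 20

# Hypothetical: suppose "stuck, needed remote assistance" events were also counted.
stuck_interventions = 100                        # assumed number, purely illustrative

safety_rate = miles / safety_disengagements      # miles per safety disengagement
blended_rate = miles / (safety_disengagements + stuck_interventions)

print(f"safety-only: 1 per {safety_rate:,.0f} miles")
print(f"blended:     1 per {blended_rate:,.0f} miles")
```

Under that assumption the headline metric would drop from 1 per 30,000 miles to 1 per 5,000 miles, which is exactly why mixing the two event types muddies what the safety number is supposed to measure.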
 
The Chromebook & Chromebit were too damaged to investigate.

That is incorrect; the Chromebook wasn't too damaged to read data.

2.2.1. Chromebook Data Recovery Upon arrival at the Vehicle Recorder Laboratory, an exterior examination revealed the unit had sustained impact damage to the screen. Information was extracted using the manufacturer’s software normally, without difficulty.

2.3.1 Chromebit Data Recovery Upon arrival at the Vehicle Recorder Laboratory, an exterior examination revealed the unit had not sustained any damage

2.3.2. Chromebit Data Description The Chromebit could not be imaged due to charging issues. No data was recovered from the device.

So you can say the Chromebit might have had something on it, but you can't just wave your hands and say the Chromebook did.



It is incorrect to state that he was watching a movie, also incorrect to state he was not watching a movie. NTSB found it to be inconclusive as to whether he was watching a movie.

Occam's razor says he was listening to the Harry Potter audio files that were found, not watching some movie file that wasn't found.

But yes I suppose if you want to qualify your statements that far you could say that last quoted part.

That still doesn't make it right to flatly state that he was watching a movie on a portable DVD player, like the post I replied to earlier today did, since there was no DVD player in the car and no DVDs either.

I'll repeat what I said in 2017



https://dms.ntsb.gov/public/59500-59999/59989/604759.pdf (note: this URL is no longer valid, as the database has been moved; the file is now in the NTSB Docket Management System, under the Electronic Devices Examination Factual Report, item 40 on the list).

The National Transportation Safety Board (NTSB) Vehicle Recorder Division received the following devices:

Device 1: Laptop Computer, Serial Number ECN0CX305107503
Device 2: Chromebook, Serial Number FCNLCX051001518
Device 3: Chromebit, Serial Number 100A-CM2XXNF
Device 4: Micro SD Memory Card, Serial Number n/a


After months (now years) of people saying he was watching a DVD, it should be repeated loud and clear that there was no DVD player or DVD media in the car.

The report found a laptop, a Chromebook, and some SD cards, none of which had a movie on them.
Thanks for the correction about the DVD player. I haven't followed up on the case and was just going off my memory of the initial reporting (which I did follow back then). Regardless of which distraction it was, though, he was obviously driving distracted, and he was an expert user of the system who knew it could disable at any time (not a novice who may never have experienced AP turning off). My point in the original conversation was that knowing about a specific limitation regarding cross traffic is irrelevant, since he obviously was not paying enough attention to even notice there was cross traffic.
 
That is incorrect; the Chromebook wasn't too damaged to read data.

2.2.1. Chromebook Data Recovery Upon arrival at the Vehicle Recorder Laboratory, an exterior examination revealed the unit had sustained impact damage to the screen. Information was extracted using the manufacturer’s software normally, without difficulty.
2.3.1 Chromebit Data Recovery Upon arrival at the Vehicle Recorder Laboratory, an exterior examination revealed the unit had not sustained any damage
2.3.2. Chromebit Data Description The Chromebit could not be imaged due to charging issues. No data was recovered from the device.

So you can say the Chromebit might have had something on it, but you can't just wave your hands and say the Chromebook did.

Ok, so you clearly skipped right over this:
2.2.2 Chromebook Data Description The Chromebook was too damaged for normal data recovery. Further efforts using chip removal of the eMMC data storage would not yield usable data.
 
Ok, so you clearly skipped right over this:
2.2.2 Chromebook Data Description The Chromebook was too damaged for normal data recovery. Further efforts using chip removal of the eMMC data storage would not yield usable data.

How do you reconcile 2.2.1 and 2.2.2?

2.2.1 Chromebook Data Recovery Upon arrival at the Vehicle Recorder Laboratory, an exterior examination revealed the unit had sustained impact damage to the screen. Information was extracted using the manufacturer’s software normally, without difficulty.

That they could extract information "normally, without difficulty" while the device was also "too damaged for normal data recovery" is contradictory.

I suppose you could ask Specialist Jane Foster for a clarification.
 
SAFE just posted a new report entitled "A Regulatory Framework for Autonomous Vehicle Deployment and Safety".

It mentions that several standards are being worked on right now for determining when AVs are safe for deployment. IEEE P2846 is an open working group (meaning that different companies can contribute ideas), and they plan to release a first draft in Q2 of 2021.

Mobileye has contributed their driving policy model, called RSS (Responsibility-Sensitive Safety), which is based on 5 rules:
1) Do not hit the car in front.
2) Do not cut in recklessly.
3) Right of way is given, not taken (if another car violates the rules of the road, the AV should yield to avoid a collision).
4) Be cautious in areas of low visibility.
5) If the AV can avoid a crash without causing another crash, it must.

Mobileye has proposed that AVs should never cause a crash and should reduce the number of crashes caused by other vehicles but they need not avoid every crash. The AV should make "reasonable assumptions" about the "worst case" actions of other drivers even though the human drivers in other vehicles may sometimes make unreasonable decisions that cause collisions with the AV. For example, there could be instances where it is impossible to avoid a crash, like if the AV is boxed in all sides on a crowded highway and another vehicle does something reckless.
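For rule 1, RSS actually gives a closed-form minimum safe following distance (published in the Mobileye RSS paper by Shalev-Shwartz et al.). Below is a sketch of that formula; all the parameter values here (response time, acceleration bounds) are assumptions chosen for illustration, not Mobileye's calibrated numbers:

```python
def rss_min_safe_distance(v_rear: float, v_front: float,
                          rho: float = 1.0,          # response time (s), assumed
                          a_accel_max: float = 3.0,  # max accel during response (m/s^2), assumed
                          a_brake_min: float = 4.0,  # rear car's guaranteed braking (m/s^2), assumed
                          a_brake_max: float = 8.0   # front car's worst-case braking (m/s^2), assumed
                          ) -> float:
    """Minimum longitudinal gap (m) so the rear car can always stop in time,
    per the published RSS formula for rule 1 (do not hit the car in front)."""
    v_after_response = v_rear + rho * a_accel_max
    d = (v_rear * rho
         + 0.5 * a_accel_max * rho**2
         + v_after_response**2 / (2 * a_brake_min)
         - v_front**2 / (2 * a_brake_max))
    return max(d, 0.0)

# Both cars at 25 m/s (~56 mph): how much gap does RSS demand?
gap = rss_min_safe_distance(25.0, 25.0)
print(f"{gap:.1f} m")
```

With these assumed bounds the formula demands roughly an 85 m gap at highway speed; note how the rear car must assume the worst of the car in front (hard braking at a_brake_max) while guaranteeing only a modest braking ability (a_brake_min) for itself.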

I will read the whole report this weekend and give a summary.

You can click here and then on "read the report" to download the full report:

 