Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Will Tesla sue Dan O'Dowd?

  • Total voters
    59
  • Poll closed.
I don't understand the controversy over "Is FSD safe?" Tesla says FSD is not safe by itself, and that there must be an alert driver and that driver must intervene when necessary, even without any warning from the computer.

FSD is at present an unsafe development system that requires constant supervision and occasional intervention. This puts it in the class of all cars, which are unsafe if not operated properly. Operated properly, most cars are relatively safe, but any car can be in an accident due to mechanical failure or driver error.

Why is there any discussion?
You are completely right that FSD(b) is not safe by itself since it's beta software under development and not meant to be left unsupervised. This is precisely why Tesla tries to make sure you're paying attention, holding the wheel, etc.

The issue with DoD is ultimately whether he libeled Tesla by mischaracterizing the system and/or intentionally designing and filming his tests to make the system appear worse than it is.

Interestingly, I looked at the Dawn Project's home page - they state that they "Demand Software that Never Fails and Can’t Be Hacked." Show me any competent software engineer who thinks that's achievable. Going further, the alternative is humans, who are eminently fallible and subject to hacking (or corruption).
 

Yeah! The goal is not perfection. The goal in this case is self-driving software that does not need human intervention beyond setting a destination and has fewer accidents than a human would. And since there's so much variability in human drivers, one could reasonably ask for, say, no more than one-third or one-fifth as many accidents as the average human driver. IMO, one-tenth as many would be a gold standard*. Personally, I don't think we're anywhere close to even human-equivalent. But that's just my opinion. I've become more pessimistic in the last year or so. Until it can safely drive on South Kihei Road, where bicycles and pedestrians share the narrow traffic lanes (one in each direction) with cars, it's not fully driverless.

* Gold standard: What an odd expression, considering that the gold standard turned out to be a really bad idea!
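The "one-fifth" and "one-tenth as many accidents" bars above can be sketched as a quick calculation. This is purely illustrative: the human baseline rate and the AV rate below are made-up numbers, not real statistics.

```python
# Illustrative only: compare a hypothetical self-driving accident rate
# against an assumed human baseline. All figures are invented for the example.

HUMAN_ACCIDENTS_PER_MILLION_MILES = 2.0  # assumed baseline, not a real statistic

def safety_ratio(av_accidents_per_million_miles: float) -> float:
    """Return the AV rate as a fraction of the assumed human rate."""
    return av_accidents_per_million_miles / HUMAN_ACCIDENTS_PER_MILLION_MILES

def meets_bar(av_rate: float, bar: float) -> bool:
    """True if the AV has no more than `bar` times the human accident rate
    (e.g. bar=0.2 is the 'one-fifth as many accidents' threshold)."""
    return safety_ratio(av_rate) <= bar

# A hypothetical AV at 0.3 accidents per million miles clears the
# 'one-fifth' bar (ratio 0.15 <= 0.2)...
print(meets_bar(0.3, 0.2))
# ...but misses the 'one-tenth' gold-standard bar (0.15 > 0.1):
print(meets_bar(0.3, 0.1))
```

The point of the sketch is just that the bar is a ratio against a human baseline, so the argument depends entirely on how that baseline is measured.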
 
If/When FSD is truly autonomous it needs to be judged differently. Where the bar should be is open to debate. Logically, it just needs to be as good as the average human, but we all know that won't suffice. It doesn't matter if it catches 100 cases that humans would miss; as soon as it misses a case a human would have caught, the system will be crucified in the public square, because that's the level of rationality we have right now.
 
FSD is a pain for me. I now need to pay attention to the road as well as pay attention that the software is handling itself properly.
Welcome to beta software. That’s what you signed up for when you requested to be a beta tester, although it sounds like you didn’t realize it. There’s a thread about it here. A lot of people view the beta program as a way to get the software early. You do get it early, but you get the early beta version, with all its quirks and flaws, and part of the agreement is to test it to help find these quirks and flaws and report them to Tesla.

Is it cool and exciting? Yes. Is it frustrating? Yes. Is it stressful at times? Yes. I readily admit there are plenty of times I don’t use FSDb because I just don’t have the patience and I just want to drive the car. If that’s the case for you all of the time (or almost all of the time,) you’d probably be better off opting out of the FSDb program and waiting for the software to be more developed because it will just be a source of aggravation for you.
 
That's not what Elon said

Elon Musk says Tesla's Full Self-Driving tech will have Level 5 autonomy by the end of 2021​


I like how you quoted a headline quoting Elon. And the funny thing is that the article that that headline belongs to also doesn't quote Elon: Elon Musk says Tesla's Full Self-Driving tech will have Level 5 autonomy by the end of 2021

So here's the transcript of the call they're referencing: Tesla (TSLA) Q4 2020 Earnings Call Transcript | The Motley Fool

And here's the quote:

I guess, I'm confident based on my understanding of the technical roadmap and the progress that we're making between each beta iteration. Yes. As I'm saying, it's not remarkable at all for the car to completely drive you from one location to another through a series of complex intersections. It's now about just improving the corner case reliability and getting it to 99.9999% reliable with respect to an accident.

Basically, we need to get it to better than human by a factor of at least 100% or 200%.

From that it seems like Elon's definition of "level 5" is the car being able to drive in all locations and situations, but not necessarily without mistakes or even accidents. That may or may not align with the SAE definition, or your personal definition, but only people who take his quotes out of context misunderstand what he means.
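As a quick sanity check on the "99.9999% reliable" figure in that quote: six nines works out to about one failure per million events. The call doesn't define the unit, so "per event" below is an assumption for illustration.

```python
def expected_failures(reliability: float, events: int) -> float:
    """Expected number of failures across `events` independent events,
    given a per-event success probability `reliability` (an assumed unit --
    the earnings call doesn't specify per-mile, per-trip, etc.)."""
    return (1.0 - reliability) * events

# Six nines of reliability: roughly one expected failure per million events.
print(round(expected_failures(0.999999, 1_000_000), 3))
```

Whether a million miles, trips, or interventions is the right denominator is exactly the kind of ambiguity the surrounding debate is about.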
 
In all fairness, many of Elon's tweets are less than crystal clear. Of course if you take a less than clear tweet out of context then you get to make it mean whatever you want. Which is pretty much what @2101Guy does anyway.
 
And per Elon, that's GOING to happen by 12/31/2022. Correct?

*is this the part where people start making excuses for Elon..again?

Elon has made so many promises that were not fulfilled, never admitting that he was wrong, but just revising his promises, that I think it's time to acknowledge that he's either a serial liar, or that he's incompetent and unable to distinguish between a wish and a fact.

Tesla's cars are the best in the world. Elon gets a lot of credit for heading the company that has accomplished that. But he also must take the blame for making promises that he cannot keep.

... it seems like Elon's definition of "level 5" is the car able to drive in all locations and situations, but not necessarily that it can do so without mistakes or even accidents. May or may not align with the SAE definition, or your personal definition, but only people who take his quotes out of context don't understand what he means.

Musk does not get to re-define "Level 5." The five levels of autonomous driving are defined by the SAE, and the public understands that Level 5 means driverless. Musk has promised a car that can drive itself across the country, that can drop the kids off at soccer practice and return home, that can be used as a robo-taxi. And Tesla uses the phrase "Your car," not merely "Our cars," which means that this would happen during the expected life of the cars being sold when he made those statements.

A Level 2 car, which is what all Teslas equipped with AP, EAP, or "FSD" are today, cannot do any of that. And frankly, his latest timeline predictions are as ridiculous as all the missed ones up to now.

Elon Musk is not all one thing or all another. He helped to build, and he is leading, a great company. But he lies and slanders and makes some inexcusable decisions (like speculating on Bitcoin with stockholder money!)
 
Musk does not get to re-define "Level 5."

I don't think he's redefining it, but he's definitely underestimating what it involves. When he says "level 5," he often prefaces it with "feature complete." Or in other words, the car has all of the features it needs to be level 5 in place, they just don't work flawlessly.

I don't think the SAE definition requires level 5 to never disengage or never cause accidents. I know the SAE says level 5 must "drive everywhere in all conditions," and that in the event of a failure, the vehicle must achieve a "minimal risk condition," but to a lot of other autonomous vehicle companies, the "minimal risk condition" seems to include stopping dead in the middle of the road and putting the hazard lights on.

In theory, Tesla could say FSDb is level 5 tomorrow and let the driver sit in the back seat. It would disengage and cause accidents with relative frequency, but it could still technically be level 5.
 
Pioneers & Leaders have always redefined existing concepts and definitions. This will be no different.
 
 

Level 5 does allow the car to find a safe place to park and then stop. It must not stop in a place where it will be a hazard. There are conditions where a human driver would not continue to drive, and a Level 5 car may be the same. But I'm pretty sure that Level 5 must at least be comparable to the safety of the average human driver. A car capable of driving itself that hits the first thing that gets in its way would not be level 5. o_O

A car that can mostly drive itself but occasionally needs to call on a human for help is not Level 5. It's Level 3 or 4. Level 3 if the driver must be in the driver's seat, ready to take over on short notice, and Level 4 if the car can pull over on its own and wake the driver to take over.

Tesla would be lying if they declared FSDb to be Level 5 tomorrow, based on the fact that all modules are in place. As long as they require a human in the driver's seat it's Level 2 if the driver is responsible for deciding when to take over, or Level 3 if the car can give the driver reasonable advance notice to take over.

Tesla very clearly promised Level 5 to FSD buyers at the time I bought my car. At present it's claiming that Level 2 would fulfill its promises.
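The distinctions drawn above (who must be ready to take over, and how much notice they get) can be paraphrased as a toy decision rule. This is a loose sketch of this post's reasoning only, not the official SAE J3016 definitions, and all the parameter names are invented for illustration.

```python
def sae_level_sketch(needs_human_in_seat: bool,
                     car_gives_advance_notice: bool,
                     car_can_pull_over_itself: bool,
                     drives_everywhere: bool) -> int:
    """Toy classifier paraphrasing the post's reasoning about SAE levels.
    Not the official J3016 text."""
    if not needs_human_in_seat:
        # No human needed at all: Level 5 if it handles all roads and
        # conditions, otherwise Level 4 (limited operating domain).
        return 5 if drives_everywhere else 4
    if car_can_pull_over_itself:
        # Car reaches a safe state on its own, then wakes the driver.
        return 4
    if car_gives_advance_notice:
        # Driver may disengage but must take over on request.
        return 3
    # Driver supervises continuously and decides when to intervene.
    return 2
```

On this sketch, a car whose driver must stay in the seat and decide when to intervene comes out at Level 2, which is the post's point about where FSDb sits today.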

Pioneers & Leaders have always redefined existing concepts and definitions. This will be no different.

This is very different, because Musk is promising something that is widely understood to mean one thing, and trying to say he's fulfilled the promise with something else altogether. Also, pioneers are people who go beyond what others have done. Musk has backtracked from what he promised but refuses to acknowledge that fact. He was a pioneer when he sold cars with AP and EAP to customers. He promised to pioneer driverless cars. Instead he's claiming the car is "driverless" if the human doesn't have to intervene very often. But the whole point of "driverless" is that you don't need to be in the car. Tesla is no longer the leader in the development of automotive autonomy, but he's still selling a pig in a poke. Redefining widely-understood terms in order to run false advertisements is not being a leader or a pioneer.

Apple "redefined" the cell phone by creating the iPhone. Not by selling old-fashioned phones and attaching the "smart" label to the name.

And the sad thing is that he would not have to make false promises to sell these cars. The cars sell themselves. He could honestly say that Tesla is trying to develop driverless cars, without promising each buyer that their car will be capable of driverless operation.
 
Win or lose, a constant stream of news from the lawsuit is not beneficial to Tesla: debate on whether your product kills kids is simply toxic.

Yeah they won’t bother with one, plus there is nothing to sue about.
They say there’s no such thing as bad publicity but I’m not so sure in this case.

If DoD deliberately skewed or biased the results of his test then Tesla may well have grounds to sue for libel. Whether it’s worth it or not is another question.

I looked a bit at the Dawn Project’s web site and concluded DoD is a delusional wacko. (With a lot of money)
 
DoD deliberately skewed or biased the results of his test
He makes it very clear that the dummy was hit three times, but nowhere does he claim they ran just three trials - their language very deliberately leaves the number of runs unspecified.

It seems very likely they were legitimate tests for all the reasons discussed elsewhere, no need to rehash here.

So no misrepresentation. Just selective testing as the affidavits make clear.