Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Here's What's Missing from Self-Driving Cars: TRUST

stopcrazypp

Well-Known Member
Dec 8, 2007
9,911
4,829
What are you talking about?! This is Tesla we are discussing here; they move at breakneck speed, and process and testing only get in the way. Whenever the new feature they were adding finally builds, they immediately pack it up and ship a new release.
I wish I was joking. (this is not related to the actual neural net code - that one is done differently).

I would not be surprised if Elon stands behind the back of the poor sap assigned to be in charge of the newest build containing $SHINY, promised on Twitter by Elon two seconds after coming up with it, then communicated to the development team with orders to try and keep up with the self-imposed deadline, but still being way too late (understandably). Once it builds, he then pushes the big red "release" button himself. (Full disclosure: this last paragraph is speculation.)
Any evidence they are releasing nightly builds directly to consumers (especially for feature releases, not the ones without release notes or that only say "enhancement")? It doesn't match any of the publicly available information and leaks about Tesla's development process.

Also, about the early access program: is there at least one known participant in this program on TMC? Has anybody ever met one?
All EAP members are required to sign an NDA, so it is very rare that a participant becomes known.
However, from the 7.0 leak (first pictures leaked mid-August, then a second leak in mid-September; the software was not released until mid-October), there was an EAP participant who was identified:
Autopilot lane keeping still not available over 6 months after delivery

Electrek also got 8.0 details from a beta release on June 30, 2016 (someone in the program leaked it to them); 8.0 was not released until September 22, 2016.
Exclusive on Tesla 8.0 update: new Autopilot features, biggest UI refresh since launch and much more
Are you suggesting the early access program doesn't actually exist?
 

whitex

Well-Known Member
Sep 30, 2015
6,401
7,575
Seattle area, WA
While no official cause of the Joshua Brown accident has been released...
Maybe you missed the NTSB report on the fatal Joshua Brown accident in Florida

Plus, it would be difficult for any other automobile company to have a fatality during level 2 autonomous driving given that none of them have a working, deployed level 2 autonomous system.
EXACTLY! That is my point. They are not ready for public use, hence they are not releasing. And if you think their stuff doesn't work, check out this test drive Nissan did for the press corps (I haven't seen Tesla doing anything close to this in front of any press):

Or, check out what Google cars can do, and yet they are not released to the public.
 
  • Informative
Reactions: MikeBur

SomeJoe7777

Marginally-Known Member
Mar 28, 2015
2,165
5,545
Houston, TX

I did not. In fact, I read the entire docket. That docket is a factual docket. It contains no conclusions or primary/contributing causes, only findings of fact.

EXACTLY! That is my point. They are not ready for public use, hence they are not releasing. And if you think their stuff doesn't work, check out this test drive Nissan did for the press corps (I haven't seen Tesla doing anything close to this in front of any press):

Or, check out what Google cars can do, and yet they are not released to the public.

And Tesla has shown a full FSD video. What's the point? Unreleased products add up to nothing. Every single automaker says they'll have a "Tesla killer" vehicle in 2020. Whoop-de-do -- that only means they're 8 years late to the party. Furthermore, how can you say that Nissan's stuff "works" (since they did it in front of the press), yet also say it's not released because it's not ready for public use? Which is it?

You have an assumption that Nissan, Google, GM, etc. haven't released their products because they deem them not yet safe enough for release. But you have no evidence of that. For that matter, neither do they. With the possible exception of Google, they have far too few miles on their product to draw any conclusions. Far more likely is that they haven't released their products because they're happy for someone else to blaze the trail and brave the liability and regulatory issues. They want to hold back and see how Tesla fares in this product space and then enter it when the risk is a lot lower.

Back to what I said in the earlier post, my point was that you cannot say that Tesla is somehow worse than every other automaker because there was a fatality while on AutoPilot. Every other automaker has zero cars on the road with a level 2 autonomous system. That's a great way to prevent fatalities related to level 2 autonomy.

Do you know what company has NEVER crashed a single airplane? Coca-Cola.
 

Canuck

Well-Known Member
Nov 30, 2013
6,125
5,468
South Surrey, BC
While no official cause of the Joshua Brown accident has been released, there may be a few considerations given before a wanton indictment of autopilot is made. Like the stoned 18-wheeler driver, for instance.

Plus, it would be difficult for any other automobile company to have a fatality during level 2 autonomous driving given that none of them have a working, deployed level 2 autonomous system.

I did not. In fact, I read the entire docket. That docket is a factual docket. It contains no conclusions or primary/contributing causes, only findings of fact.

I've read a lot of it because I find it interesting. I agree that the truck driver has culpability, but we know there are often contributing causes, and we get one of those contributing causes from the facts: AP can kill you if you're not paying attention. There's no question that the vehicle drove right under the truck while on AP.

I don't want to put blame on a dead person, and neither does the report, but for some reason or other Mr. Brown was not paying the proper attention required of AP. I'm surprised that when they interviewed his family they didn't ask them very many questions, and that it was conducted by phone. But someone who knew him (or at least said she did, and I believe her) posted here (then regretted it, so I won't link to it) that he was known to be distracted and doing other things while on AP. In fact, Mr. Brown posted a video of AP reacting for him (which Elon re-tweeted) shortly before his death. He worked in remote areas given the nature of his wifi business, and he drove a lot. He was young and healthy and did not smoke or drink, so a medical condition such as a heart attack is unlikely, but possible.

Having said that, your point is well taken and I agree that "it would be difficult for any other automobile company to have a fatality during level 2 autonomous driving given that none of them have a working, deployed level 2 autonomous system". I would also add that we don't have the numbers on how many people have been saved by AP because we can't see the accidents that would have happened if they were not on AP. We know to this day that seat belts kill people in some accidents where they would otherwise be thrown from the vehicle and live. But we play the odds and wear them. With AP and a human paying attention I believe we are safer, and we will only get safer, and we have to start somewhere. Good on Tesla for making the start but sorry for Mr. Brown -- that part of AP's initial history is truly tragic.
 
  • Like
Reactions: NerdUno and EinSV

whitex

Well-Known Member
Sep 30, 2015
6,401
7,575
Seattle area, WA
With AP and a human paying attention I believe we are safer
How exactly is AP helping a human who must pay full enough attention to take over, with full awareness, in a split second if AP does something wrong (like drive into opposing traffic) or fails to do something (like change lanes to avoid construction)? If this human needs to continuously pay attention, how is AP making things safer? At least without AP, the human only needs to pay attention to what the car should be doing, not also worry about AP doing something it shouldn't.

Having used AP1, by the way, I can tell you that it causes inattention because it does so well most of the time. The problem is that while driving with AP the mind wanders because it has nothing active to do. It is much harder to keep paying attention at the level required to safely take over in a split second than to just drive yourself. I have AP, but limit its use to stop-and-go highway traffic because I don't want to be the next AP statistic.
 
  • Like
Reactions: Swift

Canuck

Well-Known Member
Nov 30, 2013
6,125
5,468
South Surrey, BC
If this human needs to continuously pay attention, how is AP making things safer?

I just had this conversation with someone who has poor eyesight like I have, which is worse at night, or when driving in fog even with good eyesight. He loves AP for that reason alone and feels much safer with it. AP gives you those extra eyes that help out a lot. Plus, people say they are less fatigued when driving with AP assisting, and there are a number of other arguments as to why AP makes us safer, when used as designed.

Having used AP1, by the way, I can tell you that it causes inattention because it does so well most of the time.

Agreed. I had an AP loaner for a week while deciding whether to upgrade, and in addition to what you say above, I also found that it caused me to have "phantom AP": thinking it was on when in fact it was off, because I relied on it so much. But I don't know what the alternative is -- not having AP at all, for everyone? Also, do we avoid or even ban things because there are negative side effects? We certainly don't do that with drugs that kill a lot more people, while also saving lives. I think Tesla's AP needs to come with a drug-like side-effect warning. My concern is that Tesla seems to perpetuate the idea that AP drives for you, starting right with its name. I'd much prefer "driver assist" to AP. And I do know you have to agree to the warnings, etc. before using it. But that's after you bought it, and my concern is how Tesla sells it.
 

verygreen

Curious member
Jan 16, 2017
2,903
11,267
TN
Any evidence they are releasing nightly builds directly to consumers (especially for feature releases, not the ones without release notes or that only say "enhancement")? It doesn't match any of the publicly available information and leaks about Tesla's development process.
I am not sure they are truly nightly builds, but take 17.24.28 as an example: it was built on June 15th, and June 15th, according to ev-fw, is the first day it became available. (The data is a bit wobbly, some sort of timezone issue? I noticed installs move +/- 1 day.)
17.22.46 was built on June 9th and made available around then too.
Also see 17.18.50, where the AP portion crashes very regularly for many people. Do you think that is because it was thoroughly tested before release?
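The build-date argument lends itself to a mechanical check: if the gap between a firmware's build date and its first sighting in the wild is consistently a day or less, the release got essentially no soak time. A minimal sketch, using hypothetical version/date pairs in the style of the ev-fw data rather than the real feed:

```python
from datetime import date

# Hypothetical (build_date, first_seen) pairs in the style of the ev-fw
# data; the real feed would have to be scraped or exported separately.
releases = {
    "17.24.28": (date(2017, 6, 15), date(2017, 6, 15)),
    "17.22.46": (date(2017, 6, 9), date(2017, 6, 10)),
}

def soak_days(build: date, first_seen: date) -> int:
    """Days between compiling a build and customers first installing it."""
    return (first_seen - build).days

for version, (build, seen) in releases.items():
    gap = soak_days(build, seen)
    # Allow +/- 1 day of slop for the timezone wobble noted above.
    label = "nightly-style push" if gap <= 1 else f"{gap} days of soak"
    print(f"{version}: {label}")
```

With more back releases in the table, the same loop would show whether near-zero soak time is the rule or the exception.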
 
  • Like
Reactions: NerdUno

malcolm

Active Member
Nov 12, 2006
3,072
1,729
Before you can get to TRUST of autonomous tech (covered in the article), you first must have TRUST in the company that designed it (my point)

Impressive.

Shopping for groceries must take forever. What with all the due diligence.

Category:Food safety scandals - Wikipedia

Maybe Doug_G should have changed the thread title to include the word "Chipotle" because the arguments would be pretty much the same.

Just substitute "wobbly tummy" for "wobbly autopilot" / "Chipotle tried TO KILL ME!!!" etc etc

Chipotle shares tank after company admits it will need to spend more to woo customers
 
  • Love
Reactions: jeffro01

whitex

Well-Known Member
Sep 30, 2015
6,401
7,575
Seattle area, WA
I just had this conversation with someone who has poor eyesight like I have, which is worse at night, or when driving in fog even with good eyesight. He loves AP for that reason alone and feels much safer with it. AP gives you those extra eyes that help out a lot. Plus, people say they are less fatigued when driving with AP assisting, and there are a number of other arguments as to why AP makes us safer, when used as designed.
According to Tesla, the driver has to be alert enough to correct AP's behavior, or lack thereof, at any time, which means this person is experiencing a false sense of security. Yes, AP may help them, but because they are not able to drive safely themselves, they are no longer able to correct what AP does or fails to do, making the whole situation less safe, not more. It's like saying cruise control helps people who are driving under the influence be safer. Sorry, but I really don't buy your argument on this one.
 

Pruitt

Pontificating the obvious
Jun 27, 2014
503
591
Casper WY
Not at all. I am saying alpha or even beta cars should not be sold to the public. Bad things can happen during the experimental phases of products operated by regular people across the whole spectrum of technical knowledge and driving ability. Testing should be done by qualified test drivers in controlled conditions, not by laymen on public roads. One accident with some shock value and US self-driving becomes severely limited.
OK. I get what you're saying, and to an extent I agree with you. The problem, though, is that driving on public roads is UNcontrolled conditions. Try as hard as the testers might, there's no way they can anticipate and plan for every set of circumstances that may occur. They can probably cover 80-90 percent (and probably more) of the circumstances that will be encountered, but they will never be able to get them all. In a very real sense, self-driving systems will always be in beta.
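The point about never reaching 100 percent coverage can be put in rough numbers. Under a deliberately simplified model, treating a rare road scenario as a Poisson event that occurs once per some average number of miles, the chance that a finite test program ever encounters it falls off sharply with rarity. All figures below are illustrative assumptions, not real data:

```python
import math

def p_seen(test_miles: float, miles_per_event: float) -> float:
    """Probability a Poisson-rare scenario appears at least once in testing."""
    return 1.0 - math.exp(-test_miles / miles_per_event)

# A hypothetical structured test program of 5 million miles, versus
# scenarios of varying (made-up) rarity:
test_miles = 5e6
for miles_per_event in (1e5, 1e6, 1e8):
    print(f"1-in-{miles_per_event:.0e}-mile scenario: "
          f"{p_seen(test_miles, miles_per_event):.1%} chance of showing up")
```

Under these assumptions, a one-in-100-million-mile scenario has only about a 5% chance of ever appearing in a 5-million-mile test program, which is the long tail that keeps such systems "in beta."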
 

Pruitt

Pontificating the obvious
Jun 27, 2014
503
591
Casper WY
According to Tesla, the driver has to be alert enough to correct AP's behavior, or lack thereof, at any time, which means this person is experiencing a false sense of security. Yes, AP may help them, but because they are not able to drive safely themselves, they are no longer able to correct what AP does or fails to do, making the whole situation less safe, not more. It's like saying cruise control helps people who are driving under the influence be safer. Sorry, but I really don't buy your argument on this one.
Unfortunately, the NHTSA study doesn't support your conclusion. It found crash rates were about 40% LOWER after Autosteer was installed. Granted, that was based on HW1. But I don't get the sense that you are restricting your arguments to just HW2. Or am I wrong on that?
 

Pruitt

Pontificating the obvious
Jun 27, 2014
503
591
Casper WY
I guess the question comes down to the right balance.
Tesla's approach is very different from the rest of the industry on what they ship to the end-customer and when.
It is not completely unreasonable to think a risk realized because of Tesla's more risky approach could have negative repercussions.
MobilEye made this case during the Autopilot 1 fatal accident, of course, that Tesla is pushing the tech too far given its ability and maturity.
That said, I don't believe in the 10 year setback. Autonomous is coming. It remains to be seen which approach gets there first.
I believe Audi is a likely leader, Level 3 "read a book" possibly this year.

OMG!!!

I am in total agreement with everything you said here!

NOOOoooo!! KILL ME NOW, PLEASE!!!

:)
 
  • Funny
Reactions: AnxietyRanger

whitex

Well-Known Member
Sep 30, 2015
6,401
7,575
Seattle area, WA
Unfortunately, the NHTSA study doesn't support your conclusion. It found crash rates were about 40% LOWER after Autosteer was installed. Granted, that was based on HW1. But I don't get the sense that you are restricting your arguments to just HW2. Or am I wrong on that?
Those statistics have a significantly insufficient sample size: one more death and the death rate doubles. That's the problem with statistics on a small number of data points. Also, when they say "safer after installing AP," which version? The initial versions didn't do much at all. Bottom line: insufficient data for conclusive safety stats. That said, they did conclude that most people don't read manuals and that the onus is on the manufacturer to ensure the user is using AP correctly, hence the increased nags.
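The sample-size objection can be made concrete. With only one fatality observed, a standard Poisson interval on the underlying rate spans several orders of magnitude, so the data cannot distinguish "much safer" from "much worse." A sketch, using a textbook square-root approximation and an illustrative (not actual) mileage figure:

```python
import math

def poisson_ci(k: int, exposure: float, z: float = 1.96):
    """Approximate 95% CI for an event rate, given k events over `exposure`
    units, via the square-root (variance-stabilizing) transform: sqrt(X)
    is roughly normal with standard deviation 1/2 for a Poisson count."""
    lo = max(0.0, math.sqrt(k) - z / 2) ** 2 / exposure
    hi = (math.sqrt(k) + z / 2) ** 2 / exposure
    return lo, hi

# One fatality over an illustrative 100M Autopilot miles (not a real figure):
low, high = poisson_ci(1, 1e8)
print(f"plausible deaths per 100M miles: {low * 1e8:.4f} to {high * 1e8:.2f}")
# The interval spans roughly four orders of magnitude; a second death
# would double the point estimate, which is exactly the fragility above.
```

Exact Poisson intervals (via the chi-squared distribution) are tighter in the tails, but the conclusion is the same: one event supports almost no rate comparison.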
 

whitex

Well-Known Member
Sep 30, 2015
6,401
7,575
Seattle area, WA
OK. I get what you're saying, and to an extent I agree with you. The problem, though, is that driving on public roads is UNcontrolled conditions. Try as hard as the testers might, there's no way they can anticipate and plan for every set of circumstances that may occur. They can probably cover 80-90 percent (and probably more) of the circumstances that will be encountered, but they will never be able to get them all. In a very real sense, self-driving systems will always be in beta.
There is such a concept as "shadow driving," where the driver keeps driving the car but the AI compares what it would have done, and if the two diverge significantly, the AI should learn (unless of course the driver's action results in an accident). Unfortunately for Tesla, nobody will pay for the sole privilege of providing training data to Tesla, so they have to sell people on something. Elon has a tendency to significantly overhype what's possible in order to sell it, though I think he went a little overboard on FSD. I understand his reasoning, but I am not buying into the hype. I did buy 3 brand-new Model S's to date and possibly will buy more, just no major spending on unbaked or bleeding-edge stuff that is likely to get limited in the future (so no P until it lasts for a few years, and no EAP/FSD until it can do it all: drop me off at work and pick me up after work while I am napping on the way home :)). Of course, if the other guys get there, I have no issues jumping to another manufacturer. Prior to Tesla, Porsche and Lexus were my brands of choice (and I currently would trust either of their promises way more than Tesla's).
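The shadow-driving idea can be sketched as a simple comparison loop: the human's input always drives the car, the planner runs silently, and frames where the two diverge beyond a threshold are logged as training candidates. This is a hypothetical structure for illustration, not Tesla's actual implementation; all names and thresholds here are invented:

```python
from dataclasses import dataclass

@dataclass
class Control:
    steering: float  # normalized -1 (full left) .. 1 (full right)
    throttle: float  # normalized  0 .. 1

def divergence(human: Control, planned: Control) -> float:
    """Crude scalar distance between the human's input and the planner's."""
    return (abs(human.steering - planned.steering)
            + abs(human.throttle - planned.throttle))

def shadow_step(human: Control, planned: Control, log: list,
                threshold: float = 0.3) -> Control:
    """The human's control always drives the car; the planner only watches.
    Frames where the two disagree strongly become training candidates."""
    if divergence(human, planned) > threshold:
        log.append((human, planned))
    return human  # the executed control is always the human's

log = []
shadow_step(Control(0.0, 0.5), Control(0.05, 0.5), log)   # close: not logged
shadow_step(Control(-0.6, 0.2), Control(0.4, 0.2), log)   # diverged: logged
print(len(log))  # prints 1
```

The appeal of the scheme is exactly what the post describes: the fleet generates labeled disagreement data at zero risk, since the planner never touches the controls.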
 
  • Like
Reactions: NerdUno

NerdUno

Member
Dec 18, 2016
652
893
Charleston, SC
Impressive.

Shopping for groceries must take forever. What with all the due diligence.

Category:Food safety scandals - Wikipedia

Maybe Doug_G should have changed the thread title to include the word "Chipotle" because the arguments would be pretty much the same.

Just substitute "wobbly tummy" for "wobbly autopilot" / "Chipotle tried TO KILL ME!!!" etc etc

Chipotle shares tank after company admits it will need to spend more to woo customers

Not sure you're doing Tesla any favors by comparing them to Chipotle. You've obviously missed a few chapters in the Chipotle saga. My point simply was that companies develop a reputation for integrity, or food safety, or whatever. It doesn't mean they never make mistakes. But that reputation carries over into everything they do. GM certainly hasn't been a model company all the time. But at least the current leadership has tried to address serious issues in an honest and forthright way. Still waiting to see some of that from Elon & Co. And I'd want to see it before I took a nap in an FSD Tesla.
 

Buster1

Member
Oct 13, 2016
582
266
Ft Worth
As a professional pilot, this incredible thread makes me think of our aircraft autopilot systems and their inadequacies. Read the second paragraph twice...

"In this respect, Flight 3407 followed a long-established trend. A 1994 N.T.S.B. review of thirty-seven major accidents between 1978 and 1990 that involved airline crews found that in thirty-one cases faulty or inadequate monitoring were partly to blame. Nothing had failed; the crew had just neglected to properly monitor the controls.

The period studied coincided with an era of increased cockpit automation, which was designed to save lives by eliminating the dangers related to human error. The supporting logic was the same in aviation as it was in other fields: humans are highly fallible; systems, much less so. Automation would prevent mistakes caused by inattention, fatigue, and other human shortcomings, and free people to think about big-picture issues and, therefore, make better strategic decisions. Yet, as automation has increased, human error has not gone away: it remains the leading cause of aviation accidents."

The Hazards of Going on Autopilot
 

stopcrazypp

Well-Known Member
Dec 8, 2007
9,911
4,829
I am not sure they are truly nightly builds, but take 17.24.28 as an example: it was built on June 15th, and June 15th, according to ev-fw, is the first day it became available. (The data is a bit wobbly, some sort of timezone issue? I noticed installs move +/- 1 day.)
17.22.46 was built on June 9th and made available around then too.
Also see 17.18.50, where the AP portion crashes very regularly for many people. Do you think that is because it was thoroughly tested before release?
I was thinking of looking at build dates too, but I don't think that is an indicator of the testing process. There is clear evidence that EAP users get software that is a month to 3 months ahead of the general public. It's possible they build it again just before release, but that does not necessarily mean the source code is new. It is possible it is from a branch. It is also possible they make smaller tweaks based on feedback from EAP users.

Look back at the first 7.0 and 8.0 builds (where we know for certain that there was 1-3 months of EAP testing lead time): are the build dates also almost the same as the release dates?
 

verygreen

Curious member
Jan 16, 2017
2,903
11,267
TN
I was thinking of looking at build dates too, but then I don't think that is an indicator of the testing process. There is clear evidence that the EAP users gets software that is a month to 3 months ahead of the general public. It's possible before the release they build it again before releasing, but that does not necessarily mean that the source code is new. It is possible it is from a branch. It is also possible they make smaller tweaks based on feedback from EAP users.

Look back at the first 7.0 and 8.0 builds (where we know for certain that there was 1-3 months of EAP testing lead time), are the build dates also almost the same as release?
It is possible that the build is new but the code is old, though there's now a relative boatload of evidence that it is not so (see 17.24.30, a hotfix for something that would have been found in testing had it not been pushed out with next to no testing, and see the 17.18.50 crashes).
Since I am relatively new to Tesla, I don't have a lot of back releases to look at.
2.9.154, which I happen to have, was built on Jan 8, 2016.
 

About Us

Formed in 2006, Tesla Motors Club (TMC) was the first independent online Tesla community. Today it remains the largest and most dynamic community of Tesla enthusiasts. Learn more.
