I'm thinking the system will be able to take advantage of the SD map metadata to still perform a stop in these situations.
Assuming the current map / navigation data is one of the inputs to end-to-end control, it could potentially learn to pick up on that signal with sufficient training and appropriate encoding/embedding and architecture, but it's not clear if Tesla would specifically bias the training that way. They know map data can be wrong / outdated, and it would probably be worse to suddenly stop just because somebody incorrectly indicated an intersection has a stop sign.

Tesla can use map data to collect examples of people slowing down ahead of blind corners, or more generally to find cases where humans slow down while end-to-end control would want to accelerate, and this could result in the trained network picking up on map-related inputs if those patterns are consistent enough.
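
Purely as an illustration of the "encoding/embedding" point (nothing here reflects Tesla's actual architecture; every module name and dimension below is made up), a coarse map attribute like "stop control ahead" could be embedded and fused with the vision features before the planning head, leaving it to training to decide how much weight that input deserves:

import torch
import torch.nn as nn

class PlannerWithMapHint(nn.Module):
    """Toy planning head fusing vision features with a coarse SD-map flag.

    map_flag: 0 = no control mapped, 1 = stop sign, 2 = traffic signal.
    Sizes and structure are invented for this sketch.
    """

    def __init__(self, vision_dim=256, map_vocab=3, map_dim=16, out_dim=2):
        super().__init__()
        # Learned embedding of the discrete map attribute
        self.map_embed = nn.Embedding(map_vocab, map_dim)
        self.head = nn.Sequential(
            nn.Linear(vision_dim + map_dim, 128),
            nn.ReLU(),
            nn.Linear(128, out_dim),  # e.g. target speed and curvature
        )

    def forward(self, vision_feats, map_flag):
        fused = torch.cat([vision_feats, self.map_embed(map_flag)], dim=-1)
        return self.head(fused)

# If the human-driving data contains enough examples of stopping where the map
# says "stop sign" but the sign is occluded, training can learn to lean on the
# flag; if mapped signs are often wrong, that reliance stays weak.
planner = PlannerWithMapHint()
controls = planner(torch.randn(4, 256), torch.tensor([0, 1, 2, 1]))
print(controls.shape)  # torch.Size([4, 2])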
 
The cameras focus at long distances, so they can't see rain on the windscreen 1 cm away. A human's eyes are about 25 cm back, have variable focus, and look through roughly 30 cm x 30 cm of the windscreen, whereas the cameras look through about 1 cm x 1 cm of it from 1 cm away with fixed focus. With raindrops there is only a slight smudging of the background image, and it's indistinguishable from a film of dirt.
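
To put rough numbers on it (illustrative values only, not Tesla's actual optics: assume focal length f = 6 mm, an f/2 aperture so A = 3 mm, and the camera focused at infinity), a droplet sitting s = 10 mm in front of the lens would come to focus well behind the sensor:

\[
\frac{1}{s'} = \frac{1}{f} - \frac{1}{s} \;\Rightarrow\; s' = 15\ \text{mm},
\qquad
c \approx A\,\frac{s' - f}{s'} = 3\ \text{mm}\times\frac{9}{15} \approx 1.8\ \text{mm},
\]

a blur circle covering a large fraction of a few-millimetre sensor, which is why the drop registers only as a faint smudge rather than a resolved shape.
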
[attached image: widescreen frame from a roof-mounted camera with raindrops visible]

Any thoughts on the top widescreen image? It's taken from the most recent Dirty Tesla video. He uses an Insta360 ONE RS to get that image and I don't believe he has it in any sort of transparent housing. Those look like raindrops pretty clearly to me.
 
Any thoughts on the top widescreen image? It's taken from the most recent Dirty Tesla video. He uses an Insta360 ONE RS to get that image and I don't believe he has it in any sort of transparent housing. Those look like raindrops pretty clearly to me.
How far away is that camera from the windscreen? How far away is the Tesla built-in camera? (I'm guessing O(1 cm)) Can you record simultaneously from those in the above circumstance?
 
How far away is that camera from the windscreen?
It's on the roof. There is no windscreen. FSD YouTubers have been doing that to show their viewers what the cross traffic looks like during their drive.

How far away is the Tesla built-in camera? (I'm guessing O(1 cm))
Between 1 and 7 cm due to the slope of the windscreen. There's plenty of room to see. I'm assuming that you don't own a Tesla.

Can you record simultaneously from those in the above circumstance?
As you're the one saying that they cannot be resolved, it falls to you to demonstrate it. I've given you an example of raindrops visible when they're on the lens.

Here's another

[photo: raindrops on a camera lens]


And another. I see this all the time. It's annoying.

[photo: rear camera view blocked by raindrops]


And here's a paper studying the appearance of focused and unfocused raindrops.


Here's one that addresses the complexity of the problem by throwing a neural network at it.

 
I read a tweet about Ashok's deposition, and he just answers in a way to minimize liability. I don't see anything wrong with what or how he answered with regard to the case at hand. This is probably how he was coached to answer by the lawyers.



Can you explain how the head of your driver automation program saying under oath he doesn't understand what an ODD is, something fundamental to the SAE levels of vehicle automation, minimizes liability?
 
Can you explain how the head of your driver automation program saying under oath he doesn't understand what an ODD is, something fundamental to the SAE levels of vehicle automation, minimizes liability?

Yes because the lawyer was asking him about ODD wrt Tesla's Autosteer. Context matters. If he had said yes, they would go down a deep rabbit hole and expose some detail that would make it seem like Tesla was liable.

I don't know how many congressional hearings you've watched, but "I don't know" and "I don't recall" are par for the course if you don't want any trouble. There's no way someone can prove what you meant when you say "I don't know" or "don't recall" at that moment.
 
It's on the roof. There is no windscreen. FSD YouTubers have been doing that to show their viewers what the cross traffic looks like during their drive.


Between 1 and 7 cm due to the slope of the windscreen. There's plenty of room to see. I'm assuming that you don't own a Tesla.


As you're the one saying that they cannot be resolved, it falls to you to demonstrate it. I've given you an example of raindrops visible when they're on the lens.

Here's another

[photo: raindrops on a camera lens]


And another. I see this all the time. It's annoying.

[photo: rear camera view blocked by raindrops]


And here's a paper studying the appearance of focused and unfocused raindrops.


Here's one that addresses the complexity of the problem by throwing a neural network at it.


Of course they have a system that tries to detect it, and it works some of the time, but it will not do as well, or with as few false detections, as a rain sensor, which is precise for physical reasons.

Also try the above at night or in glare. I've looked at camera output in rain in the dark from the Model 3, and it's much harder to distinguish: raindrops look more like glare from point light sources, which is hard to tell apart with not-so-good cameras.

Generally the neural network works fine in California in the day, where rain is closer to binary (heavy or none). In other conditions it does less well.

Take for instance dirt or night time. A rain sensor uses active IR illumination and measures the change in total internal reflection when a raindrop is present on the surface: with a drop present, light escapes into the droplet and scatters off its back surface instead of reflecting off the glass-air boundary. It is less sensitive to dirt because it doesn't rely on straight attenuation the way a camera does, i.e. dirt doesn't change the internal reflection the way a raindrop does. And it works fine at night, even in deep darkness, because it is actively illuminated. That's very hard for an image-based camera to do.
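
For reference, the underlying physics (standard optics, not anything Tesla-specific): light travelling inside the glass and hitting the outer surface beyond the critical angle is totally internally reflected when the far side is air, but not when a water droplet sits there, because water's higher refractive index raises the critical angle:

\[
\theta_c^{\text{glass-air}} = \arcsin\frac{1.0}{1.5} \approx 42^{\circ},
\qquad
\theta_c^{\text{glass-water}} = \arcsin\frac{1.33}{1.5} \approx 62^{\circ}.
\]

Aim the IR beam at roughly 45 degrees and on dry glass essentially all of it bounces back to the detector; under a droplet much of it escapes into the water, so the drop in reflected intensity is a direct, well-calibrated measure of how much of the sensing patch is wet.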

The neural network gets fooled by dirt, and it doesn't calibrate rain quantity/flux as well as a rain sensor does. This problem has already been solved in the industry, but Elon wants to take out a $10 part. (It was there on early S models.)
 
Yes because the lawyer was asking him about ODD wrt Tesla's Autosteer. Context matters. If he had said yes, they would go down a deep rabbit hole and expose some detail that would make it seem like Tesla was liable.

I don't know how many congressional hearings you've watched, but "I don't know" and "I don't recall" are par for the course if you don't want any trouble. There's no way someone can prove what you meant when you say "I don't know" or "don't recall" at that moment.

"I don't recall," sure, so they can't nail you on whether you discussed bombing civilians or drowning immigrants or whatever on a given day.

But nobody says "I don't know what that is" when they're asked WHAT a bomb or an immigrant is. Especially when it's literally their job to understand the topic.

Again, that doesn't reduce the risk of liability; it would increase it, since it would be the person in charge of the system admitting he lacks the fundamental knowledge to be in charge of such a system.


In fact, the Twitter thread posted earlier has a follow-up at the end pointing out how ridiculous his perjury is here, given that the term ODD is used 22 times in the NTSB incident report about the crash this trial is about.

Including multiple times where it cites Tesla discussing ODD.

That being the thing the guy now in charge of the team (and a high-ranking member of it even when the accident happened) claims he doesn't even know what it is.

He also cites six different times Tesla received notification from regulators regarding suggestions around AP and ODD. I guess the guy in charge read none of them? (Or read them, didn't understand them, and didn't bother to clarify that, which again is even worse.)




Because in that context it would make Tesla liable to lawsuits over older Autopilot, which is why he was being asked anyway.

See above. How could understanding an ODD possibly make them liable when they've been clear, half a dozen ways, from day 1, that the system requires driver supervision 100% of the time and is only intended for use in specific, given-to-the-owner-in-writing situations?

AP has an ODD, and Tesla tells it to you in the owner's manual.

The only thing his testimony does is make him appear incompetent. And given how many times he'd have had to not read (or read, not understand, and actively avoid doing anything TO understand) multiple government reports addressed to Tesla and the team he runs, that's even worse.
 
The only thing his testimony does is make him appear incompetent. And given how many times he'd have had to not read (or read, not understand, and actively avoid doing anything TO understand) multiple government reports addressed to Tesla and the team he runs, that's even worse.

It's obvious that he is not actually that incompetent but is likely following advice from lawyers who know the insider details of this case.
 
It's obvious that he is not actually that incompetent but is likely following advice from lawyers who know the insider details of this case.


Yet nobody can cite what that advice would be based on.

Not to mention an actual lawyer would not instruct you to knowingly perjure yourself. On the contrary, the rules of ethics for lawyers not only require you NOT to do that; if a lawyer realizes a client HAS committed perjury or submitted false evidence, the lawyer is obligated to disclose that to the court.


Which is why "I don't recall if I discussed X" is common on the stand-- Maybe you did or did not discuss it, you just can't recall right now having done so. Nobody's ever going to prove perjury for you doing that, and not even 100% perfect evidence you DID discuss it leaves you (or the lawyer) on the hook legally for having said you don't recall.


While "I don't know what aerodynamic lift is" when you're an aerospace engineer at Boeing is not and no lawyer would ever instruct such a witness to lie like that.

Same deal here. Nobody is going to tell the head of AP to lie about not knowing a fundamental vehicle-autonomy term and risk losing not just the case but his ability to practice law... even if he doesn't care about ethics, he'd care that there's a decent chance actual evidence exists somewhere proving that's a lie, because such a lie is definitive.


So either:

You have an entirely corrupt lawyer, with no regard for his own career, instructing him to lie for reasons nobody can actually explain... and Ashok doing so for reasons nobody can explain... or... he just genuinely did not know, because he's a coding guy, and while his title says "Head of AP" he is not (or at least up through the trial date was not) actually working in a high-level planning capacity regarding any move toward actual autonomy. He was focused on "How can we code the car to make safe unprotected lefts," not "What elements does an autonomous system need in an overarching architectural sense."

Occam's got a razor about this.
 
I read a tweet about Ashok's deposition, and he just answers in a way to minimize liability. I don't see anything wrong with what or how he answered with regard to the case at hand. This is probably how he was coached to answer by the lawyers.


I’m going to play devil’s advocate here. I grabbed the transcript of the deposition and read the part around the ODD question.


Right after the question/answer cited above, we have:

Q. … During your time at Tesla, have you ever
heard anybody within the Tesla Autopilot software
team refer to an operational design domain?
A. I've heard those words before, but I do not
recall much more than that.


To me, that says Ashok was not involved in the discussions about levels, etc. He was a software guy. In fact, a little earlier, he said:

Q. Well, when you've worked at Autopilot -- or
worked at Tesla on Autopilot, have you had in mind
the notion that humans have some sort of lag time in
processing visual information?
A. I am not the person who is studying human --
whatever time you alluded to. I am a software
engineer on the team.


Now, way back at the start of the deposition, he was strongly instructed not to guess:

Q. I don't want you to guess. I don't want you
to speculate. I obviously don't want you to make
things up in response to my questions.
Do you understand that?
A. Yes.
Q. Do you understand that the instruction I
just gave you about not guessing and not
speculating, that applies for the entire deposition?
A. Yes.
Q. Okay. And do you feel like you will be able
to remember that throughout today's deposition, that
you don't need to be reminded not to guess or
speculate; correct?
A. Yes.
Q. If you don't understand one of my questions,
please let me know, and I will either have the court
reporter read it back if I think the question is
clear, or I will reword the question.
I'm going to do my best to ask you questions
that, from my perspective, are understandable and
answerable, but I don't know if you -- you know, for
some reason you might not understand the way I word
a question; so I'm going to rely on you to let me
know. Okay?
A. Yes


So, I’m thinking that because he wasn’t intimately familiar with what an ODD is/was, he answered truthfully when he said he did not know. Further, in his role as a Director of Autopilot Software, he did not need to know the nitty-gritty details. He just had to have people who did know.
 
So, I’m thinking that because he wasn’t intimately familiar with what an ODD is/was, he answered truthfully when he said he did not know. Further, in his role as a Director of Autopilot Software, he did not need to know the nitty-gritty details. He just had to have people who did know.

That's possible, we can't know for sure.

The way I see it is that Ashok is part of the party being sued. I know it's not possible to plead the fifth for your own company, but essentially it was in Ashok's interest to defend his work and his company, unless he has some vendetta against Tesla, which he likely doesn't.

So it makes sense for him to divulge as little information as possible wrt Tesla's internal decisions about Autosteer.
 
Did Tesla end up losing that case?

If not, then Ashok's answer was successful.


...what?

Tesla's attorneys also wore shoes to court; I guess the shoes were successful!


Again, "Ashok legit doesn't know what an ODD is" fits what was actually said and as dtdtdt points out what Ashoks actual job appears to have been..... versus this weirdo conspiracy theory where highly paid lawyers risk losing their careers directing a witness to outright lie about fundamental knowledge (not a recollection- but objective knowledge) for... REASONS...that nobody can clearly articulate beyond "the less he says the better" without explaining how not understanding ODD made any difference to the case given Huang had been using AP within the ODD of autopilot-- that is he was on a divided highway and died while exiting it because he was too busy playing with his phone to pay attention as the system requires.

Tesla didn't win the case because Ashok didn't know a fundamental term of art of autonomous driving... they won because the plaintiffs' lawyers failed to prove Tesla was at fault. One had nothing to do with the other, besides both happening in the same courtroom.
 
Tesla's attorneys also wore shoes to court; I guess the shoes were successful!
I get your point, but it reminds me of a story. Back in the mid-90s, we had a court case against some guys who started a new company and took various forms of significant intellectual property on their way out.

Our CEO hired a Silicon Valley legal team who I guess had a reputation of being crack IP attorneys, but they could have done a better job. One issue was that our real star witness was a technician who had a very damning story about what happened (they offered him a job and instructed him what they wanted him to take out with him).

But when he asked the lawyers what he should wear to his court appearance, they shrugged and said "Just wear what you normally wear to work, that'll be fine."

So he showed up in shorts, sneakers and a striped polo shirt. The elderly judge was extremely annoyed, and I think there's no question it made the testimony go badly; their lawyers were sustained on every objection and ours were overruled on every one.

My advice is, wear nice shoes and dress up a little for court.
 
Tesla didn't win the case because Ashok didn't know a fundamental term of art of autonomous driving... they won because the plaintiffs' lawyers failed to prove Tesla was at fault. One had nothing to do with the other, besides both happening in the same courtroom.

I don't think you can say that Ashok's reluctance to share information didn't help Tesla win the case.

It's easy to tell what answering strategy Ashok decided to take, based on his short and non-informational answers. He answers in a way that divulges as little information as possible; it's easy to conclude that from the transcripts. He never goes into any casual detail about Tesla's internal decision-making, always leaves some ambiguity in his answers, and tries not to steer himself into a trap.

Check out this excerpt where Ashok defiantly answers about AP safety, which came right before the ODD question. This excerpt marked a change in the tone and content of Ashok's answers:

Q. So is it your understanding that the
Autopilot software team never created tickets to
flag instances where Autosteer left the lane out of
concern for safety?

THE WITNESS: I cannot comment on whether it
was created for safety or not.

BY MR. McDEVITT:
Q. Okay. During your time working at Tesla,
have you recognized that if Autosteer controls the
steering of the Tesla in a way that takes the Tesla
out of the lane it's in, that creates a potential
safety issue?

THE WITNESS: It depends on the situation.
That is normal answer for this question.

BY MR. McDEVITT:
Q. Well, you recognize that there are
situations where if Autosteer controls the steering
of the Tesla out of the lane the vehicle's in, that
can result in a crash. True?

THE WITNESS: Again, it depends on the
situation.

BY MR. McDEVITT:
Q. Okay. And I understand it depends on the
situation, but you -- during your time with Tesla,
you've recognized that there are instances where if
Autosteer controls the steering of the Tesla out of
the lane it's in, that can cause a crash; right?

THE WITNESS: My understanding is that if
the driver was paying attention and watching the
road, I do not believe there is any safety concern.

BY MR. McDEVITT:
Q. And has that been your mentality the entire
time that you've worked at Tesla?
A. Yes.

Q. That you have not felt there's a need for a
safety concern upon learning that Autosteer
controlled a Tesla vehicle out of the lane it was in
because you've always assumed that the driver will
always be able to take over; correct?

THE WITNESS: The system is designed to stay
within the limits of steering and braking. Any
attention-paying human should be able to override
the system with ease and then drive the car safely.

BY MR. McDEVITT:
Q. Is there anybody on the Autopilot team that
is a human factors engineer?

A. I do not know.
 