He answers in a way to divulge as little information as possible.
I thought he did a good job of not falling into the trap that the lawyer was trying to set up. "Can Autosteer cause accidents?" Then Ashok cut him off at the knees by saying "If the driver is paying attention, there's no problem". Which is the entirety of the case. I wonder how much time and money was spent in getting those statements on the record.
 
I don't think you can say that Ashok's reluctance to share information didn't help Tesla win the case.

And I don't think you can say it did.

But in this case it wasn't reluctance to share info, it was him claiming factual ignorance of a term of art common in the field he claims to be the head of for the company.

It's not like Tesla had some secret proprietary term they were asking about.


Never does he go into any casual detail about Tesla's internal decision making.

But the question had nothing to do with internal decision making.

On the contrary, the bits you quote have him specifically speaking to his own decision making.


BY MR. McDEVITT:
Q. Okay. And I understand it depends on the
situation, but you -- during your time with Tesla,
you've recognized that there are instances where if
Autosteer controls the steering of the Tesla out of
the lane it's in, that can cause a crash; right?

THE WITNESS: My understanding is that if
the driver was paying attention and watching the
road, I do not believe there is any safety concern.

BY MR. McDEVITT:
Q. And has that been your mentality the entire
time that you've worked at Tesla?
A. Yes.


He (and since you keep painting him as speaking for the company, Tesla) did not believe AP leaving its lane was a safety concern, since the human is required to pay attention at all times.

That's a significant internal decision, with direct bearing on the specific facts of this specific case.

And he answered it.

But somehow you've got a conspiracy theory involving his lawyers illegally coaching him to lie about not knowing what a basic term of art in vehicle automation even is?

Instead of accepting the pretty obvious fact Ashok has a MUCH narrower actual focus and job than his "head of autopilot" title suggests he does?

In fact you cite even more evidence I'm correct right here:


BY MR. McDEVITT:

Q. Is there anybody on the Autopilot team that
is a human factors engineer?

A. I do not know.


The guy who is "head" of the AP team doesn't know who he has working for him, or what tasks they even do.

Why would he lie about something that can be easily debunked with a personnel roster from Tesla?

He wouldn't. As with "what is an ODD" he legitimately does not know the answer because his actual job is much more narrowly focused than his title suggests. He's not literally the head of the team as far as knowing what other parts of the team even do...

He doesn't know what an ODD is because he doesn't run the part of the team that would care. Ditto knowing if the team has human factors engineers: not his job. He isn't a people manager, nor does he think at a high enough level about the overall system that he'd be aware that part of the team exists.

He's the head coder for specific aspects of the system and that's it.


There is, of course, one OTHER explanation.

Tesla has no actual understanding of higher-level automation and doesn't HAVE human factors engineers because they have no plan to ever actually go past L2, where you'd most need any of those things. They're pretty much just knocking down narrow immediate problems as they find 'em, and every time an approach hits a hard limit they rewrite everything a different way and hope it works, again without any broader consideration of things like ODDs or human factors.

That's the maximally cynical way to read his testimony... but I think "His title is misleading" is a much simpler explanation for all of it. Certainly simpler than "Lawyers willing to junk his career and commit crimes, and Ashok willing to deny he knows what an ODD is or who even works on the AP team he supposedly runs."
 
But nobody says "I don't know what that is" when they're asked WHAT a bomb or an immigrant is. Especially when it's literally their job to understand the topic.
One recent instance was an inability to 'define woman.' It probably comes down to intent and a lack of honesty. I wouldn't underestimate legal/courtroom strategies. The Supreme Court is even hearing cases based on single-word usage. Crazy stuff and way off topic.
 
One recent instance was an inability to 'define woman.'

That's more a result of ignorance of the difference between sex and gender, which scientifically aren't the same thing, but people either don't understand that or refuse to educate themselves.

It probably comes down to intent and a lack of honesty.

Oh, sure in the example you give there's certainly people who DO understand they're different but are trolling anyway with ill intent.

Not really sure how any of that applies to understanding what an ODD is though.

I'm unaware of any major political party scoring points off the ignorant by trying to conflate its definition.



The Supreme Court is even hearing cases based on single word usage. Crazy stuff and way off topic.

But again, nothing in THIS case hinged, at all, on what an ODD is.

Ashok simply did not know because his actual job is a lot more narrowly focused than his title suggests. He admits he'd heard the term used by OTHERS at Tesla, so it's not like company-wide they had no idea what it was... just not him, because he's a code guy, not a big-picture total-system-design guy.

(and it's hardly new that the court hears cases that hinge on how one reads a single word... heck part of the Marbury v. Madison case in 1803 that established judicial review hinged on how Marshall read the meaning of a specific semicolon.)
 
There is, of course, one OTHER explanation.

Tesla has no actual understanding of higher-level automation and doesn't HAVE human factors engineers because they have no plan to ever actually go past L2, where you'd most need any of those things. They're pretty much just knocking down narrow immediate problems as they find 'em, and every time an approach hits a hard limit they rewrite everything a different way and hope it works, again without any broader consideration of things like ODDs or human factors.
And possibly a cynical, intentional decision: by hiring the kinds of people who might flag safety problems, they'd expose themselves to liability before those problems are completely solved, which is a long time away if ever. We can see what Elon thinks about safety monitors at Twitter. He only values 'hardcore coders'.
 
It makes one wonder why they would need to collect more Chuck UPL data unless it's for the higher-res cameras.

I bet v12 will have a much longer development path before they start worrying about Chuck's world.
Could also be testing v12 to see how it handles that situation. Theoretically, if the base behavior is good, it wouldn't need to be trained on that specific situation, whereas v11 relied on a bunch more human-written and hand-tuned C++ path-planning code.
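
For anyone not following the distinction being drawn there, here is a minimal, purely hypothetical sketch of the kind of hand-written, hand-tuned go/no-go rule that "human-written and tuned C++ path-planning code" refers to. None of this is Tesla's actual code; every name and threshold is made up for illustration.

// Purely illustrative, hypothetical sketch (not Tesla's code): a hand-tuned
// go/no-go gate for an unprotected left turn (UPL).
#include <iostream>
#include <vector>

struct OncomingVehicle {
    double distance_m;  // distance to the intersection, in meters
    double speed_mps;   // closing speed, in meters per second
};

// Hand-tuned rule: commit to the turn only if every oncoming vehicle leaves a
// time gap larger than a constant a human engineer picked and keeps re-tuning.
bool handTunedUplGo(const std::vector<OncomingVehicle>& traffic,
                    double required_gap_s = 6.0 /* hypothetical constant */) {
    for (const auto& v : traffic) {
        if (v.speed_mps <= 0.0) continue;               // not actually approaching
        double time_gap_s = v.distance_m / v.speed_mps; // seconds until it arrives
        if (time_gap_s < required_gap_s) return false;  // gap too small: keep waiting/creeping
    }
    return true;
}

int main() {
    // Two oncoming cars: one roughly 4.8 seconds out, one 15 seconds out.
    std::vector<OncomingVehicle> traffic = {{120.0, 25.0}, {300.0, 20.0}};
    std::cout << (handTunedUplGo(traffic) ? "go" : "wait") << "\n"; // prints "wait"
    return 0;
}

In the end-to-end framing being contrasted with this, the go/wait decision would come out of a learned model rather than hand-picked constants like required_gap_s, which is why good base behavior could, in theory, generalize to Chuck's turn without location-specific tuning.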
 
Testing Chuck's UPL was one of the steps Tesla took just before v11 went public. Now they're back at it, presumably for v12:

Clicking on that link takes one to an "X" Sign Up For An Account Page.

Is there a way to post a link to something like that without having to join that social network? Especially as (a) X promises to sell your information everywhere, in search of money and (b) there are continual threats to charge for X if one wants to use it for any purpose.

This ain't YouTube.
 
Clicking on that link takes one to an "X" Sign Up For An Account Page.

Is there a way to post a link to something like that without having to join that social network? Especially as (a) X promises to sell your information everywhere, in search of money and (b) there are continual threats to charge for X if one wants to use it for any purpose.

This ain't YouTube.

Here is a screenshot of the tweet.

[screenshot of the tweet]
 
Purely done to train it for that location and fool the less critical thinkers into believing it applies to any UPL anywhere.
Ugh.
*sigh* Because every left turn with a median in the US, of which there are tens of thousands, is unique and must be trained for individually. There is nothing similar about them. Really? 🤦
 
Testing Chuck's UPL...
Wish they would do that here. I can't take the UPL out of my neighborhood half of the time - like when cars are coming.

And maybe v12 will help. The decision to go is not bad, but it just dawdles in acceleration, making a relatively easy turn into something with a much higher pucker factor. It almost always requires accelerator assistance.
 
*sigh* Because every left turn with a median in the US, of which there are tens of thousands, is unique and must be trained for individually. There is nothing similar about them. Really? 🤦
I’ve been enjoying those UPL-video-based YouTube channels, too….
Gotta admire Tesla’s willingness to travel so far from head office to get such a software-challenging intersection….
Ugh.
 
Could also be testing v12 to see how it handles that situation. Theoretically, if the base behavior is good, it wouldn't need to be trained on that specific situation, whereas v11 relied on a bunch more human-written and hand-tuned C++ path-planning code.

Gotta hope that isn't v12. Stop short, indecisive approach to stop sign, and jerky braking/lunging when deciding when/if to proceed. I bet the driver hit the brakes at the end when that semi approached. Looks like v11 Junk to me.


 
Gotta hope that isn't v12. Stop short, indecisive approach to stop sign, and jerky braking/lunging when deciding when/if to proceed. Looks like v11 Junk to me.



This is literally just a 23 second clip of the car braking at a stop sign, and creeping a little bit. During the entire 23 seconds, it doesn't have the right of way due to the turning white SUV. There are no conclusions to be drawn from this video.

You would call it junk regardless of what the video showed.
 
This is literally just a 23 second clip of the car braking at a stop sign, and creeping a little bit. During the entire 23 seconds, it doesn't have the right of way due to the turning white SUV. There are no conclusions to be drawn from this video.

You would call it junk regardless of what the video showed.

If it stops short like junk, indecisively creeps like junk, brakes and lunges like junk, and sticks its front end into traffic like junk, then it's the same JUNK we've had for too long now.

Your white-SUV right-of-way red herring has no impact on FSD's decision to stop short, indecisively creep, and brake/lunge. FSD will do that whether the SUV is there or not.
 
If it stops short like junk, indecisively creeps like junk, brakes and lunges like junk, and sticks its front end into traffic like junk, then it's the same JUNK we've had for too long now.

Your white-SUV right-of-way red herring has no impact on FSD's decision to stop short, indecisively creep, and brake/lunge. FSD will do that whether the SUV is there or not.

What video are you watching? None of what you describe is shown.