Speculate: what the heck happened to Chris Lattner?

What's the issue here? I put 30K-plus miles on per year - I have my streaming stations, which I like - and I have my phone. On my phone I have Audible, Google Play Music, etc. If I want an audiobook I select "phone" as the audio source and bring up a book on my phone - meanwhile Autopilot is keeping me in lane on the freeway. What do you wish the car did that it doesn't?
Why should I use another device when the car has a media player already? I don't want to be distracted by my phone or the media player. I just want it to work. And don't act like streaming doesn't have its problems... It does.
 
  • Like
Reactions: X Fan
I want it to play my music in the highest quality of which the sound system is capable, and that should be no less than CD quality. Currently, high-quality audio formats like FLAC can only be played from a USB-connected storage device. But the car actually does play high-quality music from USB, so that isn't the point. The real issue is that I wish the random shuffle feature actually worked. :eek:
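As an aside, if you're filling a USB stick with FLACs, it's easy to sanity-check that the files really are CD quality (16-bit / 44.1 kHz or better) before copying them over. A minimal Python sketch using the mutagen library; the folder path is just an example:

```python
# Sketch: check that FLAC files are at least CD quality (16-bit / 44.1 kHz)
# before copying them to the USB stick. Requires: pip install mutagen
from pathlib import Path
from mutagen.flac import FLAC

MUSIC_DIR = Path("~/Music/flac").expanduser()  # example location

for path in sorted(MUSIC_DIR.rglob("*.flac")):
    info = FLAC(str(path)).info
    ok = info.sample_rate >= 44100 and info.bits_per_sample >= 16
    print(f"{path.name}: {info.sample_rate} Hz / {info.bits_per_sample}-bit / "
          f"{info.channels} ch -> {'OK' if ok else 'below CD quality'}")
```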

This I did not know, actually. Okay, you've got me excited now about higher audio quality - so can I use my phone as the source via the USB cable it charges with? Are there any streaming services over 4G LTE that deliver CD-quality audio to your phone, which you could then feed into the Tesla's audio system over the USB charging cable?
 
Why should I use another device when the car has a media player already?
For the same reason that all-in-one software suites almost never have the same quality of functionality as individual best-of-breed packages. It's not realistic to expect an automaker's entertainment software to be as good as the latest you can get on a tablet or phone. Phone nav systems and entertainment options have always been superior to what is built into cars - which makes sense. You can pound your head against the wall on an internet forum, or you can start enjoying the high-quality interfaces available for your tablet/phone and just breathe. Then, if what you guys are saying about USB connectivity and higher-quality audio data is true - just feed USB from your phone/tablet and use the phone as the controller. Or keep hoping Tesla gives you what you want. Maybe they will, but since it hasn't affected their sales I wouldn't hold my breath.

On the other hand, Model 3 is coming shortly, so it does seem reasonable that a revamp of the media player is in the works.
 
@supratachophobia - the other thing is that compared to the German, British and Italian cars I migrated from, Tesla's interface and media quality are simply light years ahead. The interfaces on what I'm used to were pure torture, but I bought the cars anyway because of their other virtues. Apple CarPlay would be nice in a Tesla but, for whatever reason, it seems like it ain't coming.
 
  • Informative
Reactions: supratachophobia
Lattner is not a cloud/edge computing guy either. He developed LLVM and Swift, which are a compiler framework and a programming language respectively. He's justifiably famous in his own domain of expertise, but that makes him suited to managing a general software development effort, not a domain-specific one. If they wanted a cloud expert, there are plenty to poach from Amazon AWS, Microsoft Azure or elsewhere.

In the case of computer vision, just throwing people at it is not enough. It's a new, rapidly evolving field where Karpathy is among the pioneers. He's an expert on dense image classification, something Tesla would find valuable in Autopilot. On the other hand, Karpathy is probably not suited to general software engineering.

He didn't need to be a cloud expert or a computer vision expert to lead the software development efforts. My argument was that there was enough room to have both a general software development lead and someone leading up vision computing, with the vision computing lead reporting to the general software development lead.

His compiler and computer languages expertise made him perfect for Tesla's efforts to design their own chips. So he could lead the development efforts for the compilers and tools needed to use those chips.

I'm also of the belief that he did accomplish a lot in the 6 months or less that he was there. Sure, people say that AP2 is behind, but I don't blame him for that. I firmly believe Tesla purposely misled people on AP2, and they were nowhere near where they needed to be on the SW when they released the HW.
 
For the same reason that all-in-one software suites almost never have the same quality of functionality as individual best-of-breed packages. It's not realistic to expect an automaker's entertainment software to be as good as the latest you can get on a tablet or phone. Phone nav systems and entertainment options have always been superior to what is built into cars - which makes sense. You can pound your head against the wall on an internet forum, or you can start enjoying the high-quality interfaces available for your tablet/phone and just breathe. Then, if what you guys are saying about USB connectivity and higher-quality audio data is true - just feed USB from your phone/tablet and use the phone as the controller. Or keep hoping Tesla gives you what you want. Maybe they will, but since it hasn't affected their sales I wouldn't hold my breath.

On the other hand, Model 3 is coming shortly, so it does seem reasonable that a revamp of the media player is in the works.

For now.

However, I suspect that Tesla sees this as a future growth opportunity. Imagine a whole suite of Tesla apps available on your primary screen (a la Apple products), anything from driver-compatible software (voice-activated, Siri-like readouts, etc.) to "regular" apps for future use with FSD (Netflix-like streaming?).

A large driver of Apple's profitability is the closed ecosystem that people buy into. Imagine that in your car, then make that capability portable to your phone via the Tesla app.
 
Reading the CV is very informative though.

The AP2 features that Chris delivered in 6 months could not ALL have been developed, tested and implemented from scratch in that time.

So that suggests that the AP2-AP1 parity work has been an emulation/data migration task, which points to the same DNN being used for both AP1 and AP2 hardware.
 
He didn't need to be a cloud expert or a computer vision expert to lead the software development efforts. My argument was that there was enough room to have both a general software development lead and someone leading up vision computing, with the vision computing lead reporting to the general software development lead.
Well... this is a thread on why he left, not what he could still do there :) Chris never looked like a good fit to begin with, in terms of technical alignment. Plenty of people, including me, wondered "what's the LLVM guy doing running Autopilot?"
S4WRXTTCs said:
His compiler and computer languages expertise made him perfect for Tesla's efforts to design their own chips. So he could lead the development efforts for the compilers and tools needed to use those chips.
Umm no. They have the famous Jim Keller, who's a lot more suited to chip development than a compiler guy. In fact, Keller, along with Karpathy, has been assigned Lattner's responsibilities since Lattner left.

As for whether Tesla intentionally misled people about Autopilot capability: as an engineer who works on ConvNets professionally, that sounds too sensationalist to me unless someone also spells out the precise technical basis for the claim. This is a new field, and what Tesla demands is a tough problem.

When your product is essentially at the bleeding edge of the current state of the art and theoretical knowledge, setting timelines doesn't work out all that well, especially Elon Standard Time ones.
 
When your product is essentially at the bleeding edge of the current state of the art and theoretical knowledge, setting timelines doesn't work out all that well, especially Elon Standard Time ones.
This is an excellent point. Most people have no appreciation of the size of the problem. Even the techniques (i.e. the deep learning methods used) are developing and changing rapidly. The vision contests show rapid improvement in accuracy, but applying the techniques to real-world problems is another layer of difficulty. This explains why having patience regarding results is a good quality to have :cool:
 
  • Like
Reactions: neroden
@supratachophobia - the other thing is that compared to the German, British and Italian cars I migrated from, Tesla's interface and media quality are simply light years ahead. The interfaces on what I'm used to were pure torture, but I bought the cars anyway because of their other virtues. Apple CarPlay would be nice in a Tesla but, for whatever reason, it seems like it ain't coming.

I refer you to the Empeg. *THE* best media player interface of all time. Done by a handful of developers 15 years ago on far less hardware than is available in the Tesla. It is the benchmark against which I measure all players, including apps on the phone. They all fall short from an ease-of-use, minimal-interaction standpoint. If I could install one of my players in the Tesla, I would have done so 3 years ago.
 
  • Love
Reactions: neroden
For the same reason that all-in-one software suites almost never have the same quality of functionality as individual best-of-breed packages. It's not realistic to expect an automaker's entertainment software to be as good as the latest you can get on a tablet or phone. Phone nav systems and entertainment options have always been superior to what is built into cars - which makes sense. You can pound your head against the wall on an internet forum, or you can start enjoying the high-quality interfaces available for your tablet/phone and just breathe. Then, if what you guys are saying about USB connectivity and higher-quality audio data is true - just feed USB from your phone/tablet and use the phone as the controller. Or keep hoping Tesla gives you what you want. Maybe they will, but since it hasn't affected their sales I wouldn't hold my breath.

On the other hand, Model 3 is coming shortly, so it does seem reasonable that a revamp of the media player is in the works.

I still respectfully disagree: we are not asking Tesla to create a media player with outrageous, off-the-wall features. We just want something that works, bug-free. And then maybe we can ask for hot-linked tag information. Imagine if you were listening to an audio source and you could tap the album name and hear another song from that album. Or tap the artist name on the Now Playing screen, and another song from that artist would come up. I find that with satellite radio, while you can "like" a song, it is still up to the algorithm to pick the songs, and more often than not I just find myself clicking next, next, next, trying to find something I like. And then there is the problem of content with kids in the car: Chris Rock talking about race is not in the same genre as Weird Al...
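To be concrete about what hot-linked tags would involve, here's a rough sketch in Python: tapping the album or artist on the Now Playing screen is really just a metadata lookup into the local library. The track list and field names below are invented purely for illustration.

```python
# Sketch: "tap the album/artist on Now Playing" as a metadata lookup.
# The library contents and field names here are made up for illustration.
library = [
    {"artist": "Weird Al", "album": "Mandatory Fun", "title": "Word Crimes"},
    {"artist": "Weird Al", "album": "Mandatory Fun", "title": "Foil"},
    {"artist": "Weird Al", "album": "Bad Hair Day",  "title": "Amish Paradise"},
]

def tracks_by(field, value, exclude_title=None):
    """Return the other tracks that share the tapped tag (album or artist)."""
    return [t for t in library
            if t[field] == value and t["title"] != exclude_title]

now_playing = library[0]
print(tracks_by("album", now_playing["album"], now_playing["title"]))
print(tracks_by("artist", now_playing["artist"], now_playing["title"]))
```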
 
This is an excellent point. Most people have no appreciation of the size of the problem. Even the techniques (i.e. the deep learning methods used) are developing and changing rapidly. The vision contests show rapid improvement in accuracy, but applying the techniques to real-world problems is another layer of difficulty. This explains why having patience regarding results is a good quality to have :cool:
Not just detailed image classification, but in essentially real time. Elon could get all his Autopilot promises worked out right away, provided we all agreed to drive at no more than 0.01 mph. :) ILSVRC training takes days or weeks. Of course the average Tesla problem set is not to distinguish 1000 categories from millions of images, but it's still a complex problem that not only needs a strong theoretical solution (Karpathy is the industry expert on image tagging and classification) but also needs a robust real-time solution.
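To make the real-time point concrete, here's the back-of-the-envelope arithmetic in Python: how far the car travels while a single frame is being processed. The latency figures are illustrative assumptions, not anything Tesla has published.

```python
# Back-of-the-envelope: distance covered during one inference pass.
# Latencies are illustrative assumptions, not Tesla's actual numbers.
MPH_TO_FT_PER_SEC = 5280 / 3600  # 1 mph ~= 1.47 ft/s

for speed_mph in (25, 65):
    for latency_ms in (30, 100, 300):
        feet = speed_mph * MPH_TO_FT_PER_SEC * (latency_ms / 1000)
        print(f"{speed_mph} mph, {latency_ms} ms per frame -> {feet:.1f} ft traveled")
```

At 65 mph, even 100 ms spent on one frame means the car has moved almost 10 feet before the result is available.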

Most people not in this field don't quite grasp the magnitude of the scientific and technological problem we're dealing with. And Elon doesn't help matters by making face-palm-worthy Twitter schedule claims. But I work in the field and admire the rate at which things are developing. I don't really expect certain proclamations to bear out within the timelines claimed, but the rate of progress of the technology is quite astonishing regardless.
 
This is an excellent point. Most people have no appreciation of the size of the problem. Even the techniques (i.e. the deep learning methods used) are developing and changing rapidly. The vision contests show rapid improvement in accuracy, but applying the techniques to real-world problems is another layer of difficulty. This explains why having patience regarding results is a good quality to have :cool:

Which makes it all the more irresponsible for Tesla, and Elon specifically, to make it sound like progress will be swift, especially when they are already taking their customers' money. They should lay out a realistic timeline and put it in the proper context of how difficult the problem really is for their engineers to solve and where they currently are.
 
Which makes it all the more irresponsible for Tesla, and Elon specifically, to make it sound like progress will be swift, especially when they are already taking their customers' money. They should lay out a realistic timeline and put it in the proper context of how difficult the problem really is for their engineers to solve and where they currently are.

Lol. This is why you are not a billionaire.
 
Well... this is a thread on why he left, not what he could still do there :) Chris never looked like a good fit to begin with, in terms of technical alignment. Plenty of people, including me, wondered "what's the LLVM guy doing running Autopilot?"

Umm no. They have the famous Jim Keller, who's a lot more suited to chip development than a compiler guy. In fact, Keller, along with Karpathy, has been assigned Lattner's responsibilities since Lattner left.

As for whether Tesla intentionally misled people about Autopilot capability: as an engineer who works on ConvNets professionally, that sounds too sensationalist to me unless someone also spells out the precise technical basis for the claim. This is a new field, and what Tesla demands is a tough problem.

When your product is essentially at the bleeding edge of the current state of the art and theoretical knowledge, setting timelines doesn't work out all that well, especially Elon Standard Time ones.

Umm, Jim Keller is a chip designer. He's not a programmer so I have no idea what you're talking about. What I was referring to was designing the compiler and the tools for a new chip.

As to Tesla misleading people about the timeline, that really comes down to the sheer amount they had to do, and the fact that it was completely unrealistic that they would have it done within the timeline they gave. You can conclude what you want from it, but to me it was a calculated lie. They had to lie because of the position they were in. They tried doing the RIGHT thing by using both the Mobileye technology and the NVidia technology on HW2, but Mobileye wouldn't let them. So to me they were stuck between a rock and a hard place.

What we do agree on is neither of us feels he was a good fit for leading the Autopilot effort. I don't think he was ever a good fit for that role.
 
Umm, Jim Keller is a chip designer. He's not a programmer so I have no idea what you're talking about. What I was referring to was designing the compiler and the tools for a new chip.
Your previous post states "His compiler and computer languages expertise made him perfect for Tesla's efforts to design their own chips", which is easy to misinterpret. Regardless, you don't need either of those when designing a new chip, unless you also want to design a new ISA, in which case you need someone like Jim more than someone like Chris. In reality, a new ISA is completely unnecessary; most companies with their own silicon (e.g. Apple with their Ax chips, Qualcomm's Snapdragon, Nvidia Tegra, Huawei HiSilicon...) license reference designs and/or the ISA from ARM, then amortize the costs over millions in sales.

On the GPU side, you'd likewise pick CUDA from NVidia, AMD's Mantle, or OpenGL/Vulkan to address programmability. None of these require reinventing the wheel, and in fact, since AP2 is NVidia Drive PX2, Tesla is implicitly committed to CUDA anyway. There's a wealth of CNN/DNN work leveraging CUDA (cuDNN), and most recent deep learning and computer vision projects leverage NVidia gear, going back to AlexNet, which used two GTX 580s (and still took five to six days to train!) for image recognition using ConvNets. Tesla could create their own chips, but it's a pointless capital sink when NVidia builds kit like Drive PX2 and its follow-on, the Xavier SoC.
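To illustrate the "no wheel reinvention" point: from a deep learning framework, the CUDA/cuDNN stack comes essentially for free. A minimal PyTorch sketch (PyTorch is just my example framework here; the thread doesn't say what Tesla actually uses) that runs a convolution on an NVidia GPU, with cuDNN doing the work underneath:

```python
# Sketch: one convolution layer on an NVidia GPU. PyTorch dispatches the
# heavy lifting to cuDNN automatically; no custom compiler or toolchain needed.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
print("cuDNN available:", torch.backends.cudnn.is_available())

conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1).to(device)
frame = torch.randn(1, 3, 224, 224, device=device)  # one dummy camera frame
with torch.no_grad():
    features = conv(frame)
print(features.shape)  # torch.Size([1, 16, 224, 224])
```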

In my opinion, this entire discussion about building compilers, languages, chips etc. is beside the point. The hard problem of deep learning for computer vision in ADAS is (a) accurate image recognition and discrimination and (b) getting real-time results to the car's controls. Lots of things can be recognized with great accuracy today, given adequate training and adequate time to convolve choices with high confidence and then drive the car accordingly. For that, you need deep learning experts to help you train the cars accurately using the wealth of shadow-collected data from the fleet. That's where Karpathy comes in. Tesla should have hired someone like him, Yann LeCun, or another well-known member of the computer vision community much sooner.
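And for what "training the cars on shadow-collected data" looks like at its most basic, here's a toy supervised-training loop. The tiny network, the random "frames" and the two labels are all placeholders; only the overall shape of the workflow is the point.

```python
# Toy sketch of supervised training on labeled frames. Network, data and
# labels are placeholders; only the overall workflow is meant to be real.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 2),                      # e.g. "lane clear" vs "obstacle"
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

frames = torch.randn(32, 3, 64, 64)       # stand-in for camera frames
labels = torch.randint(0, 2, (32,))       # stand-in for human/shadow labels

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(frames), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```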
 
It's worth mentioning: They neither hired Chris Lattner to build toolchains, nor did he end up doing so per his resume. The initial statement from Tesla was that he was able to understand complex systems. And the statement from Chris himself was that he was looking for a new challenge. That all sounded like a career pivot to me.
 
If you look at the timeline, and consider that Elon could have hired Andrej Karpathy as his head of AI while leaving Lattner in charge of the overall software, then I think what Business Insider reported is likely true:

A source close to the situation tells Business Insider that Lattner loved his job at Tesla but that he and Tesla CEO Elon Musk didn't get along.

Elon Musk was the reason one of Apple's most famous developers left Tesla after only 6 months

SPECULATION:
The timeline is especially interesting considering how quickly he left after the rollout of the supposed "silky smooth" version, which people have reported as only a marginal improvement. Either Elon is delusional about the level of improvements, or perhaps Lattner made a command decision and removed one of the features/behaviors in the "silky smooth" version for testing, stability, or other reasons without first telling Elon, and that was the last straw. What better way to piss off the boss than to pull the rug out from under him after he's publicly bragged about something?