Model S REST API

It works just fine. I wrote a couple of apps and have open-sourced them. See Windows Tesla Auth Token Generator and My Tesla Model 3 “Keyfob”.
As another reply mentioned, apparently the streaming API isn't working on Model 3 (yet?).

We recently took delivery of a Model 3 and my existing logger kept marching along, adding the 3 to the cars recognized on the account. Sure enough, the base/pull API is working on the 3 but the streaming is not.
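For anyone who wants to poke at the same thing, here is roughly what listing the account's vehicles against the base/pull API looks like (a Python sketch; the owner-api host and /api/1/vehicles path are what the unofficial API used around this time, so treat them as assumptions to verify):

Code:
# Sketch: list the vehicles the (unofficial) owner API reports for an account.
# Host and path are assumptions based on the API as discussed in this thread.
import requests

ACCESS_TOKEN = "your-oauth-access-token"   # obtained separately (see the token posts below)
BASE = "https://owner-api.teslamotors.com"

resp = requests.get(
    f"{BASE}/api/1/vehicles",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=10,
)
resp.raise_for_status()

for v in resp.json()["response"]:
    # display_name, vin and state ("online"/"asleep") are fields the pull API returns
    print(v["display_name"], v["vin"], v["state"])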
 

It’s the same for any 2018 S and X, which I find quite bizarre. You’d think the software would be the same on them all.
 

Forgive me, I'm new to Tesla, but what does the streaming API provide? Say I was thinking about building a data logger for myself (similar to TeslaFi): would I use the streaming API to get a lower-latency stream, or is that something else?

And has anyone written geolocation-based code that can trigger IFTTT or other rules-based workflows?

Thanks.
 

It provides quickly-updating location information and other data used by the app's map feature while the car is driving. You can get all the same info, plus a lot more, from the REST API. You just won't be able to get it at the same frequency.
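For a logger, polling the REST drive_state endpoint every few seconds is usually plenty. A rough Python sketch (the endpoint path and field names are assumptions based on the unofficial owner API of this era, so double-check them):

Code:
# Sketch: poll drive_state for a simple position/speed logger.
# Path and field names are assumptions, per the note above.
import time
import requests

ACCESS_TOKEN = "your-oauth-access-token"
VEHICLE_ID = "id from /api/1/vehicles"     # the "id" field, not the VIN
BASE = "https://owner-api.teslamotors.com"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

while True:
    r = requests.get(
        f"{BASE}/api/1/vehicles/{VEHICLE_ID}/data_request/drive_state",
        headers=HEADERS,
        timeout=10,
    )
    if r.ok:
        d = r.json()["response"]
        print(d.get("latitude"), d.get("longitude"), d.get("speed"))
    time.sleep(5)   # far coarser than the streaming API, but fine for logging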
 
what does the streaming API provide?
For (apparently only older) Model S and X...

REST API Tools and Apps
Tesla Model S JSON API · Apiary

They don't seem to talk much about the streaming API.

Random web search hit upon...
hjespers/teslams

which includes...
Code:
.describe('v', 'List of values to collect')
.default('v', 'speed,odometer,soc,elevation,est_heading,est_lat,est_lng,power,shift_state,range,est_range,heading')

I don't know if that list is current or not.

IIRC the data rate from the streaming API is (at most) 4 samples per second. When the car "sleeps" (i.e. you put it in park and a few minutes pass), the streaming API stops reporting. You can invoke commands to wake the car up (separately) to resume streaming and/or (I think) prevent it from "sleeping".
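For reference, the old streaming endpoint was a long-poll HTTP request authenticated with the account email plus the per-vehicle streaming token (the first entry of the "tokens" array from the vehicles list), and it returns comma-separated rows of the requested values. A sketch, with the host, path and auth scheme assumed from what teslams did at the time:

Code:
# Sketch of the older long-poll streaming endpoint (Model S/X era).
# Host, path and auth scheme are assumptions taken from period tools
# such as teslams; the values list mirrors the one quoted above.
import requests

EMAIL = "you@example.com"
STREAMING_TOKEN = "tokens[0] from /api/1/vehicles"   # per-vehicle streaming token
VEHICLE_ID = "vehicle_id from /api/1/vehicles"       # note: vehicle_id, not id
VALUES = ("speed,odometer,soc,elevation,est_heading,est_lat,est_lng,"
          "power,shift_state,range,est_range,heading")

url = f"https://streaming.vn.teslamotors.com/stream/{VEHICLE_ID}/?values={VALUES}"
with requests.get(url, auth=(EMAIL, STREAMING_TOKEN), stream=True, timeout=125) as r:
    for line in r.iter_lines():
        if line:   # each row is a timestamp followed by the requested values
            print(line.decode().split(","))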
 
How are you reverse engineering the API, anyway? I probably have the skills and would be willing to assist. I'd like to figure out the API for the summon feature.
Using the Android app, I do it in two ways: sniffing network traffic and decompiling.

To sniff the network traffic of the official Tesla app you will have to bypass the certificate pinning the app performs. I do this by using a custom Xposed Framework plugin I wrote that stubs out the certificate pinning at runtime.

To decompile the app, I run the APK through an online decompiler, then run the resulting minified, obfuscated React JavaScript through a beautifier.
 
So I've noticed the streaming tokens returned by the API lately are sometimes invalid. This is easy to observe by sniffing the official app's network traffic: every so often the returned tokens are invalid, so attempting to summon or access telemetry (the location screen) repeatedly fails with 401 Unauthorized "can't validate password".

I'm still trying to pin down the root cause, but after extensive testing all signs point to a relatively new issue introduced on Tesla's server side in the past few months. If that's the case, it's unfortunate: the best you can do now is cache all your streaming tokens and hope one eventually works. Would love to be wrong on this one!
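For anyone hitting the same 401s, the "cache your streaming tokens and retry" workaround looks roughly like this. It's a sketch of the idea, not a fix; fetch_vehicle_record is a placeholder for however you already call the vehicles endpoint, and the streaming host is the same assumption as in the earlier sketch:

Code:
# Workaround sketch for intermittently-bad streaming tokens: remember every
# token we've been handed and try them in turn whenever one gets a 401.
import requests

STREAM_HOST = "https://streaming.vn.teslamotors.com"
seen_tokens = []   # every streaming token observed so far, oldest first

def open_stream(email, vehicle_id, values, fetch_vehicle_record):
    # fetch_vehicle_record is your own callable (placeholder here) that hits
    # /api/1/vehicles and returns this car's record, including its "tokens".
    vehicle = fetch_vehicle_record()
    for tok in vehicle.get("tokens", []):
        if tok not in seen_tokens:
            seen_tokens.append(tok)

    url = f"{STREAM_HOST}/stream/{vehicle_id}/?values={values}"
    for tok in reversed(seen_tokens):         # newest first, then fall back to older ones
        r = requests.get(url, auth=(email, tok), stream=True, timeout=125)
        if r.status_code != 401:
            return r                          # not an auth failure, so use this connection
        r.close()
    raise RuntimeError("every cached streaming token was rejected (401)")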
 
I'm having problems with the Tesla API. It's sending me error 503; is there a problem? When do you think it will be fixed? Thank you.

That was happening a lot yesterday. It was actually completely offline (connection timeout) for a long time yesterday, and then when it started accepting connections again, there were lots of error messages, including 503, but some other codes at times, too. I have not seen any errors since 5:02 PM PDT yesterday, though.
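When the service gets flaky like that, about all you can do client-side is back off and retry. A rough sketch (same endpoint assumptions as the earlier examples):

Code:
# Simple retry-with-backoff wrapper for days when the API is returning
# 503s or timing out entirely.
import time
import requests

def get_with_backoff(url, headers, attempts=6):
    delay = 5
    for _ in range(attempts):
        try:
            r = requests.get(url, headers=headers, timeout=10)
            if r.status_code < 500:      # 503 and friends mean "wait and retry"
                return r
        except requests.RequestException:
            pass                         # connection timeout, reset, etc.
        time.sleep(delay)
        delay = min(delay * 2, 300)      # cap the wait at five minutes
    raise RuntimeError("API still unavailable after retries")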
 
It’s been years now; does anyone know of a Tesla API implementation in Python that includes the streaming implementation? teslajson seems to work with the new encoding stuff, but pytesla, which claims streaming support, does not (same OAuth token error as node/teslams).
If you find a solution let me know.

Also note there's a pull request with a fix for hjespers' teslams code, though it still isn't working for me. But that could be something on my side.
 

I got around it temporarily by using token authentication instead. In the latest release of teslams, the code also supports giving it a token instead of a username and password. I used teslajson to authenticate and print out the token I got, then used that token in teslams and it works; it seems the new encoding stuff only affects getting the token.
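If you'd rather not go through teslajson just to get the token, the equivalent is a single password-grant POST and printing the result. A sketch only: the client_id/client_secret are placeholders for the well-known ownerapi values of this era, and the endpoint is an assumption to verify:

Code:
# Sketch: fetch an OAuth access token directly (password grant, as the owner
# API accepted at the time) and print it so it can be pasted into teslams'
# config. CLIENT_ID / CLIENT_SECRET are placeholders, not real values.
import requests

CLIENT_ID = "<ownerapi client id>"
CLIENT_SECRET = "<ownerapi client secret>"

resp = requests.post(
    "https://owner-api.teslamotors.com/oauth/token",
    json={
        "grant_type": "password",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "email": "you@example.com",
        "password": "your-password",
    },
    timeout=10,
)
resp.raise_for_status()
print(resp.json()["access_token"])   # paste this into the teslams config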
 
Odd. I am pulling a fresh token, cloning teslams fresh, and putting it in the config.json (with no username), and still getting

Code:
3 Sep 13:41:49 - 1 of 6 REST requests since 1536007299679
Error parsing response to oauth token request
3 Sep 13:41:49 - Error: unable to parse vehicle data response as JSON, login failed. Trying again.

Not sure how it's working for you. You're not using the code from the PR that just happened in the last day or so, right?