
Tesla, TSLA & the Investment World: the Perpetual Investors' Roundtable

Wait, are you saying you regularly get that on an S85?
 
In our house if anyone is more than 5 minutes late we send out the National Guard and then have to listen to an hours long lecture about why that’s not acceptable behavior, followed by an exhaustive list of no less than 83 of the most horrifying possibilities that didn’t happen but could have in that five minutes. I have my doubts, though, that you can Scarface a body in that length of time.

Meanwhile, there was an Oyama concert last night announced for 8:30 PM, so I left home at 10:05 PM, got there at 10:20 PM, and still had to wait about 15 minutes for the band to play ;) Lateness is normal here.

(Okay, to be fair, there was an earlier band that I didn't care for and deliberately skipped, but even they probably didn't start until at least 9:30-9:45 or so... an hour or so after the nominal concert time is pretty typical, and I've had as late as 2-3 hours after)
 
I wasn’t thinking about that scene, which clearly can make a mess in 5 minutes. I was thinking of the bathroom scene. That I believe takes more than 5 minutes to complete. But I don’t have any real world experience. Anyone?
Yeah, I figured.... but there are no iconic lines to go with that scene.

In other news I recently bought a new electric chainsaw. Very quiet.... you know... in case you wanted to … uh... avoid er… "raising suspicion".
 
I expect that Elon's confidence comes from a mix of A) radical improvements they're seeing in development versions vs. deployed versions, and B) the fact that increasing the number of neurons in a neural net isn't an O(N) gain, it's an O(N²) gain.

More neurons isn’t necessarily better; it’s the weights that store information. To see this, imagine adding extra neurons to the network without any weights associated with them.

Would like to see a source for the O(n^2) "gain", whatever that means.
 
FSD represents a lot more than just the Tesla Network. Also, I would challenge the statement that "most people think FSD is a long way off". Neither you nor I have any way of knowing what "most people" think or don't think, so that argument is nothing more than speculation unless you have data to back this up.

Dan

Where do you think the stock price would be if "most people" believed Teslas would be capable of earning money for their owners in three years' time?
 
More neurons isn’t necessarily better; it’s the weights that store information. To see this, imagine adding extra neurons to the network without any weights associated with them.

Would like to see a source for the O(n^2) "gain", whatever that means.

Time complexity - Wikipedia

Number of connections between neurons rises in proportion to the number of neurons per layer squared - i.e., O(n²). The more complex the problem, the more complex the network you need to represent it.

Weighting is about training, which I already mentioned.
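To make the quadratic claim concrete, here's a minimal sketch (hypothetical layer widths, nothing to do with Tesla's actual network): between two fully connected layers of n neurons each, every neuron in one layer links to every neuron in the next, so the connection count is n².

```python
def connections(n: int) -> int:
    """Connections between two adjacent fully connected layers, each n neurons wide."""
    return n * n

# Doubling the layer width quadruples the connection count.
print(connections(128))  # 16384
print(connections(256))  # 65536
```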
 
More neurons isn’t necessarily better; it’s the weights that store information. To see this, imagine adding extra neurons to the network without any weights associated with them.

Would like to see a source for the O(n^2) "gain", whatever that means.

Per level:
number_of_weights = number_of_neurons * connections_per_neuron
For each object type you want to identify, you need a neuron. Each of those needs some number of lower-level feature neurons, which need sub-feature neurons, which need....
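As a sketch of that per-level formula (the widths here are made up, not taken from any real network):

```python
def weights_per_level(number_of_neurons: int, connections_per_neuron: int) -> int:
    # Per level: number_of_weights = number_of_neurons * connections_per_neuron
    return number_of_neurons * connections_per_neuron

# A layer of 512 neurons, each connected to all 512 neurons in the layer below:
print(weights_per_level(512, 512))  # 262144
```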
 
Where do you think the stock price would be if "most people" believed Teslas would be capable of earning money for their owners in three years' time?
Stock price has consistently proven it is not related to public opinion in any way. Institutional investors/speculators manipulate the stock to their advantage. Not that this is in any way unique to Tesla. This is also why I think the news (whatever it is) on Monday will have little effect on the stock in the long run, unless it is true full-feature FSD, which I doubt. News will be spun... media markets will run with it... FUD will fly. In the end, maybe a $15 to $20 swing, either way. Nothing earth-shattering in the long run.

Just my opinion of course...but that's the whole point, it's all speculation at this point and thus a fruitless argument.

Dan
 
Per level:
number_of_weights = number_of_neurons * connections_per_neuron
For each object type you want to identify, you need a neuron. Each of those needs some number of lower-level feature neurons, which need sub-feature neurons, which need....

Um, no, that's not how it works.... :) Information in a neural network is not localized; it's distributed. There is no "car neuron" in HW3, sitting next to a "truck neuron", next to a "pedestrian neuron", etc ;)

As for the number of connections, at least in the cases I'm familiar with, each layer involves a connection between each neuron in that layer and each one in the next adjacent layer. That said, while I've worked with the output of neural nets in programs before, I've never written one myself, so there's probably a diversity of architectures out there.
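For what it's worth, that layer-to-layer wiring can be put in arithmetic terms (arbitrary example widths, just to illustrate): in a plain fully connected net, the weight count between adjacent layers is the product of their widths.

```python
def total_weights(layer_widths):
    """Sum of width[i] * width[i+1] over adjacent fully connected layers."""
    return sum(a * b for a, b in zip(layer_widths, layer_widths[1:]))

# Example MLP: 784 inputs, two hidden layers, 10 outputs (biases ignored).
print(total_weights([784, 256, 128, 10]))  # 234752
```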
 
I'm interested to see the webcast Monday because it doesn't matter what my Model 3 can do. We have no idea what kind of progress has been made on HW3. Of course, it would be unlike Elon to actually show off a finished product but hopefully it can give us more confidence that Tesla is actually in the lead on FSD.
 
I think EM really has major improvements in Autopilot to show off this time: the neural net architected from scratch by Karpathy, which can only run on the HW 3.0 NN chip, as mentioned in the Q3/Q4 ERs.

His recent tweet is not making guesses about when it will be done; he just can't wait to show off what has been achieved already. He is in a really good mood as seen on Twitter - so good that the horrible S&X delivery numbers did not seem to bother him one bit. My guess? They hit a major milestone in FSD development recently.

re: autonomy investor day being scheduled right before Q1 ER -
They will have major plans announced in the Q1 ER (well, it has to be the Tesla Network), which requires proof of FSD progress from the autonomy day.
I'm optimistic about FSD progress as well. If Monday's event goes well, the Q1 CC indicates further ramping of battery cells by Panasonic along with Model 3 production now around 6,000/week, increased orders for Model S/X, and reiteration of guidance for the rest of the year, and the SEC resolution comes later in the week, I think there is a very good chance we get out of this long downtrend.
 
Hmm, wouldn't referral prizes show up as losses on the balance sheet in the quarter that they're earned, due to them being unmet financial obligations? Both GAAP and non-GAAP? How much of an effect on the balance sheet would this "rounding error" have - maybe $1-2k per referral(? - 2% of the price of a Roadster, minus margins, reduced by the (probably high) rate of people who won't cash in), times... how many per quarter?

My initial question was about how the obligations are accounted for. Logically, the expense would be accrued on the Income Statement and as an Accrued Liability on the Balance Sheet when the underlying reservation was converted to an order. However, Tesla does not accrue regulatory credits when earned, but only when sold; so maybe it will not recognize the obligations until the referrers actually receive the benefits. The initial rationale for the referral program was that it would reduce the need for additional physical sales locations, so the expense should be part of SG&A (or possibly Other Income/(Expense)).

Slim chance some analyst would try to clarify on the next CC.
 
So what's your specific technique? Speed management primarily?
Mostly a lot of practice on when to speed up and when to slow down. It helps if you have a commute that allows a different route coming and going. About half the miles are trip miles, which doesn't help the average any. The last couple of months have seen almost no miles added due to loss of employment.
 
Stock price has consistently proven it is not related to public opinion in any way.

I agree with this to a point, but am not pounding the table as vigorously as you.

Let's hope that the FSD investor demo proves 1) me correct by changing the public's perception of FSD capability and timeframe and 2) you correct by inducing so much fear in shorts/manipulators/etc. that they finally see the writing on the wall.

Fingers crossed!
 
Time complexity - Wikipedia

Number of connections between neurons rises in proportion to the number of neurons per layer squared - i.e., O(n²). The more complex the problem, the more complex the network you need to represent it.

Weighting is about training, which I already mentioned.

I think my (and perhaps @heltok 's) confusion about your comment regarding O(n) or O(n^2) is 1: what does it mean to measure "gain"?

And 2: using time-complexity / big-O notation to refer to an improvement here is weird: an algorithm that runs in O(n) time is far preferable to one that runs in O(n^2) time.
 
I think my (and perhaps @heltok 's) confusion about your comment regarding O(n) or O(n^2) is 1: what does it mean to measure "gain"?

And 2: using time-complexity / big-O notation to refer to an improvement here is weird: an algorithm that runs in O(n) time is far preferable to one that runs in O(n^2) time.

Fair enough. The point was that the amount of encodable information rises quadratically with the number of neurons per layer. The use of big-O notation was misleading in this regard.