Tesla, TSLA & the Investment World: the Perpetual Investors' Roundtable

It feels like 2019 to me right now. FUD is high and the price isn't reflecting tremendous advances. A demand-problems narrative is circulating, even among previously vocal bulls. I look at the chart, see it could go lower, and feel it in the pit of my stomach. This is how I felt throughout the summer of 2019 after the June lows. We all know what happened in the following months and into 2020. It is true that macros are much different right now, but I will do what I did back then: keep picking off shares when appropriate, load up on OTM LEAPS, and keep the faith that this team can and will execute. WWIII notwithstanding, we will level up again here soon. It has been a healthy consolidation period, and soon we will see another run to ATH, IMHO.

Been around here for a minute and don't post much, but thanks to the regulars who work hard (and those who don't work that hard) and contribute value here... you know who you are. Cheers.
 
I can see a technical argument for how, given enough data and enough AI, you can do what the sensors did better with cameras. However, the correct way to do this is for Tesla to enable it for 100,000 cars that still have sensors, verify that at no point the cameras were ever wrong, THEN disable those sensors, THEN announce that they won't fit sensors in future. (A rough sketch of that verification gate is at the end of this post.)

My Tesla Model Y will cost me £73,000. For Tesla to prematurely remove parking sensors from a car at that price is REALLY bad optics. By all means remove them once everyone says 'sheesh, why do they still install those legacy sensors?'. Don't nickel-and-dime people buying luxury cars.

It's not like profit margins are razor thin, and I doubt those sensors cost much. It only takes one person reversing over their beloved pet because Tesla disabled parking sensors to give you bad PR for another month. Not worth the savings.
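
To make the "verify first, disable later" idea concrete, here is a minimal sketch of the kind of gate I mean, with made-up names, tolerances, and numbers (not anything Tesla has published): run cameras and ultrasonic sensors side by side on a validation fleet, and only flip the switch once the camera estimates have stayed within tolerance for every logged event.

def cameras_ready_to_replace_sensors(events, tolerance_cm=10.0):
    """events: iterable of (ultrasonic_cm, vision_cm) pairs from the validation fleet."""
    return all(abs(u - v) <= tolerance_cm for u, v in events)

# Hypothetical fleet log: the second event exceeds the tolerance,
# so the gate stays closed and the sensors stay enabled.
fleet_events = [(40.0, 43.0), (30.0, 75.0), (120.0, 118.0)]
if cameras_ready_to_replace_sensors(fleet_events):
    print("disable ultrasonic sensors, then announce the change")
else:
    print("keep sensors enabled and keep collecting data")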
 
No use getting my feathers ruffled over the foolishness of people with lots of money and power playing the market for any potential short-term gain (or other, more nefarious purposes).

The company is solid and can weather drops in the SP better than many, many (all?) of its rivals.

As a shareholder, this is but a cue to play dead for a little while. I have more patience than they have money to burn.
 
Btw, as a resident China manufacturing bear (predicting that over the next ten years manufacturers will leave China for global manufacturing… note that doesn't mean Tesla, since it will stay in China to manufacture for the Chinese market at the very least), I just wanted to post this good video by a PhD economist. Yes, China has all sorts of economic problems: a property bubble, malinvestment, inflation, a capricious government. But even with all that, it doesn't mean their economy will "collapse". More likely a slow, managed decline similar to Japan's.

 
I can see a technical argument for how, given enough data and enough AI, you can do what the sensors did better with cameras. However, the correct way to do this is for Tesla to enable it for 100,000 cars that still have sensors, verify that at no point the cameras were ever wrong, THEN disable those sensors, THEN announce that they won't fit sensors in future.
I can't speak to the timing and optics, but I would imagine Tesla has had the vision version running in shadow mode in the fleet for a while now. The software can compare sonar distance measurements to the visual distance measurements and learn from that data.
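
As a toy illustration of that shadow-mode idea (all class and field names here are mine, not Tesla's actual telemetry API): the ultrasonic reading acts as a rough label for the vision estimate, and the frames where the two disagree badly are exactly the ones worth keeping for training or review.

from dataclasses import dataclass

@dataclass
class ParkingFrame:
    ultrasonic_cm: float   # distance reported by the parking sensor
    vision_cm: float       # distance inferred by the camera stack

def training_candidates(frames, min_error_cm=15.0):
    """Keep frames where vision disagrees with the sensor enough to learn from."""
    return [f for f in frames if abs(f.vision_cm - f.ultrasonic_cm) >= min_error_cm]

frames = [ParkingFrame(42.0, 45.0), ParkingFrame(30.0, 80.0), ParkingFrame(120.0, 118.0)]
print(training_candidates(frames))   # only the 30 cm vs 80 cm mismatch is kept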
 
Still 5 pages to catch up, but it doesn't look like anyone has made this connection yet.
It's nice that Tesla will save a few bucks and a minuscule amount of weight by removing ultrasonic sensors.
It's a lot more vital that they eliminate the need for them in the coming Optimus.
Did people miss that they stated during the AI Day demo that the 'bot was using the FSD stack straight from Autopilot?
Vision will eliminate a dozen or more sensors, one would imagine, plus countless hours of engineering and a fair amount of manufacturing complexity.
Plus, it would require separate development of the software stack, custom-tailored to the sensor input.
The same reasoning eliminated lidar/radar.
Nobody realized this is actually for Optimus?
 
I can’t speak to the timing and optics but I would imagine Tesla has had the vision version running in shadow mode in the fleet for a while now. The software can compare sonar distance measurements to the visual distance measurements and learn with that data.
My take as well. This is not an arbitrary change. They already know it works.
 
Here is the challenge for cameras positioned where they are today: How will the car see how many inches it is from an adjacent car at highway speeds? How will it know how far away a low curb parallel to the car is?

Overloading the vision processing with stuff that is easily done by cheap sensors will go the way of removing the wiper rain sensor. How did that work out? :)

I will tell you how that worked out: broad daylight, sun beating down, no rain in weeks, not a cloud in the sky. Bone dry. Out of the blue, my wipers start going at FULL speed and won't stop, making a screeching noise. My passenger is laughing uncontrollably. I had to turn off Autopilot and then turn off the wipers. Of course, the 50 grams of weight savings would have increased the range by 1 cm.
This is the occupancy network and occupancy flow network in vector space. This is the way.
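
For anyone wondering how a vision-only stack even produces a "how many inches away" number, here is a deliberately toy sketch of querying an occupancy grid for the nearest obstacle. The grid size, resolution, and coordinates are invented for the example; this is not the actual occupancy-network output format.

import numpy as np

RES_M = 0.1                             # assume each cell is 10 cm on a side
grid = np.zeros((40, 40), dtype=bool)   # 4 m x 4 m patch around the bumper
grid[5, 18:22] = True                   # an obstacle about 0.5 m ahead of the car

def nearest_obstacle_m(occupancy, resolution_m, origin=(0, 20)):
    """Distance from origin (the bumper cell) to the closest occupied cell, in metres."""
    occupied = np.argwhere(occupancy)
    if occupied.size == 0:
        return None
    dists = np.linalg.norm((occupied - np.array(origin)) * resolution_m, axis=1)
    return float(dists.min())

print(f"nearest obstacle: {nearest_obstacle_m(grid, RES_M):.2f} m")   # -> 0.50 m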
 
Until the Q3 financials are released, I think we are at the mercy of the Twitter debacle.

The financing environment is much worse now than when the deal was announced, so it's unclear to me whether Elon has to put up more $$ to complete the takeover.

Plus all the distractions once it closes.
If it's not the Twitter issue, it will be something else; there's always an issue. Tesla trades in opposite world. I have been here for a long, long time. The Q3 financials mean nothing: how they match up to someone's expectations of what Tesla should make is meaningless. The people setting those expectations aren't making the cars, shipping them, providing an order page, taking orders, etc. What we do know is that every car that is made is already sold. With two new factories coming online, that will mean more volume down the road and fewer people buying gas…
 
I have a friend who owned an ID4 for half a year, but he recently sold it and now has a Model Y on order. Every time we'd talk about our EVs, he'd be complaining about his ID4 while I'd be gushing about my Model Y. I was surprised to hear about all of the software issues he'd had in six months; it sounded terribly frustrating. At first he did not like the Tesla interior, but after a few times in my car he gradually came around to the spartan design. The fact that Tesla's software always works and is snappy also made him incredibly jealous.

He loved the way his ID4 drove, but the constant UI issues and lack of any updates just wore him down.

Europe being so far behind the US and Asia in electronics and software is ultimately going to be what hurts it the most in the EV transition.
 
Still 5 pages to catch up, but it doesn't look like anyone has made this connection yet.
It's nice that Tesla will save a few bucks and a minuscule amount of weight by removing ultrasonic sensors.
It's a lot more vital that they eliminate the need for them in the coming Optimus.
Did people miss that they stated during the AI Day demo that the 'bot was using the FSD stack straight from Autopilot?
Vision will eliminate a dozen or more sensors, one would imagine, plus countless hours of engineering and a fair amount of manufacturing complexity.
Plus, it would require separate development of the software stack, custom-tailored to the sensor input.
The same reasoning eliminated lidar/radar.
Nobody realized this is actually for Optimus?
I was hoping the robot would 'see' outside of human-visible wavelengths. That can only happen with sensors, AFAIK. The robot would be more universal if it could 'see' UV, infrared, and everything else that sensors can detect.

Hope this will still be the case.
When stonk go up?
Yes.
 