Welcome to Tesla Motors Club

Tesla autopilot HW3

They'll do both.
Some regions won't be covered by TN, and those need to be converted to sustainable energy too.

Selling cars gets them cash now, TN gets them debt now and cash in the future.
Thus the cash-flow (profit?) neutral answer Elon gave: sell cars to make money, use that money to expand the robotaxi fleet.
As to the car's sale price, $275k doesn't work since it prices in all future revenue. It would also be crazy to insure without a 'FSD transfers to new body at normal cost' replacement clause.
Without getting too OT, I posted about this earlier: $275k still allows a 20% IRR. With TN, Tesla can be cash-flow positive in the first year after covering COGS etc., so there isn't much debt in TN. There is also some good discussion of the insurance aspect in that thread; it's the main reason I think Tesla is starting insurance.
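As a rough sanity check on that IRR figure, here is a minimal sketch in Python. All the numbers are illustrative assumptions (an up-front $275k purchase, roughly $65k/year of net robotaxi cash flow, and an assumed 11-year service life), not figures from ARK's actual model:

```python
# Illustrative IRR check for the "$275k still allows ~20% IRR" claim.
# ASSUMPTIONS (not from ARK's model): $275k paid in year 0, then ~$65k/year
# of net cash flow for 11 years.
def irr(cashflows, lo=0.0, hi=1.0, tol=1e-6):
    """Internal rate of return via bisection on NPV."""
    def npv(r):
        return sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:   # NPV still positive -> the rate can go higher
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

flows = [-275_000] + [65_000] * 11   # year-0 purchase, then annual net cash
print(f"IRR ≈ {irr(flows):.1%}")
```

With those assumed inputs the IRR lands right around the 20% mark, which is the shape of the argument: the purchase price can be high and still pencil out if the per-car cash flow is robotaxi-sized.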

Tesla, TSLA & the Investment World: the 2019 Investors' Roundtable

The problem of not having TN in some places is interesting. You can't sell the cars there cheaply, because people could just bring the car over to a place that has TN and arbitrage the difference.

PS:

No, Tesla will continue to sell cars. And they won't cost $275K. The first driverless-capable Model 3 will go for around $75K because of required "options." (As in the upgraded interior we had to buy if we wanted a first-production Model 3.) In fact, Tesla won't put any cars on the ride-sharing app itself. It makes far more sense to sell the cars and let owners take the loss. Uber and Lyft drivers barely make back the depreciation on their cars, if that, and driverless ride share won't be able to charge more than those.
Not according to the model ARK published, which I've linked above. If you see any problems with that model, we can probably talk about it in another thread.
 
Just had the mobile ranger out for some minor stuff. I asked him about the HW3 upgrade. He asked me if I have HW2.5 or HW2.0

For HW2.5 the change is very quick, but for HW2.0 they are still figuring out what to change. In all likelihood the cameras in HW2.0 cars will be replaced as part of the upgrade.

Enjoying the ride.

BTW, he changed the cabin air filter. I was surprised how thick it was and how cheap ($50 including installation), considering it is a carbon filter.
 
Just had the mobile ranger out for some minor stuff. I asked him about the HW3 upgrade. He asked me if I have HW2.5 or HW2.0
Just a heads up. Regular service personnel do not know what's going on internally at Tesla and any prediction they give you regarding the hardware upgrade, or anything to do with autopilot development in general, is likely a complete guess and wrong.
 
A possible AP2.0-to-3.0 retrofit spotted by looking at the firmware updates (see image).
 

Attachment: Screenshot_20190701-143023_Chrome.jpg (368.2 KB)
I guess in theory the day might come when an AP2/2.5 car gets a warranty replacement of the APE and Tesla decides to put in the HW3/FSD computer to save a later retrofit.

Then again, recently this has worked the other way around: an HW3 car received AP2.5 as a warranty replacement, so who knows how likely this might be...
 
This aged quite well!

It's like you've seen this before, and listened to what Elon said way back when HW3 was announced. :D

Europe is gonna be late to the party as usual. I hope people in the EU will start waking up and do away with the draconian system.

Having read the EU regulations about autonomous vehicles, I found them completely reasonable. Limits on lateral acceleration when turning at various speeds seemed the most restrictive and prescriptive, and even that was somewhere around 0.3 g if I recall. That's still a pretty aggressive turn at any speed.
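For a sense of how restrictive a 0.3 g cap really is: in a steady turn the lateral acceleration is a = v²/r, so the cap translates into a maximum speed for each curve radius. A quick sketch (the radii below are my own illustrative picks, not values from the regulation):

```python
import math

G = 9.81  # m/s^2, standard gravity

def max_speed_kmh(accel_g, radius_m):
    """Maximum steady cornering speed under a lateral-acceleration cap.

    From a = v^2 / r  ->  v = sqrt(a * r); converted from m/s to km/h.
    """
    return math.sqrt(accel_g * G * radius_m) * 3.6

# Illustrative radii: tight city corner, roundabout, highway ramp
for radius in (25, 50, 200):
    print(f"r = {radius:3d} m -> {max_speed_kmh(0.3, radius):5.1f} km/h max at 0.3 g")
```

At 0.3 g a 50 m radius turn can still be taken at around 44 km/h, which supports the point that the limit only bites on genuinely aggressive cornering.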

Honestly, I think the US is playing fast and loose with autonomous vehicles, which is why the shittiest states were the first to allow them carte blanche. It's not a surprise someone died in Arizona, because Arizona basically lets these companies operate with zero oversight. I'm fairly certain we want to be sure these systems are safe before exposing the general public to them, and we want to make sure operators know WTF they're doing so we don't end up with a car plowing into the side of a semi trailer or highway divider.
 
It's not a surprise someone died in Arizona, because Arizona basically lets these companies operate with zero oversight. I'm fairly certain we want to be sure these systems are safe before exposing the general public to them, and we want to make sure operators know WTF they're doing so we don't end up with a car plowing into the side of a semi trailer or highway divider.
As we all know - absolutely nobody dies on roads unless an AV is involved.
 
As we all know - absolutely nobody dies on roads unless an AV is involved.
Come on now. The fatality rate is about 1 per 100 million miles. The Uber accident did not look like a fluke, especially considering their very irresponsible testing in California as well. Their test driver was watching TV! It doesn't sound like they had any system in place to ensure that their test drivers were paying attention.
States should really require two test drivers in the vehicle for AV testing.
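To put that base rate in perspective, a quick Poisson back-of-envelope: if a test fleet were exactly as safe as the average human driver, the chance of at least one fatality in N test miles is 1 − e^(−N/10⁸). The mileage figures below are illustrative round numbers, not any company's actual totals:

```python
import math

# ASSUMPTION: US-average fatality rate of ~1 per 100 million vehicle miles.
HUMAN_FATALITY_RATE = 1 / 100e6  # fatalities per mile

def p_at_least_one_fatality(miles, rate=HUMAN_FATALITY_RATE):
    """Poisson P(X >= 1) for fatalities over a given number of test miles."""
    lam = miles * rate           # expected fatalities over those miles
    return 1 - math.exp(-lam)

# Illustrative test-fleet mileages (not actual company figures)
for miles in (3e6, 10e6, 100e6):
    print(f"{miles/1e6:5.0f}M miles -> P(>=1 fatality) = "
          f"{p_at_least_one_fatality(miles):.1%}")
```

With only a few million test miles, a human-level fleet would most likely see zero fatalities, so one early death is at least weak evidence the system was worse than average rather than just unlucky.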
 
Honestly, I think the US is playing fast and loose with autonomous vehicles, which is why the shittiest states were the first to allow them carte blanche.
Elon has explained before why Tesla is doing things this way, and states that allow him to go full speed are, in effect, agreeing. Elon believes that even if more people die this way, FSD will get here faster and start saving lots of lives sooner, and that it is therefore immoral to do things any other way. As it turns out, I suspect that even if beta FSD causes several deaths, it is even now actively preventing more of them than it causes. They are just less visible, because they are things that don't happen.
 
Elon has explained before why Tesla is doing things this way, and states that allow him to go full speed are, in effect, agreeing. Elon believes that even if more people die this way, FSD will get here faster and start saving lots of lives sooner, and that it is therefore immoral to do things any other way. As it turns out, I suspect that even if beta FSD causes several deaths, it is even now actively preventing more of them than it causes. They are just less visible, because they are things that don't happen.
I'm definitely hoping that California won't be one of those guinea pig states. It's completely possible to test autonomous vehicles in a safe manner. Other companies have training and strict rules for their test drivers, and it seems to be relatively safe (except for all the rear-end collisions they cause).
 
Elon has explained before why Tesla is doing things this way, and states that allow him to go full speed are, in effect, agreeing. Elon believes that even if more people die this way, FSD will get here faster and start saving lots of lives sooner, and that it is therefore immoral to do things any other way. As it turns out, I suspect that even if beta FSD causes several deaths, it is even now actively preventing more of them than it causes. They are just less visible, because they are things that don't happen.

It might be statistically true that even a prototype FSD is safer than the average human driver, but I think FSD testing still needs some regulation in order to ensure safe and responsible testing. We can't just unleash possibly unreliable FSD cars on the road and hope for the best. And frankly, it is small comfort for a family who loses a loved one because the FSD car was not ready yet: "Sorry your kid died when our FSD car accidentally ran him over while he was walking home from school, but take comfort, many lives will be saved when we finish the FSD software in 10 years."

It is about finding that sweet spot of reasonable regulations that are sufficient to ensure safety but not burdensome. So no, I don't want to stifle FSD development with excessive regulation but I don't think it is too much to ask that FSD cars should have a safety driver until they can prove that they are reliable enough to self-drive solo.
 
Uber was irresponsible, which is not surprising. Uber has always shown a disregard for rules. During the testing phase, the safety drivers have to be trained, competent, and alert. The driver in the Uber car that killed a pedestrian was falling asleep, if we can judge by the video. Uber's attitude was "Our cars can do this. All we need is to have some schlub in the driver's seat to satisfy the regulators." They probably just offered a bunch of high-school drop-outs minimum wage to sit in the seat and told them not to worry about it, they just had to be there.

I hope and believe that Waymo, Tesla, and the others will be more responsible about having competent, intelligent, AWAKE safety drivers.
 
I hope and believe that Waymo, Tesla, and the others will be more responsible about having competent, intelligent, AWAKE safety drivers.

I agree. But we need to keep in mind that Tesla will undoubtedly release FSD to the fleet when they think it is ready for the public, requiring of course driver supervision and nags like with AP. I certainly hope that Tesla owners will be competent, intelligent, and awake as safety drivers. I think the vast majority will be, but how do you guarantee that they all will be? Past experience might suggest otherwise. Maybe Tesla will continue to rely on AP nags to ensure that owners are responsible?