Tesla, TSLA & the Investment World: the Perpetual Investors' Roundtable

This logic makes the flawed assumption that driving a car safely is an "unknowable" task, that there's an almost infinitely long tail of edge cases.

This is not so.
Yes, it is.

Firstly, the bar of "driving 10x-100x safer than a human" is in fact too low:
You don't have to be safer than the average human, you have to be safer than the best human.

The rest of your writing is irrelevant because you are proceeding from false assumptions.

Now, there is an important distinction. As long as there is a driver to blame, "self-driving" can be deployed very quickly.

Try to run a car with no driver at all, and you run into a brick wall of human psychology. One crash will get you shut down, unless you can prove that you're better than the best human.

This is the problem with the robotaxi business model.

There are more and more areas which are essentially totally automated but have human "babysitters" who don't do much but act as someone to blame. This is going to be one of them.
 
We should also consider that while the road to the "perfect" NN with correct handling of all corner cases might be a long one, another path to full autonomy is simply to avoid the problematic cases.

If you can't handle the 'bike on a car' situation, fall back to a safe distance and continue the journey at a slower pace, reroute or wait for another (human driven) car to lead you through the situation. That's still full self driving.

And while it might not get you many friends, slowing down or stopping (situation permitting) and simply waiting for the situation to resolve itself is also a possible way around unknown issues.

This is probably a good approach. It won't make for a popular taxi ride, but it might work for getting the taxi to the pickup point.
 
It's worth noting that Musk is now absolutely confident in Autopilot, so he has likely decided to encourage leasing on a larger scale than in the past, so that Tesla will have a large taxi fleet coming off lease in 3 years.
I'd look for a stronger word. If there's one sentence to remember from Musk on the Autonomy Day conference, it's:
[Autopilot] is basically our entire expense structure
 
Oh my dear God! If this is true, the new raise plus this extends Tesla's runway to nearly 5 years even if it continues to lose money at this rate, by which time the semi truck will definitely make a killing with limited FSD on highway.

The Semi can make a killing even if it doesn't have any FSD at all. The economics of electric trucks are overwhelming vs. diesel.

Furthermore: Even if you have to pay a human driver with a CDL to sit in every truck, you can probably pay less if the truck mostly drives itself, since it's a less stressful job.
 
The Semi can make a killing even if it doesn't have any FSD at all. The economics of electric trucks are overwhelming vs. diesel.

Furthermore: Even if you have to pay a human driver with a CDL to sit in every truck, you can probably pay less if the truck mostly drives itself, since it's a less stressful job.
I think Semi will be bigger than Model 3. I don’t know why this is rarely discussed
 
This post shows the first example I have seen of the diagnostics reporting an issue before it causes a problem and automatically shipping the part to your service center.



Hopefully they can do more predictive failure detection like this, and it will actually streamline service.


We are living in the future. :)

How many Teslas have been named 'HAL'?

 
This is probably a good approach. It won't make for a popular taxi ride, but it might work for getting the taxi to the pickup point.

I think this is exactly what Elon is envisioning when he talks about launching Tesla Network.
The car will take a very cautious approach when travelling to the pick-up point. It may take long routes to avoid difficult junctions. Tesla will have remote operators on call if the car gets confused.
Once it picks up the passenger, they have to be in the passenger seat (have a license!) and be ready to take over. This is what Elon means when he says " There will be a little bit of an amphibian phase.” I doubt he plans to require hands on wheel detection though.
 
It's worth noting that Musk is now absolutely confident in Autopilot, so he has likely decided to encourage leasing on a larger scale than in the past, so that Tesla will have a large taxi fleet coming off lease in 3 years.

My prediction is that 3 years from now when the leases expire, there will be no robotaxi fleet yet. Tesla will then allow customers to buy their cars. Nothing wrong with that.
 
None of this is true. Absolutely none of it.

I completely agree with Fact Checking. I learned to drive 23 years ago: I spent 30 minutes practicing in a parking lot, then an hour studying the rules so I could pass the test. I didn't learn millions of edge cases. Since then I have driven more than a quarter million miles under all kinds of conditions without accidents or speeding tickets.

I'm just an average driver. Safe driving is a relatively simple task. You need to sleep well, know where you are going, pay attention while you are driving.

Almost all accidents happen due to the following reasons:
1. Driver didn't pay attention to the road (reading a map, texting, talking to someone in the back seat). AI doesn't make this mistake.
2. Driver stepped on the wrong pedal.
3. Driver followed the car ahead too closely.
4. Driver didn't see the car in the blind spot.
5. Driver misjudged the other car's speed.
6. Driver fell asleep.
7. Road rage.
The list goes on... AI doesn't make the kinds of mistakes that humans do. A well-designed robot car should be much safer than human drivers.
 
To make some unpleasant calculations -- suppose Tesla does go all-in on robotaxis before they're ready (which they shouldn't), and starts losing jury awards. How much are they exposed to?

Well, the value of a human life turns out to depend on *how* the person gets killed. Among other phenomena, the more robotic the method of death, and the deeper the pockets of the person or company being sued, the higher the awards. Traffic wrongful deaths tend to average around $10 million, but that's with a human to blame; I'd guess a robotaxi death would average $20 million. Tesla would be found negligent for releasing "self-driving" cars without a driver before they were ready, which would push it to the higher number -- corporate unsafe-practice awards can run $30-$100 million per death, so I may be underestimating.

The political fallout would be worse. Uber was allowed back on the road after killing someone, but most states will not be as laissez-faire as Arizona.

The deaths so far in Teslas are different because the driver was supposed to be supervising (and wasn't), so Tesla will win those cases. It's a huge difference.
 
I think we can now get a rough idea of the Tesla/Fiat credit deal with your model and the new info from FCA:
  • FCA says in 2018 global credit purchases and non-compliance fines cost €600m. Tesla sold $316m (€280m) of US GHG credits in 2018, presumably all to FCA. I would guess there was no EU fine last year? If so, they had €280m of US GHG purchases and a €320m US fine.
  • They expect 2019 to be moderately up from €600m.
  • FCA expected a €390m fine this year (well done @generalenthu with your €430m estimate!). This sounds like they are referring to EU, but not completely clear. With the new credit deal they now expect to be close to compliance in 2019.
  • EU compliance costs are now expected at around €120m this year. I think we can presume this is all credit purchases from Tesla. It also aligns with the $140m deferred reg credit revenue in Tesla's Q1 report.
  • We also know total FCA cost of EU and Nafta credit purchases is €1,800m. As far as we know this is all from Tesla.
  • From your model FCA's EU fine would be €2.5bn in 2020 and 2021.
  • The transcript isn't clear if they are talking 2020 or 2019, or global/EU, but it sounds like FCA expects 80% of its EU fine will be reduced by credits in 2020 (with 20% from other tech) and 15% in 2021 (with 40% NEV tech & 45% conventional tech).
So, assuming continued €280m per year credit purchases from Tesla in the US, that is €840m over the next 3 years, leaving €1bn of credit purchases in the EU.

We can estimate this €1bn EU credit purchase offsets the €390m 2019 fine, €2.5bn × 80% = €2bn 2020 fine, and €2.5bn × 15% = €375m 2021 fine, for a total of €2,765m. That makes the EU Tesla payment 36% of the estimated EU fine, so roughly 3x more cost-efficient for FCA. This number also roughly aligns with the €120m credit purchase for a €390m fine reduction in 2019.

I think 80% EU compliance would need about 170k EVs in 2020, so this looks like c.€4.2k per car in the EU ($4.7k, or likely +c.10% to Tesla's EU Model 3 gross margin).
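As a quick sanity check on that arithmetic, here is a back-of-envelope Python sketch. The fine percentages and the flat €280m/yr US purchase are the assumptions stated in this post, not reported figures:

```python
# Back-of-envelope check of the FCA/Tesla credit numbers above.
# All figures in EUR millions; percentages are the post's assumptions.
us_credits_per_year = 280            # assumed flat US GHG purchases, 2019-2021
total_deal = 1800                    # total FCA credit purchases (EU + NAFTA)

us_total = 3 * us_credits_per_year   # 840
eu_total = total_deal - us_total     # 960, which the post rounds to ~1000

# EU fines avoided: full 2019 fine, 80% of the 2020 fine, 15% of the 2021 fine
eu_fine_avoided = 390 + 0.80 * 2500 + 0.15 * 2500   # 2765

ratio = 1000 / eu_fine_avoided       # credit cost vs fine avoided (post's ~1bn figure)
print(us_total, eu_total, eu_fine_avoided, round(ratio, 2))  # 840 960 2765.0 0.36
```

The 0.36 matches the "36% of the estimated EU fine" figure above.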

If this is correct, Tesla should get c.€120m from FCA for the EU and c.€280m from FCA for the US in 2019 (total $450m). In 2020 they should get c.€720m for the EU and €280m in the US (total $1.1bn). Tesla should be able to record this as straight profit.

Anyone agree/disagree with these assumptions?

Agree
 
Indeed, quantifying uncertainty and responding with appropriate caution is really important. For example, if the car can't easily classify the object it is following, or if that object is behaving in atypical ways, then the car should not follow closely.

Human drivers do this all the time. The car ahead is doing something strange, not driving smoothly down the center of the lane, so you give it a wide berth. You don't need to know if the driver is drunk, having a heart attack, or the car has blown a tire. There are many edge-case reasons why the object is not smoothly conducting itself down the road. But it does not matter whether you can accurately diagnose the root cause. You simply know enough to back off.
If this is being implemented, it'll help a lot. (No evidence of implementation yet.) I know people who, facing weird situations, have simply decided "time to pull over and call the cops for a rescue". If this is implemented, they could get to pseudo-full self-driving much quicker. (Again, no signs that they're doing this yet.)

So a NN can definitely develop expectations about the typical behavior of objects seen on the road. When something is observed to depart substantially from these expectations, it should raise the level of uncertainty. In simple statistical modeling, we watch the residuals to see when the model is fitting poorly. There are many ways to measure departures from expectations, so it is possible to measure uncertainty in real time. The real-time response should be to back off and increase various margins of safety.
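As a toy illustration of that residual-watching idea (my sketch, not anything Tesla has shown), a running check against recent observations can flag when a tracked object stops behaving typically:

```python
# Toy sketch: flag atypical behavior of a lead vehicle by watching residuals
# against its recent lane-position history. Thresholds are made up for illustration.
from statistics import mean, stdev

def is_atypical(history, new_obs, k=3.0):
    """Return True when new_obs departs more than k sigma from recent behavior."""
    mu, sigma = mean(history), stdev(history)
    # Floor the spread so near-constant histories don't flag tiny sensor noise.
    return abs(new_obs - mu) > k * max(sigma, 0.05)

smooth = [0.01, -0.02, 0.00, 0.03, -0.01, 0.02]  # lateral offsets (m), steady driving
print(is_atypical(smooth, 0.02))   # within normal variation -> False
print(is_atypical(smooth, 0.90))   # sudden swerve -> True, so back off
```

A real system would track many signals (speed, heading, classification confidence) the same way, but the principle is just "large residual, raise uncertainty, increase margins."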
Yep. I look forward to the announcement where they say they're doing that. They didn't mention it at autonomy day, therefore they're not doing it yet.


So where is Tesla FSD on the Dunning-Kruger curve? Is it at the peak of Mt Stupid? Has it finished descending to the Valley of Despair? Or is it now climbing the Slope of Enlightenment?


Based on Karpathy's talk? I'd say they're just starting to descend from Mt. Stupid. It'll take a while. (Everyone else is still CLIMBING Mt. Stupid, so Tesla's still ahead.)
 
OT
Wheee! I am blocked on his twitter....he must not have liked the power of my 22 followers!
I think I have more $TSLAQ people that block me than I have actual followers.

Interesting thing, I have never followed one $TSLAQ member and have only engaged them a few times. I do follow a lot of the “FUD fighters” and maybe that is my link to being blocked. Not exactly sure but good riddance.
 
No, they can't. As you work out the far end of the curve, it's just more and more edge cases of lower and lower frequency.
Let's assume the probability distribution of edge cases causing crashes is geometric. We know the sum total of all these probabilities is 1.

Then, using the geometric series sum formula a1 / (1 - r) = 1, we can solve this.

I assumed the first 100 cases cause 90% of the crashes and tweaked a1 to get to 90%. You can use any number, it doesn't matter; you will just get a different a1 and r. As you can see, each extra 100 cases gets you to the next 9.

FSD9s-2.png


You could argue, the probability distribution is not geometric - and it is more lumpy. I'd agree - some of the cases (what we would call common cases, not edge), probably account for far more of the crashes. But once we start getting into edge cases and the long tail, geometric progression is more likely.

BTW, I tried a normal distribution, where for a given number of edge cases solved per unit time, the march of 9s is much faster.
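The "march of 9s" here is easy to reproduce; a minimal Python sketch under the same assumption (the first 100 cases cover 90% of crashes):

```python
# Geometric model of edge-case frequency: case n causes a1 * r**(n-1) of crashes,
# with the infinite sum a1 / (1 - r) normalized to 1.
# Assumption from the post: the first 100 cases cover 90% of crashes.
r = 0.1 ** (1 / 100)        # solve 1 - r**100 = 0.9 for the common ratio r
a1 = 1 - r                  # normalization so that a1 / (1 - r) == 1

def coverage(n):
    """Fraction of crashes covered after solving the first n edge cases."""
    return 1 - r ** n

for n in (100, 200, 300, 400):
    print(n, round(coverage(n), 4))
# Each extra 100 cases adds one more 9: 0.9, 0.99, 0.999, 0.9999
```

The "one 9 per 100 cases" pattern is exact under this assumption, since each block of 100 cases shrinks the uncovered tail by the same factor r**100 = 0.1.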
 

Attachments

  • FSD9s-N.png (6.4 KB)
Yes. They had this stupid conspiracy theory called #CRCL - Can’t Raise; Can’t Leave - which was stoked and encouraged by reporters like Charley Grant and Russ Mitchell.

According to this theory, SEC had sent a Wells Notice to Tesla, which they will have to disclose if they were to go for a cap raise. I don’t know about the Can’t leave part.. I can’t be bothered anyway. All these theories hit a solid wall when Elon raised cash like a Boss (or an Absolute Unit). :)

This is going to be the final nail in the coffin



Subscribe to read | Financial Times

As Elon tweeted on Feb 25th, the day the FCA deal was signed - “Fate loves irony” - very apt
Link doesn't work - instead cut paste this into search to get the article: "Fiat Chrysler Automobiles has said it will pay electric carmaker Tesla close to €2bn to help it meet tough new emissions targets"

Yes, but the article I was directed to by the FT was about FCA earnings and didn't quote anything about them paying Tesla, or what amount.

What am I missing? Where is the source of this info?

thanks in advance