Welcome to Tesla Motors Club

Does Tesla have a big math problem?

I watched the video with the car pulling into the road unsafely, leaving itself stuck in the road (not median) as it waited for oncoming traffic to pass - but forgetting it could get annihilated staying in the road like it did, etc. Did you even watch the video?
ok, but is that worse than in prior releases? And I would say any "objective" analysis would require a somewhat deeper investigation than "I watched a YouTube video".
 
Lol. "It's objectively worse" is your "anyone with a brain" review, yet you don't have the beta, have not used it, and choose to be a dozen software updates behind because you're on strike against the v11 UI. Lol. Your input has zero value, sorry.
You realize I'm not on V11, I'm on V10, and there's no such thing as a permanent strike on my version. Which, if you were honest, you would know. What benefits do I get by not updating? No eye-camera tracking, smooth, consistent, predictable highway AP, radar-based cruise/AP that is better than vision, AP that can exceed 85 mph even though I never do, etc.

But hey, you get Spotify I guess.
 
ok, but is that worse than in prior releases? And I would say any "objective" analysis would require a somewhat deeper investigation than "I watched a YouTube video".
How many of his videos have you watched? I've watched them since they came out. The last version made him turn right every time instead of even attempting to turn left; that's a regression back to the BEGINNING.
 
Please tell me the misquote:

"Next year for sure, we will have over a million robotaxis on the road," said Musk on October 21, 2019. "The fleet wakes up with an over-the-air update. That's all it takes."
Link ?

BTW, this is what you wrote above. Showing you are careless. So when you say dumb things like “anyone with a brain“, I just ignore you as a troll.

A MILLION robot axis on the road in 2021!
 
Link ?

BTW, this is what you wrote above. Showing you are careless. So when you say dumb things like “anyone with a brain“, I just ignore you as a troll.
Why don't you just Google it? Are you scared to find the truth? Do you work for Tesla?

Minute 4:10, straight from the horse's mouth - all Teslas being made at that time have the necessary hardware for FSD.
Minute 6:00, straight from the horse's mouth - first operable robotaxi 2020
Minute 9:20, straight from the horse's mouth - next year for sure we'll have over a million robotaxis on the road

Tesla investor day 2019 is what got me hyped for my car, though I didn't trust his robotaxi BS. He also says FSD will be feature complete by the end of 2019, which is why he then claims that in 2020, for sure, there will be over a million robotaxis on the road. Regardless of whether you believe it's some sort of white lie, that he really meant they could produce a million Teslas that could LATER become robotaxis, he is specifically selling the robotaxi concept and telling INVESTORS they'll have a million capable robotaxis that are just waiting on the government. That is bending the truth badly, and regardless, FSD is still not complete. Do you walk through life ignoring reality?

 
It's actually 3:13:30 in for the quote. This link should go there.

The next version should fix Chuck's unprotected left turn, and then we can close this thread. :p
I think the perception NN is simply not good enough to measure the speed and distance of vehicles with enough accuracy to do the turn. I can't understand how it could be a planner issue. It would be nice if they extended the visualization beyond 200 feet. Of course if they fix it the next step for Chuck is nighttime turns!
 
I think the perception NN is simply not good enough to measure the speed and distance of vehicles with enough accuracy to do the turn. I can't understand how it could be a planner issue.

It can be a planner issue because if the perception module can't accurately gauge the speed and distance of the traffic in order to make an unprotected left, then the planner should be smart enough to abandon the left-turn plan and tell the car to turn right and then make the next available U-turn. That's my thinking, at least.
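That fallback idea can be sketched in a few lines. This is a toy illustration only; every name and threshold below is invented for the example and has nothing to do with Tesla's actual planner.

```python
from dataclasses import dataclass

@dataclass
class GapEstimate:
    """Hypothetical output of a perception module watching cross traffic."""
    distance_m: float   # distance to the nearest oncoming vehicle
    confidence: float   # how much the perception stack trusts its estimate, 0..1

def choose_maneuver(gap: GapEstimate,
                    min_gap_m: float = 120.0,
                    min_conf: float = 0.8) -> str:
    """Commit to the unprotected left only when perception is both confident
    and reports a large enough gap; otherwise fall back to the safer
    right-turn-then-U-turn route."""
    if gap.confidence >= min_conf and gap.distance_m >= min_gap_m:
        return "unprotected_left"
    return "right_then_u_turn"
```

The point of the sketch is just that low perception confidence can be a planner input: the planner doesn't need perfect perception, it needs to know when perception is unreliable and route around it.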
 
It can be a planner issue because if the perception module can't accurately gauge the speed and distance of the traffic in order to make an unprotected left, then the planner should be smart enough to abandon the left-turn plan and tell the car to turn right and then make the next available U-turn. That's my thinking, at least.
Certainly. However, having the perception NN determine what it can't see sounds like an even harder problem to me. Hard to say, though, since my current assumption is that FSD Beta is optimized to make impressive YouTube videos rather than to minimize the risk of collision.
 
Why don't you just Google it? Are you scared to find the truth? Do you work for Tesla?
Dude, I don't have to "Google". I don't even use Google; I worked for a long time at MSFT, so I'm used to Bing.

Are you part of TSLAQ ?

I remember all those days. We have discussed this distortion by media umpteen times. You go read the market thread during those days. He was talking about "cars with hardware capable of FSD". He repeated that in multiple interviews that year. You go listen to those as well.

It's actually 3:13:30 in for the quote. This link should go there.

"by the middle of next year we will have over a million Tesla cars on the road with full self-driving Hardware ..."



 
Dude, I don't have to "Google". I don't even use Google; I worked for a long time at MSFT, so I'm used to Bing.

Are you part of TSLAQ ?

I remember all those days. We have discussed this distortion by media umpteen times. You go read the market thread during those days. He was talking about "cars with hardware capable of FSD". He repeated that in multiple interviews that year. You go listen to those as well.



"by the middle of next year we will have over a million Tesla cars on the road with full self-driving Hardware ..."




Sorry, I put the time code a little earlier to make sure there was context. You seem to have stopped listening after the word "hardware".

Full quote: "By the middle of next year we'll have over a million Tesla cars on the road with full self-driving hardware, feature complete, at a reliability level that we would consider that no one needs to pay attention, meaning you could go to sleep. From our standpoint, if you fast forward a year, maybe a year and three months, but next year for sure we will have over a million robotaxis on the road. The fleet wakes up with an over-the-air update; that's all it takes."
Is "next year for sure" a promise? It's hard for me to say given the speaker.
I don't think he's backed away from this except that the schedule keeps slipping. He's still predicting reliability suitable for robotaxi operation next year.
 
I think the perception NN is simply not good enough to measure the speed and distance of vehicles with enough accuracy to do the turn. I can't understand how it could be a planner issue. It would be nice if they extended the visualization beyond 200 feet. Of course if they fix it the next step for Chuck is nighttime turns!
Interesting, I've always felt it was the perception NN having trouble deciding if there was a gap in the median that allowed the car to "pause" there during the ULT (or even make the ULT at all).
 
Sorry, I put the time code a little earlier to make sure there was context. You seem to have stopped listening after the word "hardware".

Full quote: "By the middle of next year we'll have over a million Tesla cars on the road with full self-driving hardware, feature complete, at a reliability level that we would consider that no one needs to pay attention, meaning you could go to sleep. From our standpoint, if you fast forward a year, maybe a year and three months, but next year for sure we will have over a million robotaxis on the road. The fleet wakes up with an over-the-air update; that's all it takes."
Is "next year for sure" a promise? It's hard for me to say given the speaker.
I don't think he's backed away from this except that the schedule keeps slipping. He's still predicting reliability suitable for robotaxi operation next year.
Yeah I don’t see how this can be construed as only meaning “capable” vehicles. It’s pretty clear.

And I love the promise, even if it was a bit optimistic on timing. Looking forward to my silky smooth omnipotent robotaxi with current hardware. Not quite seeing the path, but must keep the faith!

"It's objectively worse" is your "anyone with a brain" review, yet you don't have the beta and have not used it.

To be fair, using the beta would make a typical user even more concerned about the current state of the system than watching a YouTube video.
 
Tesla just made a deal with Samsung to buy much higher-resolution cameras for future cars. They also have HW4, which is more powerful, in the wings. The OP may be right that there are situations where the current system cannot react quickly enough to avoid every single collision, but it is powerful enough to avoid the vast majority of them. The future system will be able to avoid an even higher percentage.
 
"by the middle of next year we'll have over a million Tesla cars on the road with full self-driving hardware feature complete at a reliability level that we would consider that no one needs to pay attention, meaning you could go to sleep in your from our standpoint. If you fast forward a year, maybe a year and three months but next year for sure we will have over a million robotaxis on the road. The fleet wakes up with an over-the-air update that's all it takes."
Let us take a look at that.

He starts by saying there will be "a million Tesla cars on the road with full self-driving hardware".

Just because he omits "hardware" next time doesn't mean he has changed his position in seconds.

You guys behave like dubious political operatives who take quotes out of context and misrepresent as much as possible.

PS: In the AGM that happened after AI Day, they clearly show they are talking about 1 million "robotaxi capable" cars. But haters continue their disinformation campaign.

Here are the receipts.


[attached screenshots]
 
Second question: is 1280x960 enough resolution? People see at significantly higher resolution, and we can differentiate objects at greater distances than a camera of this resolution can. That means Tesla has fewer frames in which it even has a chance to identify a car coming at you at a T intersection at 60-70 mph.

Can someone tell me how I’m wrong?

Well, here's something. I don't want to say wrong; "wrong" is too strong a word.

The optic nerve doesn't have anything like the bandwidth necessary to carry all the signals from the entire retina (the sensing portion of the eye), so there's a lot of preprocessing before the signals are sent to the brain. That's the basis for a lot of optical illusions.

Your high-resolution vision covers only a tiny, tiny portion of the visual field, at the very center of where you are looking. The brain fills in the rest, making you think you have high resolution throughout your visual field. The brain also fills in color: you can only sense color in the center; otherwise it's grayscale, and your brain helpfully fills that in for you as well. And there's a blind spot; the brain fills that in too.

Certain types of visual information immediately cause you to move your eyes. Movement is one: if you see unexpected movement out of "the corner of your eye," you'll immediately look toward that movement, and then you get the high-resolution view.

Eyes are neat. You have 3 colors of receptors in the normal human eye, but only a tiny fraction of them are blue receptors. By far most are red, followed by green, then blue. Why? Blue light is scattered, just as it is in the sky, so more receptors wouldn't help. And the rods: people think they're for night vision only, but the rods are also used for daylight vision. The rods provide virtually all your off-axis vision. They don't carry color information.

And air: it's opaque over most light wavelengths. There's a band in which air is transparent, and your eyes evolved to see in precisely that transparent wavelength band. For fun, look at the curve of light transmission through air vs. wavelength, then look at the sensitivity of the eye vs. wavelength. It's amazing… well, until you think about it. It had to be that way.

Back to the subject, the camera will have the same resolution throughout the frame. Comparing the whole frame resolution of the camera to the resolution of a pair of human eyes just isn’t valid. And we didn’t even touch on 3D perception. That’s a whole other discussion. And there are brain circuits that keep your eyes accurately pointed even as you move your head. And there are inputs from the middle ears.
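To put rough numbers on the 1280x960 question, here is a back-of-envelope pinhole-camera estimate. The 50° horizontal field of view and the 1.8 m car width are assumptions chosen for illustration, not Tesla's published specs; the scaling is what matters.

```python
import math

def pixels_on_target(target_width_m: float, distance_m: float,
                     sensor_width_px: int = 1280,
                     hfov_deg: float = 50.0) -> float:
    """Approximate horizontal pixels a target subtends in a pinhole camera."""
    # Angle subtended by the target, in degrees
    angle_deg = math.degrees(2 * math.atan(target_width_m / (2 * distance_m)))
    # That angle as a fraction of the field of view, converted to pixels
    return sensor_width_px * angle_deg / hfov_deg

# With these assumptions, a ~1.8 m wide car at 150 m spans on the order of
# 15-20 pixels, and a car closing at 30 m/s (~67 mph) covers those 150 m
# in about 5 seconds.
car_px = pixels_on_target(1.8, 150.0)
time_budget_s = 150.0 / 30.0
```

Halving the distance roughly doubles the pixel count, so most of the usable frames arrive late, when the oncoming car is already close; that is the sense in which a fixed-resolution sensor gives the system "fewer frames" to work with.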

Did I say “neat”? Neat doesn’t even begin to describe just how stunningly amazing our eyes are.

OK, you guys are here for the Tesla stuff, not to hear me go on about eyes. But if I’m comparing Tesla to eyes, there’s no comparison, none at all.

Best,
David