> It’s a shame that we cannot take Elon’s statements at face value.
As is true for every other human on the face of the earth.
> As is true for every other human on the face of the earth.
There’s a concept called credibility.
> There’s a concept called credibility.
That's still no different from everyone else selling products.
Of course there are no guarantees. But that is not the point!
As a world-class (best ever?) sales/marketing guy who places less importance on technical correctness, Elon definitely uses creative license from time to time!
> That's still no different from everyone else selling products.
And I did not say otherwise!
> Not clear that what we call machine learning/NNs bears any resemblance to the human brain.
Um, I've been reading and listening to some fairly interesting podcasts, and most of this is sourced, one way or the other, from Real Researchers. Which, of course, I'm not.
It’s just a term of art - it’s not meant to suggest we are duplicating the function of the brain!
No one knows exactly how our brain actually works yet. We know even less about the brain than we know about NNs.
> There’s a concept called credibility.
As the CEO of the only tech company in the world with an ADAS working on city streets in an entire country, Elon’s credibility is very high.
> Those of us in the RoW are stuck on the 7-year-old stack.
Elon said on the stream, and I quote: "We got FSD 12 test drivers around the world".
If the system is told that driving on the road is good, why does it have to be told that driving off a cliff is bad? Because it may have enough weak positive cues (e.g. poorly marked roads) that it would entertain driving off the cliff?
If so, can't that be addressed by having the system flag that it doesn't have enough confidence in any good action, so that it requires the driver to take over?
Isn't that part of the whole SAE driving levels thing?
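The takeover idea above can be sketched in a few lines. This is purely my own illustration (not anything Tesla has described): a planner picks the highest-scoring action, and if even the best action falls below a confidence threshold, it hands control back to the driver. The action names and the 0.8 threshold are made up.

```python
# Hypothetical sketch, not Tesla's actual logic: gate the planner's output
# on the policy network's confidence in its best candidate action.

def select_action(action_scores: dict[str, float],
                  min_confidence: float = 0.8) -> str:
    """Return the highest-scoring action, or request takeover if the
    network is not confident enough in any action."""
    best_action = max(action_scores, key=action_scores.get)
    if action_scores[best_action] < min_confidence:
        return "REQUEST_DRIVER_TAKEOVER"
    return best_action

print(select_action({"follow_lane": 0.95, "turn_left": 0.03}))      # → follow_lane
print(select_action({"follow_lane": 0.40, "veer_off_road": 0.35}))  # → REQUEST_DRIVER_TAKEOVER
```

The point is that "entertaining driving off a cliff" with weak confidence never wins outright; it just trips the takeover path.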
In light of the V12 livestream, I have some questions about Tesla's end-to-end video training approach:
- In the US, different areas have different road infrastructure, different driving behaviors, and different traffic rules. Won't it be hard to truly generalize one NN to handle all of those differences? Won't Tesla need to collect training data from basically everywhere to train the NN on all the differences? Will Tesla need to backtrack on the "all nets" approach and use some special rules for certain areas?
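For what it's worth, the standard way to train one network across regions without everything collapsing toward the dominant region is to balance (stratify) the training mix. A toy sketch, with entirely made-up clip names and region codes, not Tesla's pipeline:

```python
import random

# Illustrative only: sample the same number of training clips from every
# region so that abundant California data doesn't drown out sparse regions.

def balanced_batch(clips_by_region: dict[str, list[str]],
                   per_region: int, seed: int = 0) -> list[str]:
    """Draw an equal-sized sample from each region's clip pool."""
    rng = random.Random(seed)
    batch = []
    for region, clips in sorted(clips_by_region.items()):
        batch.extend(rng.sample(clips, min(per_region, len(clips))))
    return batch

clips = {
    "US-CA": [f"ca_{i}" for i in range(1000)],  # lots of California data
    "US-MA": [f"ma_{i}" for i in range(50)],    # fewer Boston clips
    "DE":    [f"de_{i}" for i in range(20)],    # sparse European data
}
batch = balanced_batch(clips, per_region=20)    # 20 clips per region, 60 total
```

This doesn't answer whether one NN *can* absorb all the differences, but it shows why "collect data from basically everywhere" and "one net" aren't necessarily in conflict.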
> Maybe the most refreshing info from this stream was Elon finally admitting V11.x is a kludgy mess.
And some time back, the previous versions were a "kludgy mess" and V10 or V11 was going to be the awesome sauce.
> Elon said on the stream & I quote "We got FSD 12 test drivers around the world"
Yours is GA; he refers to a couple of test mules.
Patience! This is hugely positive news: the “ADAS test operators” that Tesla has been recruiting all over the world are testing FSD V12!
(Damnit, I should have applied for the job)
> City Streets ADAS is SHIPPING FUNCTIONALITY WAY ahead of everyone else.
I would not argue with this at all (with the caveat that I have no experience with other systems, with the exception of a primitive Toyota one).
> I wonder how they will prevent hallucinations with this method?
"Hallucination" has been used to describe when large language models put together words that sound like facts but are false. More general, and more relatable to what we've experienced with FSD Beta, are false positives, e.g. believing there's a pedestrian at a mailbox when there's nobody there. Traditional control code tells the vehicle to go around pedestrians, but a control network could potentially learn that the mailbox doesn't require swerving around, based on other signals over time.
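One common way to suppress one-off false positives like the mailbox-pedestrian (my own toy sketch of the "other signals over time" idea, not anything Tesla has described) is temporal persistence: only act on a detection once it has appeared in most of the recent frames. The 4-of-5 window here is an arbitrary choice.

```python
from collections import deque

# Toy sketch: require a detection to persist across several consecutive
# frames before the planner reacts to it, filtering single-frame flickers.

class PersistenceFilter:
    def __init__(self, window: int = 5, required: int = 4):
        self.history = deque(maxlen=window)  # recent per-frame detections
        self.required = required             # hits needed to confirm

    def update(self, detected_this_frame: bool) -> bool:
        """Record this frame's detection; return True once it persists."""
        self.history.append(detected_this_frame)
        return sum(self.history) >= self.required

f = PersistenceFilter()
confirmed = [f.update(d) for d in [True, False, True, False, False]]
# a flickering "pedestrian" never reaches 4-of-5, so it is never confirmed
```

A steady sequence of `True` frames, by contrast, confirms on the fourth frame. The trade-off is latency: a real pedestrian also takes a few frames to confirm.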
In fact, he refers to a few countries.. And these countries have test mules in all the major cities.Yours is GA, he refers to a couple of test mules
> ...now with end-to-end networks, any incorrect control behavior could somewhat be considered a hallucination if you want to call it that.
Yeah, that is what I am referring to. I think exactly how this sort of thing will manifest in driving is unclear (to me at least). But I also think that occasional unexplained (or easily explained by a human) errors are likely with an unconstrained approach (as advertised in the video; this is what I specifically wonder about with Elon’s claims).
> No surprise, the head of Tesla's FSD says FSD will be the bestest thing ever. lol.
I always took Ashok as a yes-man who wants to be liked by Elon, compared to Andrej.