Absolutely. Of course I’ll still think it wasn’t their original plan.
I’ll always think that FSD beta is a beta version of FSD robotaxi and not a driver assist.

I mean, we know for a fact that isn't true. Tesla themselves told us it's not true in the CA DMV documents, where they make clear it was explicitly intended to be an L2 solution and not anything higher.


Unless you're one of those "I take what Tesla says literally except for those times I disagree and then they were just saying it for legal reasons and we can ignore THOSE times" people?
 
I mean, we know for a fact that isn't true. Tesla themselves told us it's not true in the CA DMV documents, where they make clear it was explicitly intended to be an L2 solution and not anything higher.


Unless you're one of those "I take what Tesla says literally except for those times I disagree and then they were just saying it for legal reasons and we can ignore THOSE times" people?
Yep. Now that Dan O’Dowd has run two Super Bowl ads this year, Tesla will have no choice but to sue. Then we will see the internal emails and see who is right. I’m expecting that Elon told the team to develop the world’s best robotaxi software.
 
Yep. Now that Dan O’Dowd has run two Super Bowl ads this year, Tesla will have no choice but to sue. Then we will see the internal emails and see who is right. I’m expecting that Elon told the team to develop the world’s best robotaxi software.


We already have seen the internal emails.

Tesla discussing FSDb with the CA DMV said:
we do not expect significant enhancements in OEDR or other changes to the feature that would shift the responsibility for the entire DDT to the system. As such, a final release of City Streets will continue to be an SAE Level 2, advanced driver-assistance feature

FSDb, by intent and design, is an L2 feature-- and the final release is ALSO, by design and intent, L2. It lacks the OEDR to be higher and they do not expect that to change.

It's NOT a beta test of an L3 or higher level system. Unless you think Tesla is just outright lying to the government repeatedly.
 
FSDb, by intent and design, is an L2 feature-- and the final release is ALSO, by design and intent, L2. It lacks the OEDR to be higher and they do not expect that to change.

It's NOT a beta test of an L3 or higher level system. Unless you think Tesla is just outright lying to the government repeatedly.

That quote wasn't about FSD Beta as a whole. Only the "Autosteer on City Streets" feature. Here's the very next sentence from the communication with the California DMV, which makes it clear that Tesla intends to eventually develop L3 and higher levels of autonomy:

... As such, a final release of City Streets will continue to be an SAE Level 2, advanced driver-assistance feature.

Please note that Tesla’s development of true autonomous features (SAE Levels 3+) will follow our iterative process (development, validation, early release, etc.) and any such features will not be released to the general public until we have fully validated them and received any required regulatory permits or approvals.
 
That quote wasn't about FSD Beta as a whole. Only the "Autosteer on City Streets" feature.

That's literally what FSDb is, and it's referred to that way (including Tesla and CA DMV scheduling a demo of it in late 2020) in the email chain.



Here's the very next sentence from the communication with the California DMV, which makes it clear that Tesla intends to eventually develop L3 and higher levels of autonomy:

Yes, they intend, in the future, to develop DIFFERENT software to do that.

Which is not the FSDb software.
 
Yes, they intend, in the future, to develop DIFFERENT software to do that.

Which is not the FSDb software.

Also worth noting that the same communication contains this line:
Again, a full deployment of City Streets to the customer fleet is not expected in the immediate future.

So I think it's fair to say that the information is outdated, and not too useful for informing this conversation.
 
How do you know this?
Because ChatGPT told me so: :):):)

=======================================
Waymo vehicles, which are part of the Waymo One service, are capable of driving in specific areas that have been extensively mapped and where they have been trained to operate. These areas typically include regions where Waymo has conducted extensive testing and mapping efforts, such as certain cities and suburban areas.

While Waymo's autonomous vehicles are highly sophisticated and can handle a wide range of driving scenarios, they are not designed to drive anywhere autonomously. Factors such as road conditions, local regulations, weather conditions, and infrastructure play a significant role in determining where autonomous vehicles can operate safely and effectively.

Waymo continually expands its operational domains through rigorous testing, mapping, and validation processes. However, the ability of Waymo vehicles to drive in new locations depends on various factors, including regulatory approval, mapping efforts, and the development of advanced algorithms to handle diverse driving environments.
 
Because ChatGPT told me so: :):):)

=======================================
Waymo vehicles, which are part of the Waymo One service, are capable of driving in specific areas that have been extensively mapped and where they have been trained to operate. These areas typically include regions where Waymo has conducted extensive testing and mapping efforts, such as certain cities and suburban areas.

While Waymo's autonomous vehicles are highly sophisticated and can handle a wide range of driving scenarios, they are not designed to drive anywhere autonomously. Factors such as road conditions, local regulations, weather conditions, and infrastructure play a significant role in determining where autonomous vehicles can operate safely and effectively.

Waymo continually expands its operational domains through rigorous testing, mapping, and validation processes. However, the ability of Waymo vehicles to drive in new locations depends on various factors, including regulatory approval, mapping efforts, and the development of advanced algorithms to handle diverse driving environments.
Obviously they need a safety driver when driving in locations where they haven’t validated safety. That doesn’t answer the question of how well they would work with a safety driver in your neighborhood.
 
Obviously they need a safety driver when driving in locations where they haven’t validated safety. That doesn’t answer the question of how well they would work with a safety driver in your neighborhood.
That wasn't part of my original statement.

I just said if you dropped a Waymo vehicle off on my driveway tomorrow it would not be able to drive itself around my block. A Waymo is not equipped to do that without a lot of expensive preparation and help. But a Tesla is.

@spaceco claimed that Waymo is 10x more advanced in every way. But upon delivery, Waymo can't tackle the task of driving around my block. This is something a Tesla does with ease.
 
That wasn't part of my original statement.

I just said if you dropped a Waymo vehicle off on my driveway tomorrow it would not be able to drive itself around my block. A Waymo is not equipped to do that without a lot of expensive preparation and help. But a Tesla is.

It's not though.

The Tesla can NOT drive itself around your block.

It can assist an actual human driver in doing so though.

Waymos can drive infinitely more miles by themselves than Teslas can, because no Tesla can drive itself at all. As Tesla themselves tell you when you buy FSD, or when you read the manual, or when you turn FSD on.
 
All fun and games aside, Tesla and Waymo are taking two very different approaches.

The question for Tesla has always been, "Will its system ever get good enough to use for a robotaxi?"

The question for Waymo has always been, "Will its approach lead to a profitable business?"

The answer to both questions is unknown. And the answer to both could be "yes". Or the answer to both could be "no". Or some combination of the two.

But there is little doubt that Tesla's system, if it works, will be cheaper and easier to deploy in new locations. It will be cheaper and easier to operate. The upside for Tesla is far greater. And as the low-cost leader, it would eventually drive Waymo out of business.

To a Tesla investor, Waymo doesn't really matter. All that matters is Tesla. If Tesla succeeds, Tesla wins.
 
It's not though.

The Tesla can NOT drive itself around your block.

It can assist an actual human driver in doing so though.

Waymos can drive infinitely more miles by themselves than Teslas can, because no Tesla can drive itself at all. As Tesla themselves tell you when you buy FSD, or when you read the manual, or when you turn FSD on.
Semantics.
 
Alright pixel-peepers, I did some sleuthing. It seems the leaked V12 video from a month ago is using HW4 (check out the pics below). So the concern right now is: HOW GOOD IS HW3 with V12? We have no real evidence V12 is working well with HW3 yet; Omar is on HW4. Let me know if you think I'm wrong!

Here is the leaked video I'm referring to:
[Three screenshots from the leaked video, captured 2024-02-12]
 
I guess the number one thing I want fixed is the lane changes to "follow the route" that are actually away from the route. But that is likely a map-data-related problem, so no FSDb update is likely to resolve it.
Now that Navigation shows upcoming traffic lights and stop signs, it should be easier to tell when certain aspects of map data are wrong and confusing FSD Beta. For example, 11.x currently slows down on my routes looking for an incorrectly mapped traffic light at a crosswalk and a stop sign that only existed during construction, so hopefully the 12.x neural networks have learned to ignore wrong map data when it's clearly visible that the mapped traffic control doesn't exist.

If it is capable of doing that, it would be a good sign that 12.x might also be able to ignore incorrect "follow the route" map data. These can come from map data with an incorrect number of lanes or wrong turn-lane designations, or just incomplete data where 11.x "knows" there's an upcoming right turn without realizing a dedicated lane will appear for that turn. So far, 12.1.2 videos seem to show it preparing for turns relatively late; that might be to avoid unnecessary lane changes when it actually needs to go straight, but it could also mean more problems when it does need to turn.

These unnecessary lane changes are one of my top interventions. They probably leave other drivers confused as I try to quickly cancel the turn signal, or press the turn signal button too many times and then have to cancel again just to stay straight. At least they happen at pretty consistent locations, so I know where disengaging is important to avoid taking the wrong fork at high speed.
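To make the idea of preferring live camera evidence over stale map data concrete, here is a rough sketch of the kind of reconciliation I have in mind. This is purely illustrative and assumes nothing about Tesla's real implementation; LaneInfo, should_change_lanes_for_turn, and the 300 m threshold are all invented for the example.

from dataclasses import dataclass

@dataclass
class LaneInfo:
    lane_count: int                  # how many lanes this source believes exist here
    has_dedicated_turn_lane: bool    # does this source see/know about a turn lane?

def should_change_lanes_for_turn(map_lanes: LaneInfo,
                                 camera_lanes: LaneInfo,
                                 distance_to_turn_m: float) -> bool:
    """Decide whether to start a 'follow the route' lane change now (hypothetical rule)."""
    if camera_lanes.has_dedicated_turn_lane and not map_lanes.has_dedicated_turn_lane:
        # The map is stale or incomplete: a dedicated turn lane is visible ahead,
        # so there is no need to merge over early.
        return False
    # Otherwise fall back to map-driven behavior: change lanes once the turn is close.
    return distance_to_turn_m < 300.0

# Example: map thinks 2 lanes and no turn lane, cameras see 3 lanes with a turn lane forming.
print(should_change_lanes_for_turn(LaneInfo(2, False), LaneInfo(3, True), 250.0))  # -> False

Something along those lines is what I'd hope 12.x ends up doing, whatever the actual mechanism.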
 
Now that Navigation shows upcoming traffic lights and stop signs, it should be easier to tell when certain aspects of map data are wrong and confusing FSD Beta. For example, 11.x currently slows down on my routes looking for an incorrectly mapped traffic light at a crosswalk and a stop sign that only existed during construction, so hopefully the 12.x neural networks have learned to ignore wrong map data when it's clearly visible that the mapped traffic control doesn't exist.

If it is capable of doing that, it would be a good sign that 12.x might also be able to ignore incorrect "follow the route" map data. These can come from map data with an incorrect number of lanes or wrong turn-lane designations, or just incomplete data where 11.x "knows" there's an upcoming right turn without realizing a dedicated lane will appear for that turn. So far, 12.1.2 videos seem to show it preparing for turns relatively late; that might be to avoid unnecessary lane changes when it actually needs to go straight, but it could also mean more problems when it does need to turn.

These unnecessary lane changes are one of my top interventions. They probably leave other drivers confused as I try to quickly cancel the turn signal, or press the turn signal button too many times and then have to cancel again just to stay straight. At least they happen at pretty consistent locations, so I know where disengaging is important to avoid taking the wrong fork at high speed.
I've wondered for a while how/why FSD follows the map more than it does the cameras. It seems like it's primarily using the cameras to confirm where it is on the map rather than for driving. If this is the case, then it's really hardly better than Waymo.
 
I've wondered for a while how/why FSD follows the map more than it does the cameras.

Because blowing through an obscured stop sign that the cameras miss but that is on the map is a much worse mistake, safety-wise, than slowing down to stop at a sign the map says is there but the cameras don't see... (the second one is also a lot easier for a driver to notice and correct for safely).
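As a toy illustration of that asymmetry, the safe way to fuse the two sources is to obey the union of what the map and the cameras each report, since a phantom stop is merely annoying while a missed one is dangerous. This is just a sketch of the argument, not a claim about how FSD actually works; the function and example names are invented.

def controls_to_obey(map_controls: set, camera_controls: set) -> set:
    """Obey every traffic control reported by either the map or the cameras."""
    return map_controls | camera_controls

# Example: the map knows about a stop sign the cameras can't see (occluded by a truck),
# and the cameras see a temporary light the map doesn't know about. Both get obeyed.
print(sorted(controls_to_obey({"stop_sign @ 3rd Ave"}, {"temp_light @ Main St"})))
# -> ['stop_sign @ 3rd Ave', 'temp_light @ Main St']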


Best not to engage knighty - if it's not semantic with him, it's pedantic. Often it's both. Most often it's pointless.


Do you legitimately find the distinction between a system that can self drive and one that can't to be only semantic or pedantic, rather than significant? (both functionally and legally)

Or did you just want to insult someone again?