I have to admit the number one thing I like about Elon is his ability to take risks. Will the risk pay off? We will have to see. My guess is that this will be a big win for Tesla. I think the car is unlikely to hit anything. It does drive badly, but it will not likely hit anything. Yes, it can get hit by others because of its erratic behavior.

This is a terrible approach to AV safety. Risk-taking is fine in certain situations, but not with AV safety. First, I don't think we can say that the car is unlikely to hit anything. What do you base that on? Data? Second, driving badly is a bad thing. It can lead to hitting something or causing another driver to hit you or someone else. If the erratic behavior causes another human driver to hit something, the Tesla owner would still be liable. At a minimum, it would be terrible PR to have news stories about Tesla FSD driving badly and causing other drivers to hit things. So that is not something you want for Tesla. Lastly, "not hitting stuff" is too low a standard. We also want AVs to show good road citizenship. That means AVs need to drive properly and respect other drivers. You don't want Tesla FSD to cause road rage incidents, upset other drivers, or cause accidents. That would not be a win for Tesla. So no, I do not agree with this approach of taking the risk and hoping everything will be fine.
 
One more thing. There is a concept in AV safety called "absence of unnecessary risk". The idea is that there is always some unavoidable risk in driving, but the goal of true safety should be to minimize the unnecessary risk. This is important because safety is not just about minimizing collisions. An absence of collisions does not necessarily mean that your AV is safe, because it is possible the AV just got lucky in avoiding them. You do not want to base safety on your AV being "lucky" in not getting into collisions. So you want to minimize unnecessary risk in order to truly reduce the chance of collisions and therefore ensure higher safety. Bad or erratic driving increases unnecessary risk and therefore can decrease safety. The erratic driving may not cause an accident every time, but it unnecessarily increases the chance of an accident and therefore should be minimized.

Here are a few examples of driving maneuvers that are considered unsafe because they increase unnecessary risk:
- driving too close to the car in front
- driving too close to cyclists/pedestrians
- cutting in too close or without signaling
- speeding
- unnecessary lane changes
- stopping too abruptly
- swerving into the turn-only lane at the last minute because you almost missed your turn
- "gunning it" through an intersection to beat the red light

This is just a sample; I am sure there are more examples of bad driving we could think of. And I am not saying that FSD Beta does all of these. My point is that we want FSD Beta to avoid erratic driving that would increase unnecessary risk.
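
To make the idea concrete, here is a minimal sketch of how "unnecessary risk" can be scored with surrogate safety metrics instead of collision counts. This is my own illustration, not anything Tesla publishes, and the thresholds (1.0 s headway, 2.0 s time-to-collision, 3 m/s² deceleration) are assumed values picked purely for the example.

```python
from dataclasses import dataclass

# Assumed thresholds, for illustration only; a real safety program would
# tune these against naturalistic driving data and standards work.
MIN_TIME_HEADWAY_S = 1.0   # following closer than this is "too close"
MIN_TTC_S = 2.0            # time-to-collision below this is risky
MAX_DECEL_MPS2 = 3.0       # braking harder than this counts as abrupt

@dataclass
class Snapshot:
    speed_mps: float          # ego speed
    gap_m: float              # distance to the lead vehicle
    closing_speed_mps: float  # positive when closing on the lead vehicle
    decel_mps2: float         # current deceleration (positive = braking)

def risk_events(s: Snapshot) -> list[str]:
    """Surrogate-risk events present in a single telemetry snapshot."""
    events = []
    if s.speed_mps > 0 and s.gap_m / s.speed_mps < MIN_TIME_HEADWAY_S:
        events.append("tailgating")
    if s.closing_speed_mps > 0 and s.gap_m / s.closing_speed_mps < MIN_TTC_S:
        events.append("low_time_to_collision")
    if s.decel_mps2 > MAX_DECEL_MPS2:
        events.append("hard_braking")
    return events

def unnecessary_risk_rate(drive: list[Snapshot]) -> float:
    """Fraction of snapshots with at least one risk event.

    A drive can end with zero collisions and still score badly here,
    which is exactly the "lucky vs. safe" distinction above.
    """
    flagged = sum(1 for s in drive if risk_events(s))
    return flagged / len(drive) if drive else 0.0
```

A system that keeps this rate low is reducing risk even on drives where nothing was hit, which is the whole point of the "absence of unnecessary risk" framing.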
 
I have to admit the number one thing I like about Elon is his ability to take risks. [...]
1. He is not taking any risks. You are. Tesla can hide behind "this is an L2 system" and "the driver is always in control"; you, on the other hand, will get stuck with the bill and/or the injury, or die.

2. There are hands-free L2 systems deployed currently. It's not a matter of taking risks; it's a matter of designing a robust system that mitigates risk.
 
Here are a few examples of driving maneuvers that are considered unsafe because they increase unnecessary risk: [...]
I still find it mind-blowing that there are many parts of the US where NOT doing some of these actions gets the drivers around you pissed off and honking: driving the speed limit, coming to a complete stop at a stop sign, or not gunning it through a yellow light (because the guy behind you wanted to run it too).
 
Ah, so you're the guy ACTUALLY driving 10 mph up 5 floors in the parking garage when I have a meeting to get to ;)
 
I'm the guy who drives 65 mph in the left lane of the freeway because Google tells me there is a red zone coming up. Yes, I get honked at, but I find it amusing when I catch up to them a minute later, once the stop-and-go traffic appears.
 
Bad or erratic driving increases unnecessary risk and therefore can decrease safety. [...]
It’s so erratic that it’s incomprehensible some days. I was finally fed up with mine and requested the service department go through the car’s computers and cameras from front to back. I have had so many dangerous issues that the technology is all but unusable, because it has become a safety hazard to drive. I simply can’t be ultra-focused 100% of the time, and with FSDb you have to be, more so than in any other car on the road! Even dialing the technology back doesn’t help. Driving to the Tesla service department today I simply had TACC, the most basic of the technology: maintain a set speed and don’t slam into the car in front of me. And what happens? With cruise set at 65 mph in a 60 zone, the car slows for no reason to 44 mph. I accelerate as the traffic behind me approaches, and a few hundred yards later the car slows to a crawl. At 32 mph I finally pulled onto the shoulder, and as we slowed to almost 20 mph I stopped before we ran over some debris on the shoulder! Simply unfathomable. It does this constantly in NoA, TACC, or FSDb. These kinds of things are just unforgivable.
 
I was finally fed up with mine and requested the service department go through the car’s computers and cameras from front to back.

Please let us know if they find any issues with your hardware. FSD Beta experiences vary so widely that I often wonder if there are some unlucky vehicles out there with hardware issues.

I would understand if my experiences were inconsistently inconsistent. But I've never had a day with FSD Beta as bad as what some people on this forum describe experiencing on a daily basis.
 
Well, he does not mention the 10k miles requirement. So who knows with Elon?! Maybe the 10k miles requirement is gone?
I think the "10K miles requirement" was only an offhand proposal by Whole Mars when he first tweeted his request to remove the nag. Elon never said anything specific about such a requirement, so I don't think there's any reason to believe that's the plan.

Maybe so, but personally I doubt it - it would mean that the average new FSD user would have to wait the better part of a year to have the nag removed.

So I would caution against building up the 10K miles thing as any kind of an established plan.
 
Instead of chasing 0.99999 for L4, why not a SOLID L3 that works >95% of the time from on-ramp to off-ramp and then hands back to L2 (FSD Beta)? That would seem very useful and more doable soon.
I, for one, would like to own an L4/L5 car in which I can sleep while it does a long drive for me. That could replace flying and hotels on many occasions, especially if the car had beds built in.

In a way, this could be seen as a "private jet" for the middle class.
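
As a rough note on the 0.99999 versus >95% figures quoted above: the gap between them is easy to underestimate. Here is a back-of-the-envelope sketch, assuming purely for illustration that "works X% of the time" means an X% chance of completing one on-ramp-to-off-ramp segment without the human needing to take over, and an arbitrary 500 segments driven per year.

```python
# Back-of-the-envelope comparison of the ">95%" and "0.99999" figures above.
# Assumption (mine, for illustration): "works X% of the time" = X% chance of
# finishing one highway segment with no human takeover.

def expected_takeovers(per_segment_success: float, segments: int) -> float:
    """Expected number of human takeovers over a given number of segments."""
    return (1.0 - per_segment_success) * segments

SEGMENTS_PER_YEAR = 500  # assumed: roughly two highway segments per weekday

for label, p in [(">95% (solid L3 with a human fallback)", 0.95),
                 ("0.99999 (a driverless L4 target)", 0.99999)]:
    takeovers = expected_takeovers(p, SEGMENTS_PER_YEAR)
    print(f"{label}: ~{takeovers:.3f} takeovers per year")
# Prints roughly 25 takeovers per year at 95%, versus 0.005 at 0.99999.
```

Which is why a >95% system still needs an attentive human in the loop, while sleeping in the back seat is really asking for those extra nines.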
 
It’s ok, it is coming soon!

In broad terms, the answer was much simpler than they thought!


My interpretation is that they are just going to let people drive.

So somehow Tesla figured out FSD, one of the hardest AI problems ever, and it was actually simple all along, but he can't tell us? LOL. I can't believe anyone falls for that. But if Tesla has indeed solved FSD, then it should be easy to show us.

My guess is that maybe the Tesla FSD team made some kind of AI breakthrough and Elon thinks that means they have figured out all of FSD. Maybe Tesla did build an end-to-end AI in simulation, and since Elon believes E2E is the right path, he is concluding that Tesla has figured out FSD. But it is unlikely that Tesla has actually figured out the whole thing.
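
For context on what "end-to-end" means here, and as a generic textbook sketch rather than anything Tesla has published: instead of separate hand-built perception, prediction, and planning modules, a single network maps camera frames directly to driving controls and is trained on examples of human (or simulated) driving. Every name and layer size below is made up for illustration.

```python
import torch
import torch.nn as nn

# Generic end-to-end driving policy: camera frames in, controls out.
# This does not reflect Tesla's actual model; it only shows the idea of one
# trainable network replacing a hand-built perception/planning stack.

class EndToEndPolicy(nn.Module):
    def __init__(self):
        super().__init__()
        self.vision = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.control = nn.Sequential(
            nn.Linear(48, 64), nn.ReLU(),
            nn.Linear(64, 2),   # outputs: [steering, acceleration]
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        return self.control(self.vision(frames))

policy = EndToEndPolicy()
frames = torch.randn(1, 3, 120, 160)   # one fake camera frame
controls = policy(frames)              # tensor of shape (1, 2)

# Training ("behavior cloning") simply regresses the outputs against the
# steering/acceleration that a human or a simulator produced for the frame.
loss = nn.functional.mse_loss(controls, torch.tensor([[0.1, 0.4]]))
```

Getting something like this to drive well in the real world, rather than only in simulation, is the part there is no evidence Tesla has finished.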
 
Yeah - I mean choosing the correct lane appears to be the largest computing challenge of our time.
 
My guess is that maybe the Tesla FSD team made some kind of AI breakthrough and Elon thinks that means they have figured out all of FSD.

It's most likely the "Foundational World Model" that Elluswamy spoke about in his CVPR keynote: essentially a general-purpose computer vision model that can be adapted to many different tasks.

It will have solved FSD in the same way that the 2017 paper "Attention Is All You Need" solved natural language. It still took six years to get from that paper to ChatGPT as a viable product.
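
For anyone unfamiliar with the term, "a general-purpose vision model adapted to many different tasks" usually means one shared backbone whose features are reused by small task-specific heads. Here is a toy sketch of that pattern; it is my own illustration with made-up layer sizes, not Tesla's architecture.

```python
import torch
import torch.nn as nn

# Toy "foundation model + task heads" pattern: one shared vision backbone is
# trained once, then lightweight heads reuse its features for different
# driving-related tasks. Illustration only, not Tesla's actual design.

class SharedBackbone(nn.Module):
    def __init__(self, feature_dim: int = 256):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.proj = nn.Linear(64, feature_dim)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        return self.proj(self.encoder(frames))

backbone = SharedBackbone()

# Each downstream task only adds a small head on top of the shared features.
lane_head = nn.Linear(256, 4)        # e.g. pick one of four candidate lanes
occupancy_head = nn.Linear(256, 10)  # e.g. coarse obstacle/occupancy logits

frames = torch.randn(2, 3, 128, 128)          # a fake batch of camera frames
features = backbone(frames)                   # (2, 256), computed once
lane_logits = lane_head(features)             # (2, 4)
occupancy_logits = occupancy_head(features)   # (2, 10)
```

The appeal is that the expensive part (the backbone) is built once and amortized across tasks, which is also why a strong base model alone does not immediately make the end product work, per the ChatGPT timeline above.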
 
Yeah - I mean choosing the correct lane appears to be the largest computing challenge of our time.

Yes, Tesla tackles easy problems because it can transform them into the most difficult ones to solve: steering, turn signals, braking, acceleration. The way it's going, Tesla will need Congress to pass a multibillion-dollar bill like the semiconductor CHIPS Act.
 