Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla, TSLA & the Investment World: the Perpetual Investors' Roundtable

The thing that confuses me is that in October FSD beta looked kinda shitty. And now it's good enough to roll out to a wide release in a matter of weeks. How is that possible? Or are we on the cusp of a massive change in perception?
Really depends on which video you watch. In this video, you'll see FSD almost run into a car and drive toward a wall.

There are some simple logic problems Tesla is refusing to fix as of now. The most dangerous one, which should be fixed ASAP, is to decouple the driving logic from navigation, because navigation is not always right. In the video where the car heads straight for that wall, you can tell that in the UI the road is clearly marked for a right turn, as it's the only possible drivable space, yet the navigation still thinks there's road in front (you can tell the navigation is set for the car to make a right turn in 400 ft, which is beyond the fence). You can see the car's path planning waver: straight, then right, then straight, then right. It can't make up its mind because the navigation and what it sees are different.

In another video the car wants to run into a "road closed" blockage because the navigation says there's road there (not accounting for construction and closed roads).
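The decoupling being asked for here boils down to an arbitration rule: when the navigation route conflicts with the perceived drivable space, perception should win. Purely as an illustrative sketch (this is not Tesla's actual planner, and every name here is invented for the example), it might look like:

```python
# Illustrative sketch of "perception overrides navigation" arbitration.
# Hypothetical example only; none of this reflects Tesla's real planner.

def choose_maneuver(nav_route_direction, drivable_directions):
    """Pick a maneuver, trusting perceived drivable space over the nav route.

    nav_route_direction: where the map route says to go ("straight", "right", ...)
    drivable_directions: set of directions the vision system sees as drivable
    """
    if nav_route_direction in drivable_directions:
        # Map and perception agree: follow the route.
        return nav_route_direction
    if drivable_directions:
        # Map is wrong (a fence or "road closed" ahead): take a
        # perceived-drivable option and let navigation reroute afterwards.
        return sorted(drivable_directions)[0]
    # Nothing looks drivable: stop rather than trust the map.
    return "stop"

# The wall scenario from the video: nav wants to go straight,
# but only a right turn is actually drivable.
print(choose_maneuver("straight", {"right"}))  # -> right
```

The key design point is that the vision system's output is treated as ground truth and the map as a mere suggestion, which removes the "straight, then right, then straight" oscillation described above.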

 
I’m sure it’s a coincidence that every time a GM car crashes they don’t get waves of articles speculating about supercruise.

That's because GMC vehicles with Supercruise never get in violent crashes. I bet you're wondering how I know?

Simple. I Googled it and Google said "No results found for GMC "violent crash" "supercruise". But when I replace "GMC" with "Tesla" and "Supercruise" with "Autopilot" I get a whole bunch of results! :oops:

/s
 

Really depends on which video you watch. In this video, you'll see FSD almost run into a car and drive toward a wall.

There are some simple logic problems Tesla is refusing to fix as of now. The most dangerous one, which should be fixed ASAP, is to decouple the driving logic from navigation, because navigation is not always right. In the video where the car heads straight for that wall, you can tell that in the UI the road is clearly marked for a right turn, as it's the only possible drivable space, yet the navigation still thinks there's road in front (you can tell the navigation is set for the car to make a right turn in 400 ft, which is beyond the fence). You can see the car's path planning waver: straight, then right, then straight, then right. It can't make up its mind because the navigation and what it sees are different.

In another video the car wants to run into a "road closed" blockage because the navigation says there's road there (not accounting for construction and closed roads).

Unless 8.3 is much, much better than 8.2, this is not ready for general public release. I really hope Elon isn't seriously going to give just anyone The Button with the software still in such an early state. It needs a lot of iteration before it can actually tackle major cities. It only takes one inattentive beta tester driving straight into a wall to ruin everything they have accomplished so far, crash the stock price, and end my chances of ever retiring instead of working until I die.
 
Reactions: Fred42 and wnorris
The thing that confuses me is that in October FSD beta looked kinda shitty. And now it's good enough to roll out to a wide release in a matter of weeks. How is that possible? Or are we on the cusp of a massive change in perception?
YouTube videos prove it has improved massively. That said, it's still far from perfect and makes mistakes.

The perception change is justified. Big improvements coming soon in v.8.3 and v.9 according to Elon.
 
Unless 8.3 is much, much better than 8.2, this is not ready for general public release. I really hope Elon isn't seriously going to give just anyone The Button with the software still in such an early state. It needs a lot of iteration before it can actually tackle major cities. It only takes one inattentive beta tester driving straight into a wall to ruin everything they have accomplished so far, crash the stock price, and end my chances of ever retiring instead of working until I die.
Just playing devil's advocate here... how is the scenario you describe any different from what we have now? Many of us are using an FSD build inferior to the current FSD Beta release. It's not as if there isn't FSD software in use currently.

My car would easily run into objects if I let it run unattended. But I'm aware of its limitations and, importantly, am liable for retaining control and remaining prepared to take corrective action at all times.

It’s not as if Tesla is releasing it as level 5, calling it finished, and accepting liability for its efficacy.
 
The Ford F-150 Police Pickup is not an EV. Can you imagine Tesla Cybertrucks eventually becoming the dominant police vehicles? :cool:

Benzinga, an hour ago: Ford Police Pickup Truck

Excerpt:

Ford first introduced the F-150 line for police in 2017, and the new model includes an improved 120 mph top speed and automatic four-wheel-drive mode with torque-on-demand transfer case to enable a smooth transition to off-road capability.
Good to know that I'll be able to outrun it in my Cybertruck. Although it will be difficult to hide in it. 😇
 
I hate to type the name of this s..ker Gordon J, who said VW will overtake Tesla in EV sales this year based on VW's claim that it plans to deliver 1M EVs in 2021. My blood pressure rose because VW's total BEV sales in 2020 were only 134K. Read below.

Analyst Says Volkswagen Will Overtake Tesla in Electric Vehicle Sales This Year


Total VW BEV sales in 2020 were nearly 134,000 battery electric vehicles, vs Tesla's ~500K.

https://www.volkswagenag.com/en/news/2021/01/Volkswagen-brand-triples-deliveries-of-all-electric-vehicles-in-2020.html#
 
Just playing devil's advocate here... how is the scenario you describe any different from what we have now? Many of us are using an FSD build inferior to the current FSD Beta release. It's not as if there isn't FSD software in use currently.

My car would easily run into objects if I let it run unattended. But I'm aware of its limitations and, importantly, am liable for retaining control and remaining prepared to take corrective action at all times.

It’s not as if Tesla is releasing it as level 5, calling it finished, and accepting liability for its efficacy.
You and I both know the answer to this question.

The media will seize upon every single incident and use it to crucify Tesla and Elon. They already do this every time any Tesla crashes anywhere, like with the recent one in Detroit. Imagine this, but multiplied a hundredfold, when the first FSD beta driver literally drives the car straight into a wall in the middle of NYC at 80 mph.

As long as the media is our enemy, we have no room for any mistake that could be blamed on us. There is no sign the media will ever stop being our enemy, so we must always operate on a siege footing. Releasing such a rough FSD beta to a lot of people creates an existential threat to the company, and it's easily avoided by not doing it.
 
Unless 8.3 is much, much better than 8.2, this is not ready for general public release. I really hope Elon isn't seriously going to give just anyone The Button with the software in still such an early state. It needs a lot of iteration before it can actually try and tackle major cities.
I don't understand the logic you're using here. It's a driver assistance feature, which means it requires 100% driver attention. This is no different from driving a car without any driver assistance features. If anything, it gets more dangerous in irresponsible hands as it becomes more reliable (until it exceeds the safety of a human).

From Tesla's perspective, the safety of the system has to be looked at in the aggregate. I'm sure they have already developed conceptual models to help estimate the expected overall safety as the system improves. There is a point of maximum danger where the system is good enough that some users will start to trust it more than it deserves. For argument's sake, let's look at a single point in development when FSD is 5 times as likely to have an accident as an average human driver (if it were unsupervised). Let's also assume that, when it's this good, 10% of all users will trust it implicitly and never monitor it in an effective manner, 50% will monitor it "pretty well", and 40% won't trust it at all, treating it as if they were driving and staying ready to take over. Yes, I know these numbers are not based on any data and we can't assume they will be reality; they are just to illustrate a point.

This would mean the 10% who don't monitor it at all will have an accident rate 5 times that of human drivers. The 40% of users who monitor it at all times, as if they were manually driving the vehicle, should have an accident rate lower than the average human driver (because FSD will notice and react to some accidents that would have happened without it). And the middle 50% who monitor it "pretty well" might have an accident rate about the same as the average human driver. The net result of all this would be an accident rate higher than the average human driver, and the entire increase could be attributed to those who monitored it so poorly they hardly ever took over.
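As a sanity check, the hypothetical numbers above reduce to a simple weighted average. The 0.8x rate for the always-vigilant group is an extra assumption here, since the text only says "lower than average":

```python
# Weighted-average fleet accident rate for the hypothetical user groups.
# Rates are multiples of the average human driver's accident rate.
# All figures are illustrative assumptions, not data.
groups = [
    ("never monitors",         0.10, 5.0),  # trusts FSD completely
    ("monitors 'pretty well'", 0.50, 1.0),  # roughly average
    ("always vigilant",        0.40, 0.8),  # assumed: somewhat safer than average
]
fleet_rate = sum(share * rate for _, share, rate in groups)
print(f"{fleet_rate:.2f}x the average human driver")  # -> 1.32x
```

With these numbers the fleet ends up roughly 32% worse than the average human driver, and the never-monitor group alone contributes 0.5 of the 1.32, which matches the conclusion that the increase is driven almost entirely by the users who hardly ever take over.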

So, it's really about preventing the stupidest people from using it in an unsupervised manner. As much as I would like Darwin's "survival of the fittest" to take care of things naturally, the resulting innocent carnage would be unacceptable. The takeaway is that once the system gets good enough that a growing body of users would abuse it, Tesla must limit access to the best versions to keep the overall safety rate from dropping below that of the average human driver. While in beta, they should really shoot for keeping the average mile travelled under FSD twice as safe as the average human driver (without driver aids).

It only takes one inattentive beta tester driving straight into a wall to ruin everything they have accomplished so far, crash the stock price, and end my chances of ever retiring instead of working until I die.

I don't see it that way. As long as it's a driver assistance aid, the driver is responsible for not driving into a wall. Humans drive into walls with alarming frequency; they don't require FSD to help them do it. My in-law drove through the window of a mattress store because she thought she was pressing the brake instead of the accelerator. FSD should greatly reduce this kind of accident. Sure, the media will make a big noise about it, but in the end it will come down to whether the system is making life safer or more dangerous overall. Insurance companies will see to it that this is the metric used.
 
I'm glad that Yahoo shows this clarification that the Tesla was not in Autopilot mode in the Mar 11 accident. I'm also pissed off because the "violent crash" term was included. You only use the term violent if the car was cut in half. How young is this Assistant Chief? (The term "assistant" usually suggests an administrative assistant, a type of the old-time front-desk secretary.) He could well be a Tesla short.

Quote: "the driver, who was previously hospitalized, is being charged with reckless driving," LeValley said, adding that speed was a significant factor in the crash. So the young dude was trying to show off his Tesla in the early morning, or under the influence?

I also blame the people involved, and Tesla, which failed to make people aware that the current FSD beta 8.2 is WAY BEYOND all previous versions. The latest hot video should be made public to advertise the superior version and squelch all the bad rap from accidents on the old versions. Had the dude subscribed to FSD and downloaded 8.2, it might have saved him and his passenger injury, possibly their lives, and a lawsuit that will cost him way more than the $10K price of FSD.

Police say Autopilot not believed in use in Detroit Tesla crash
 
Reactions: Drax7 and FireMedic
I've been thinking about what you said.

After pondering it for a long time, I finally realized there might be one thing that could lead to Gordo rating TSLA a buy with a bullish price target. It occurred to me in a vision I had while cleaning the grease out of the range vent hood. The only way I see this happening is if Gordo were dating a Tesla owner and accepted his marriage proposal. o_O

View attachment 645111
Just to be clear, straight, and for the record: I think Gordo is a paid troll. I only mention him for comedic value; he has no relevance otherwise.
 
Really depends on which video you watch. In this video, you'll see FSD almost run into a car and drive toward a wall.

There are some simple logic problems Tesla is refusing to fix as of now. The most dangerous one, which should be fixed ASAP, is to decouple the driving logic from navigation, because navigation is not always right. In the video where the car heads straight for that wall, you can tell that in the UI the road is clearly marked for a right turn, as it's the only possible drivable space, yet the navigation still thinks there's road in front (you can tell the navigation is set for the car to make a right turn in 400 ft, which is beyond the fence). You can see the car's path planning waver: straight, then right, then straight, then right. It can't make up its mind because the navigation and what it sees are different.

In another video the car wants to run into a "road closed" blockage because the navigation says there's road there (not accounting for construction and closed roads).

This video is excellent in showing the need for much more improvement. I still doubt anybody can deliver FSD at Level 5, defined as handling ALL the cases a human can, like an accident on the freeway or a street, or a police pursuit with a road closure. I'll have to see it to believe it.
 
I know this is OT (but it is after hours and relevant to the thread).
It is a long quote, attached, but it speaks to the mistakes and to those who highlight them.
I combined the quote with an alternative background image.

75740E3E-DFFA-4AC9-88A3-E397A4DCF80F.jpeg