
FSD V9 First Impressions - General Public Access Seems Way Off

Uber spared no expense, and their program had some very notable expertise. They took a lot of people from Google/Waymo and Carnegie Mellon, partnered with them, and had a site in Pittsburgh. You would have figured that working at Waymo and placing in the DARPA Grand Challenge since its inception would be applicable to autonomous vehicles, but somehow all of that money and expertise amounted to little.

The Uber program was run by people with a financial interest in making a buck. That goal failed and everyone has learned. Obviously "training" $15 an hour people wasn't a good idea. Active monitoring was needed.
 
2017? FSD was way cheaper then. FSD went up to $10k in late 2020.

I ordered my MY and got FSD for $5k.

I'm just disappointed. I bought the car in Europe with both EAP/FSD. Every time Tesla needs money, Elon will tweet some sh1t that FSD is feature complete, only regulations, bla bla. But in reality, I don't think my car, delivered in 2017, will receive any form of autonomy (level 3+) before I trade it in after 5 years. And that makes it the worst $10,000 I have ever spent on software. It's like buying a Ferrari without the ignition key. Everything to keep the stock owners happy. And probably it will be the end of Tesla, because some day FSD owners will want their money back.
 
I'm just disappointed. I bought the car in Europe with both EAP/FSD. Every time Tesla needs money, Elon will tweet some sh1t that FSD is feature complete, only regulations, bla bla. But in reality, I don't think my car, delivered in 2017, will receive any form of autonomy (level 3+) before I trade it in after 5 years. And that makes it the worst $10,000 I have ever spent on software. It's like buying a Ferrari without the ignition key. Everything to keep the stock owners happy. And probably it will be the end of Tesla, because some day FSD owners will want their money back.

Many already do, if these forums are an accurate representation.

I don't know the FSD take rate, say, pre-2019 (or pre-Autonomy Day), but we may be hitting a critical mass soon...X number of people who paid for FSD and never got to use it (some already have traded their vehicle in). Tesla has said they won't transfer it, but will be offering a higher trade-in value to represent the FSD that was purchased. No idea what those logistics will look like, but I can't imagine too many people in your situation are happy about paying $5k for software they never got to use...and then having to re-purchase it at a higher price point on a new vehicle.
 
They're both operated by humans. Why couldn't the Uber AV testing fatality occur in a Tesla?
That single fatality pushed Uber's safety record to far worse than the human average of 1 per 100 million miles. Obviously it could be a fluke or it could be evidence that testing autonomous vehicles on public roads is relatively dangerous and needs strict oversight by manufacturers.
Tesla has done a far better job of enforcing driver attention. In addition to the steering nags, Tesla also monitors the driver and kicks them off the Beta program if they don't pay attention. The Uber car had all the standard Volvo safety features (including AEB) disabled, while Tesla has not disabled theirs. Uber did 3 million miles before the accident. I couldn't find how many miles FSD Beta has travelled so far (for timeline, AFAIK it hit EAP in October last year), but it shouldn't be too difficult to surpass that.
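For what it's worth, the comparison above can be sketched with back-of-the-envelope arithmetic. The 3 million Uber miles and the 1-per-100-million-mile human baseline are the figures quoted in this thread; nothing else here is real data:

```python
# Back-of-the-envelope fatality-rate comparison using the figures quoted above.
human_rate = 1 / 100_000_000   # ~1 fatality per 100 million miles (US average)

uber_miles = 3_000_000         # miles Uber reportedly logged before the crash
uber_fatalities = 1
uber_rate = uber_fatalities / uber_miles

print(f"Uber: 1 fatality per {uber_miles:,} miles")
print(f"That is ~{uber_rate / human_rate:.0f}x the human average")  # ~33x
# With n=1 the error bars are enormous, which is the "fluke" caveat above.
```

The ~33x figure is only as good as its inputs, but it shows why a single fatality over a few million test miles looks so bad against the human baseline.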
 
Stilgoe in the @diplomat33 posting said 100 people a day die from vehicle accidents in the USA.

And yet we are still talking about a handful of deaths in three years with computer controlled cars.

But keep in mind that those "computer controlled cars" had safety drivers in them who intervened to prevent accidents. That does not measure what the true accident rate would be if the autonomous cars were driverless. Also, the number of fully autonomous cars on public roads is very low compared to human driven cars, which keeps the total number of accidents low. That is why the count looks so small. It is not an accurate measurement of the true safety of autonomous cars.

Autonomous cars drive on the same roads as humans, yet I haven't seen any of them pass the DMV human license test: arbitrary left turns, right turns, stop signs, stop lights, parking, speed limits, expressway ramps, etc. That is the minimal standard we require of human teenagers. I could argue that teenagers have so many accidents that the test is insufficient as well.

I've read papers about having AVs pass a "driving test" of sorts before being deployed but the tests would need to be way more elaborate than the standard DMV driving test. The DMV driving test would not be good enough to certify AVs for deployment. Plus, with AVs, we can do more testing than we do with humans. So why would we limit ourselves to just the simple DMV test? One advantage of AVs is that you could do a series of detailed tests that include real world driving, track testing and simulation testing that would be way more comprehensive than the DMV test.

I do like the idea of standardized tests for AVs. I do think standardized tests will be one part of the deployment process. But I think proving AV safety will rely on a combination of real world safety data and comprehensive standardized safety tests.

I also like the idea of holding AVs accountable for failures just like we do with human drivers. AVs should be required to follow the same traffic laws as humans do. So if an AV breaks a traffic law or causes an accident, there should be consequences. The manufacturer could be fined and/or the AV could be temporarily suspended from public roads until the problem is fixed, depending on the severity of the offense. For example, if an AV breaks a minor traffic law and there was no incident and it was a first offense, maybe the manufacturer gets a warning or pays a fine. But if the AV was at fault in a serious accident, then the manufacturer would be liable for the accident, and the AVs would be recalled until the manufacturer proves that they have fixed the problem that caused the accident.
 
..., how do we expect untrained, non-employees to react to FSD testing on public roads? I'm sure everyone on this forum is an above average driver and will not succumb to complacency
People will be people and will become complacent. The body has automatic low-power modes when it isn't doing anything.

... When FSD shows up in their car, how prepared will they be?
Hopefully they will have read the disclaimer that the car can do the worst thing at the worst time. Hopefully Tesla will add to the disclaimer that this can result in death or serious injury if not paying attention.

What is the potential for tragic accidents?
Good question. Another interesting question: How many lives will it save?

What would the government response be to several of these accidents?
Tesla should have some statistics at that point. Hopefully Tesla will remove FSD Beta from those that don't pay attention. I can think of many schemes for FSD beta jail that vary depending on how egregious the situation is. Tesla can also rotate who can use it, so people don't become too comfortable with it.
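As a purely illustrative sketch (not anything Tesla has described), one hypothetical "beta jail" scheme could scale the suspension with the severity of the offense and the number of prior strikes:

```python
# Hypothetical "FSD beta jail" scheme -- illustrative only, not Tesla's policy.
def suspension_days(severity: int, prior_strikes: int) -> int:
    """severity: 1 = ignored nag, 2 = forced disengagement, 3 = blatant abuse."""
    base = {1: 1, 2: 7, 3: 30}[severity]   # base penalty in days per severity tier
    return base * (2 ** prior_strikes)     # doubles for each repeat offense

print(suspension_days(severity=1, prior_strikes=0))  # 1 day for a first nag miss
print(suspension_days(severity=2, prior_strikes=2))  # 28 days for a third strike
```

The point of a graduated scheme like this is that a first-time nag miss stays a slap on the wrist, while repeat or egregious behavior escalates quickly.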
 
We're expecting better performance from untrained and unpaid people?
People have learned from the Uber incident. So from that alone I would say yes. Other factors to consider:
  1. Tesla has a good disclaimer that can be made better: The car can do the worst thing at the worst time. Hopefully that will scare people.
  2. People love their car and are not interested in seeing it crash
  3. Uber did not have active driver monitoring. Tesla does. I'll give it a rating of poor, so hopefully Tesla will improve it.
  4. What kind of accidents with FSD will be most common? Running over curbs? Scraping bushes? What kind of accidents will be avoided by using this kind of system? Running over pedestrians seems like something the Tesla system can handle.
 
Hopefully they will have read the disclaimer that the car can do the worst thing at the worst time. Hopefully Tesla will add to the disclaimer that this can result in death or serious injury if not paying attention.
A written disclaimer that drivers can easily dismiss will not improve the safety of the car. This is legal maneuvering to protect Tesla, not to improve safety. If stuff like this worked, then every workplace that puts up a poster about safety would have spotless records. If the Uber accident is any example, even three weeks of task-specific training is not enough.

Good question. Another interesting question: How many lives will it save?
How many lives have been saved in the FSD videos we have already watched? Perhaps we can extrapolate from that? :)

People will be people and actual dead bodies will outweigh theoretical hand-waving "lives saved" even if it is 100 to 1 and absolutely true.
 
...
How many lives have been saved in the FSD videos we have already watched? Perhaps we can extrapolate from that? :)
Yes, and while you're at it, extrapolate the number of serious accidents and lives lost.

People will be people and actual dead bodies will outweigh theoretical hand-waving "lives saved" even if it is 100 to 1 and absolutely true.
Got to love the paranoids. Only the paranoid survive, although this might give you pause in that line of thinking:
 
receive any form of autonomy (level 3+) before I trade it in after 5 years.
That's a concern to me too. Temporarily inflated used prices at the moment are really tempting me to get out of my MS R FSD. The next 2 years (based on the developments so far) seem unlikely to deliver a solid FSD product, especially in the UK. Even in climates more open to public testing of FSD, there isn't really evidence of tackling the numerous edge cases. Testing hasn't even got started here. Every bus stop road marking has the car re-centre itself incorrectly on the carriageway. Speed changes all over the shop. Road categories incorrectly identified.

So should I hang on while watching my car depreciate over the next couple of years in the hope that fsd might come good? Or get out now and try again once Tesla (or someone else) has a functional product?

There is still the charging network which I use rarely but do value. Is that reason enough to ignore the realities of UK FSD?
 
They're both operated by humans. Why couldn't the Uber AV testing fatality occur in a Tesla?
That single fatality pushed Uber's safety record to far worse than the human average of 1 per 100 million miles. Obviously it could be a fluke or it could be evidence that testing autonomous vehicles on public roads is relatively dangerous and needs strict oversight by manufacturers.
Tesla has done a far better job of enforcing driver attention. In addition to the steering nags, Tesla also monitors the driver and kicks them off the Beta program if they don't pay attention. The Uber car had all the standard Volvo safety features (including AEB) disabled, while Tesla has not disabled theirs. Uber did 3 million miles before the accident. I couldn't find how many miles FSD Beta has travelled so far (for timeline, AFAIK it hit EAP in October last year), but it shouldn't be too difficult to surpass that.
Uber at the time of the accident had a passive driver monitoring system that involved the driver's supervisor randomly reviewing footage of the drivers. Uber admitted that this did not occur, or was not effective. That equated to no driver monitoring.

Tesla had steering wheel nags at intervals of up to 5 minutes that were easily fooled. They've tightened the interval over the years to ~15 seconds, depending on speed and driving conditions; again, this is easily fooled.

Tesla has started to introduce driver monitoring via camera, and new S models have two IR LEDs to provide some illumination at nighttime. The effectiveness of this is slowly being documented.

As far as kicking off drivers from the Beta program, we have been told that it's happened but no evidence has been given to support this point. I don't doubt Tesla is doing this but we have no actual proof it's any more than a threat.
 
Got to love the paranoids. Only the paranoid survive, although this might give you pause in that line of thinking:

YES! As a person who was hit by a vehicle while I was walking in a crosswalk, this article makes perfect sense to me. I jaywalk now, and if ticketed, I’ll just pay the fine. It will be way less than hospital co-pays…

It argues for FSD Beta release in the near future.
 
Got to love the paranoids. Only the paranoid survive, although this might give you pause in that line of thinking:
Interesting article, although I think it makes a better argument for not having FSD at all than it does for allowing more FSD testing. I mean, there is no greater personal responsibility for safety than if one has to do all the driving themselves instead of letting a computer make the decisions for them.

I think the low accident rate (by FSD testers) has more to do with the low number of testers and the fact that most of them are putting up videos on YouTube than it says anything about the safety aspects of the FSD software by itself. The software may not yet be good enough to foster complacency in the drivers. If my car is trying to crash into something every couple of miles, I'm for damn sure paying closer attention than if it tries to crash into something every 10,000 miles.
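To put hypothetical numbers on that intuition (both the failure intervals and the attention-lapse fraction below are made up for illustration, not measured):

```python
# Illustrative: how failure frequency interacts with momentary inattention.
lapse_fraction = 0.01  # assume the driver is inattentive for 1% of miles (made up)

for miles_between_failures in (2, 10_000):
    failures = 100_000 / miles_between_failures   # failures per 100k miles
    unmonitored = failures * lapse_fraction       # failures nobody catches
    print(f"1 failure per {miles_between_failures:>6,} mi -> "
          f"{unmonitored:>5.1f} unmonitored failures per 100k mi")
# A car failing every 2 miles yields ~500 unmonitored failures per 100k miles,
# but drivers stay alert; one failing every 10k miles yields ~0.1 at this lapse
# rate -- yet complacency would likely push the real lapse fraction much higher.
```

The catch, of course, is that the lapse fraction is not independent of the failure rate: the better the software gets, the less attention people pay, which is exactly the complacency problem being discussed.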
 
Yes, and while you're at it, extrapolate the number of serious accidents and lives lost.


Got to love the paranoids. Only the paranoid survive, although this might give you pause in that line of thinking:
One of the most interesting articles I've ever read. Thank you for sharing that!
 
I think the low accident rate (by FSD testers) has more to do with the low number of testers and the fact that most of them are putting up videos on YouTube than it says anything about the safety aspects of the FSD software by itself. The software may not yet be good enough to foster complacency in the drivers. If my car is trying to crash into something every couple of miles, I'm for damn sure paying closer attention than if it tries to crash into something every 10,000 miles.

C’mon man... There are supposed to be 2,070+ FSD Beta 9 users, most of them Tesla employees. And there are what, 20 YouTubers?

Every accident of a Tesla gets reported if the media get a whiff of it.

So with 2,000 testers at 20 miles a day? Forty thousand miles a day total. In ten days, four hundred thousand miles streamed back to Tesla's servers. Every two weeks, likely a million miles reported back. The FSD crew should be able to factor that video back in and release better driving neural nets every three weeks. So maybe 6-8 weeks for a larger Beta 9 FSD release?
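A quick sanity check of those figures (the tester count and per-day mileage are the post's assumptions, not reported data):

```python
# Sanity-checking the extrapolation above; inputs are assumptions, not data.
testers = 2_000
miles_per_day = 20                      # assumed average per tester

daily = testers * miles_per_day
print(f"{daily:,} miles/day")           # 40,000
print(f"{daily * 10:,} in ten days")    # 400,000
print(f"{daily * 14:,} in two weeks")   # 560,000 -- hitting a million every
# two weeks would need ~36 miles/day per tester at this fleet size
```

So the ten-day figure checks out, but "a million miles every two weeks" requires either heavier per-tester usage or a larger fleet than assumed here.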
 
C’mon man... There are supposed to be 2,070+ FSD Beta 9 users, most of them Tesla employees. And there are what, 20 YouTubers?

Every accident of a Tesla gets reported if the media get a whiff of it.

So with 2,000 testers at 20 miles a day? Forty thousand miles a day total. In ten days, four hundred thousand miles streamed back to Tesla's servers. Every two weeks, likely a million miles reported back. The FSD crew should be able to factor that video back in and release better driving neural nets every three weeks. So maybe 6-8 weeks for a larger Beta 9 FSD release?
There are only 71 non-employee testers as far as we know. The rest of the employee testers may have additional limitations imposed on them that we are not aware of. For example, I am not aware of any of them posting on public forums such as this one or posting videos online. I don't really know how many YouTube/Twitter/Reddit/Instagram testers there are, but it would not surprise me if there are more than 20.

I think it is fair to say that we don't really have enough data to make any determinations on how many miles of data are being collected. You are assuming that each tester is using FSD for 20 miles per day. If FSD regularly fails on my commute to work, I may not even turn it on for that drive. Most of the YouTubers I've seen are making special trips to test the software. My guess is they are averaging far less than 20 miles per day of FSD driving.

We have yet to see an updated neural net for the current beta, so it is unlikely they are going to see an update every three weeks. The previous version did not adhere to that schedule for very long, either. As much as I would like to play with the FSD Beta myself, I don't see a wide release that quick unless we see large jumps in reliability with updated software in quick releases. They aren't going from where they are today to wide release without some updates. How long that takes is anybody's guess. I have heard some say "two weeks." ;)
 
I believe everyone here is a responsible driver and would use FSD safely. As will the majority of drivers out there. However, there are some people who want to do things just because they can, or because they want a video to go viral. Assume V9 is released to the public without much improvement from what we've seen this week. Has anyone thought of scenarios where someone wants to replicate Chuck's unprotected left turn and lets the car do what it wants without intervening? How about someone wanting to try it in a school zone during morning drop-off to see how well it can navigate around kids and parents?

None of us on this forum would do the scenarios I listed above, but we can't say the same for every driver out there. What happens if an FSD test results in a pedestrian or other-driver fatality? I don't think that's a scenario even Elon is willing to risk.
 
2017? FSD was way cheaper then. FSD went up to $10k in late 2020.

I ordered my MY and got FSD for $5k.
Yes, FSD was way cheaper in 2017, but don't forget EAP was not free. The combined cost of purchasing EAP/FSD after the car was delivered wasn't much cheaper than today's price. My cars got it for less because I bought it for both during the Spring 2019 "sale": a total of $12k for the two cars. That $6k/car was a real bargain at the time. But what do I have to show for it today?
 
everyone here is a responsible driver and would use FSD safely.
I agree we probably all believe we are!

However, with aids like fsd, could anyone say they would not (sooner or later) become more likely to let their guard down and use an electronic device, or become less vigilant as their confidence in fsd grows?

Even knowing it's beta, and accepting all the warnings and provisos, I can't see how to avoid complacency creeping in eventually.
 