FSD V9 First Impressions - General Public Access Seems Way Off

Maybe. As a counterexample, we can look at Uber's self-driving development program, which involved a trained safety driver.
I'd be curious what training was involved. Uber was paying close to minimum wage, and the "trained" safety driver wasn't paying attention; she was watching something on a tablet instead. Tesla will have active driver monitoring and will hopefully improve that monitoring over time. Tesla owners are hopefully more attentive than low-paid safety drivers.

That did not help further Uber's goals towards self driving robotaxis.
The Uber program was run by people with a financial interest in making a buck. That goal failed, and everyone has learned from it. Obviously, "training" $15-an-hour drivers wasn't a good idea on its own; active monitoring was needed.
 
Until you can trust the system not to run into a concrete pillar or oncoming traffic if you take your eyes off the road for one second, it should not be released in any capacity to the public. Anything less than that is insanity.

It doesn't appear to me that this is the bar that Tesla is going to use for an L2 system. They have a much lower bar for released-to-the-public Autosteer (see the video below). It's unclear to me whether they'll use such a strict requirement as you outline for Autosteer on City Streets. Clearly the consequences of a mistake are likely to be far higher for City Streets, but since the driver is supposed to be paying attention at all times, why would the system need to be trusted to not do these things occasionally? That's the big question here, of course. How conservative is Tesla going to be? It's inevitable that someone who is not the driver will be killed by this feature, eventually (I have no idea when). But it's not clear whether that inevitability will prevent its release to the public.

I really have no idea what the metric will be. It's not even clear to me what it should be. I tend to be more conservative and agree with you, but I'm just saying that doesn't appear to be what prior experience indicates will happen.

But, if I had to guess, release to the public will be gated by whether it's a usable feature at all, not whether it can be trusted. (Though that too is not a metric backed by the history, if you look at the Smart Summon release.) That still seems pretty far off in any case. Autosteer itself is only useful in a very narrow set of use scenarios, so it seems to me the set of useful scenarios for Autosteer on City Streets given the current apparent limitations will be even narrower...to the point of being possibly non-existent.

 
To be a useful driver assist feature, you have to be able to give it some trust. Autopilot on freeways is fairly unstressful, and the same goes for Autopilot in heavy stop-and-go traffic. In these situations, the feature is reliable enough to take stress off of driving while remaining safe, which makes it useful.

But FSD looks incredibly stressful... far, far more stressful than just driving. Yeah, you can "use it responsibly", but that requires too much vigilance right now.
Youtubers are trying to show interesting and challenging situations. On my typical drives there is nothing challenging. For most people I suspect it will be a useful tool like TACC, and people will learn where to disengage FSD, like unprotected left turns onto a six-lane highway.

... if you take your eyes off the road for one second, ...
People should be trained that way for normal driving.
 
I think it's early to speculate; we don't even know how their Dojo computer will process all the data to improve the versions to come. I do think it's promising. I know these early beta videos are still showing the same challenges as before, but I think it's closer than people realize. I don't think it's 6 months away; I think it'll be sooner, but Christmas time might be when we get to experience it. I'm hoping, anyway. I've been waiting a while too to experience what FSD should be, as it was promised so long ago now. For the first time in a while, I have real hopes that we will get to see it.
 
Monitor them well, take away FSD if they fail to pay attention.
If only that could also apply to existing ICE/EV drivers… With all the accidents recorded every year, and the anecdotal experiences we all have with bad drivers on the roads today, wouldn't Beta 9 potentially be a vast improvement even with its limitations?

So, just from back-of-the-envelope calculations, 1-5% of all drivers could have their licenses revoked for bad driving. Are we really going to hold autonomous driving to better levels of safety than human drivers?

Let Beta 9 take the DMV human driving test. If it's supposed to replace human drivers, then it should be able to pass all driving tests. If it has an accident or gets a reckless driving ticket, do we ban it statewide or just that particular car? Obviously it won't get speeding tickets or DUIs, or fall asleep while driving.
 
Now we're using Uber as a proxy for Tesla? SMH.
Tesla is not releasing any data on their FSD program (safety or otherwise), therefore we have to look to analogs to see what problems Tesla may have with their current plan of limited public (non-employee) testing and future wide release testing program (e.g. the topic of this thread). Uber is an appropriate analog because there is a fair amount of data available on what is essentially a "worst case" scenario. From there, we can look at the circumstances of that accident, and see if Tesla's testing program may run into the same problems.

The report I will be referencing can be found here:

Collision Between Vehicle Controlled by Developmental Automated Driving System and Pedestrian (download of pdf is on the right side of the screen).

Starting on page 22, we get some background information on the driver. Quick summary is:

44-year-old female
No alcohol or other drugs in blood
In the 10 years preceding the accident, she had 4 traffic violations, the last one in April 2016 for speeding.
Worked the day before the crash
Slept for 7 hours the night before the crash
Started her shift at 7:30 pm.
Crash happened at 9:58 pm on March 18, 2018.
She had worked for Uber's autonomous car division (ATG) since 2017.

I would say that her profile is pretty typical of a Tesla owner. If a hypothetical Tesla owner was doing some testing on FSD, it probably would happen after work at maybe the same time. We have already watched videos of non-employee testing happening at 3 am.

Now let's look at the training the operator took prior to participating in testing.

The training program was three weeks long, with the first two weeks in Pittsburgh. The first week was three days of classroom instruction and two days of familiarization with the vehicle. The second week was closed-course and on-the-road training. Training included encounters with aggressive drivers and jaywalking pedestrians. During some of the training, motorized dummies were used to simulate pedestrians. Drivers were also trained on the limitations of the system. The final week of training was at the home base with a mentor. Once that was complete, the driver was approved for testing duties.

Now let's consider Tesla. What sort of training program are the Tesla employees receiving? Do the current non-employee testers receive any training? I have not seen anything on either the employee testing or the non-employee testing, so my assumption is that there is no training. Uber trained their drivers for three weeks and still had an accident after the driver had gained a year's worth of experience. A rhetorical question at this point: is Tesla handling this better or worse than Uber?

About six months before the crash, Uber went from a two-safety-driver model to a single-driver model. Previously, there was a driver in the driver's seat watching for problems, while a second operator recorded any problems or issues where the car did not do something correctly. When they consolidated to a single operator, that driver was tasked with both watching for problems and recording the issues encountered.

Prior to the crash, the driver had driven this same route 73 times. The driver was watching a video on her cell phone prior to the crash. She was looking down when the pedestrian walked in front of the car. She claims that she was interacting with the display in the dashboard, but evidence shows she was looking at her phone. She looked up one second before the crash, and tried to turn away 0.02 seconds before the crash. She hit the pedestrian at 39 mph.

This was not her first day on the job; she had traveled this same route many times. It was her fault the pedestrian died. However, the automation was a contributing cause, as the NTSB report puts it:

"Research pertaining to automation monitoring and operator interaction with automated systems is comprehensive. Across domains, automation complacency is identified as a critical consequence of automation—a decrement in performance that results from less-than-adequate monitoring of an automated system by a human operator."

Now, as that relates to Tesla's automation testing plan, how do we expect untrained, non-employees to react to FSD testing on public roads? I'm sure everyone on this forum is an above average driver and will not succumb to complacency when using FSD beta software, but think about the people that do not follow this forum. When FSD shows up in their car, how prepared will they be? What is the potential for tragic accidents? What would the government response be to several of these accidents? How good does FSD have to be before that happens? Is it good enough already?

Just some things to think about as we all eagerly await our chance at trying this software out.
 
wouldn't Beta 9 potentially be a vast improvement even with its limitations?
Not from what I can see. What do you think the accident rate of FSD beta would be if left unsupervised? What is the average accident rate of humans?
Let Beta 9 take the DMV human driving test. If it’s supposed to replace human drivers, then it should be able to pass all driving tests.
Those driving tests are designed for humans. They don't work on self-driving cars because self-driving cars operate nothing like humans. Self driving cars could have passed human driving tests a decade ago.
So, just from back-of-the-envelope calculations, 1-5% of all drivers could have their licenses revoked for bad driving. Are we really going to hold autonomous driving to better levels of safety than human drivers?
Yes. I don't think the suggestion of removing only the worst drivers and replacing them with AVs is going to happen. I am curious what the collision rate is for the bottom 1% of drivers...
 
After watching a ton of V9 vids, I actually do think we might get a release by the end of the year, but one that either requires driver confirmation before attempting complex maneuvers like unprotected turns, handling construction, driving around stopped vehicles in the road, etc, or one that blares out alerts and alarms and requires the driver to take over to handle those complex maneuvers instead.
 
Not from what I can see. What do you think the accident rate of FSD beta would be if left unsupervised? What is the average accident rate of humans?

Stilgoe in the @diplomat33 posting said 100 people a day die from vehicle accidents in the USA.

And yet we are still talking about a handful of deaths in three years with computer controlled cars.
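
As a rough sanity check on that comparison (the annual US vehicle-miles-traveled figure below is my own assumption of roughly 3.2 trillion, not a number from the thread), 100 deaths a day works out to a bit over one fatality per 100 million miles driven:

```python
# Hedged back-of-the-envelope estimate of the human baseline fatality rate.
# The 100 deaths/day figure is from the Stilgoe quote above; the ~3.2
# trillion annual US vehicle-miles traveled is my own assumption.
deaths_per_day = 100
annual_vmt_miles = 3.2e12  # assumed US vehicle-miles traveled per year

deaths_per_year = deaths_per_day * 365
fatalities_per_100m_miles = deaths_per_year / (annual_vmt_miles / 1e8)

print(f"~{deaths_per_year:,} deaths per year")
print(f"~{fatalities_per_100m_miles:.2f} fatalities per 100 million miles")
```

To compare that honestly against "a handful of deaths in three years" for computer-controlled cars, you would also need the total miles those cars drove, and that is exactly the data the companies are not releasing.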

Autonomous cars do drive on the same roads as humans, and I haven't seen any of them pass DMV human license testing: arbitrary left turns, right turns, stop signs, stop lights, parking, speed limits, expressway ramps, etc. I think that is a minimal standard we require of human teenagers. I could argue that teenagers have so many accidents that the test is insufficient as well.

The past years seem to show that geofenced Waymo tech is acceptable in good weather on high-quality roads in parts of Phoenix.

The Beta 9 software seems to work well on low-traffic, reasonably painted highways.

The Tesla Autopilot direction, along with GM Cruise, of a driver supervising a high-quality L2 system can also work.

From those successes, it seems likely that geofenced L5 systems could exist using either the Tesla Vision approach or lidar with high-quality maps.
 
What do you think the collision rate for unsupervised beta FSD is? What is it for humans?

Check out the 2007 DARPA Urban Challenge; it was basically a driving test.
When you make a test you need to validate it somehow. To do that you'd have to test many AVs to make sure the results of the test correlate well to real world results.

There are plenty of states where Tesla could release FSD as an L5 system today if they wanted to.
 
The funniest thing I noticed from the videos I watched was the number of times they would cancel FSD because it would have taken them on a route they didn't want to drive.
One guy said he wanted to go a particular route, so he took over until the nav figured it out, then re-engaged FSD.
Elon insists that waypoints aren't needed - but FSD is the biggest example of why they are essential.
With FSD the options are:
a) FSD with no option to choose the route
b) Don't use FSD
 
There does seem to be an issue with the navigation system. Watching the videos you see FSD unnecessarily change lanes back and forth, and force @Chazman92 into a navigation loop around the block.
Separately, I also think that navigation causes issues with smart summon. I think FSD can avoid hitting most things in a parking lot but navigation can't figure out how to properly route the path.
 
People should be trained that way for normal driving.
People take their eyes off the road all the time in low-risk situations. I'm including glancing at the GPS to check your location as an example of "taking your eyes off the road". There are many legitimate situations where your gaze is off the road for one or two seconds.

The video showing a Tesla swerving into a pillar really concerns me. It happened so fast that I would trust FSD 9 much less than I would trust no driver assist at all. Normally, if I'm going straight at a constant speed with no traffic, I can glance at my console for 1 second without the car trying to murder me.

If your software is bad, throwing hardware in does not help. I understand that we're not anywhere close to exceeding HW3's capabilities.
I thought they'd already cannibalized the redundancy in order to squeeze the last bit of compute out of HW3?

A theoretical HW4 would give room for more (and more complex) NNs. The fact is that Tesla released HW3 without knowing how much compute they actually needed, and we all know Tesla errs on the side of inexpensive when it comes to hardware in production.

They should have a better idea of how much compute they really need now.

My prediction is that they will come out with HW4 once they have release-ready software. They'll need to _prove_ that they got the HW right this time in order to mitigate the upcoming disaster that is going to be "sorry guys we got the HW requirements wrong... again".
 
I'm just disappointed. I bought the car in Europe with both EAP and FSD. Every time Tesla needs money, Elon tweets some sh1t about FSD being feature complete, just waiting on regulations, bla bla. But in reality, I don't think my car, delivered in 2017, will receive any form of autonomy (Level 3+) before I trade it in after 5 years. That makes it the worst $10,000 I have ever spent on software. It's like buying a Ferrari without the ignition key. Everything to keep the shareholders happy. And it will probably be the end of Tesla, because some day FSD owners will want their money back.
 