FSD rewrite will go out on Oct 20 to limited beta

"It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road"
The car did the wrong thing, but fortunately not at the worst time. I don't see how you can be so confident that there wouldn't have been a collision if it had been at the worst time. It should be obvious that a car that does the wrong thing at the worst time is not safe when driven by someone who isn't holding the steering wheel correctly.

I'm not disagreeing that he was being careless. But at what point are you going to hand the car over to drivers like him? Tesla are apparently confident enough that they are doing this now. I'm not saying they are right, but if they are not right, then when should they do this? "When the car is reliable" isn't an answer, since it cannot be made reliable for bad drivers without testing it with bad drivers.
 
It applies because in the case of the airplanes they were looking at a biased sample. All the calls here for "trained drivers" etc. are, in effect, creating a biased sample of drivers. That's useless when training the car (and NNs in general).
It seems like I'm missing something very important. How do the beta testers contribute to training the car? Why does having lower-skilled beta testers help?
 
I'm not disagreeing that he was being careless. But at what point are you going to hand the car over to drivers like him? Tesla are apparently confident enough that they are doing this now. I'm not saying they are right, but if they are not right, then when should they do this? "When the car is reliable" isn't an answer, since it cannot be made reliable for bad drivers without testing it with bad drivers.
When the car does not require a driver to monitor it.
Maybe it can be safe with drivers like him, but I doubt it. I think any car that randomly does the wrong thing at the worst time cannot be safe in the hands of most drivers.
 
How do the beta testers contribute to training the car?
There are some things an active beta tester can better provide in terms of video, internal Autopilot data and text feedback that the Autopilot team can manually investigate for reports and disengagements. Clearly Tesla is able to collect data from the fleet with shadow mode triggers, and this could be useful for sending back discrepancies between neural network outputs from 1 second ago vs. now, e.g., the destination turn lane predicted before the intersection doesn't match what the predictions show once the non-FSD vehicle has actually gone through the intersection. A rough sketch of what that kind of trigger could look like is below.
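Purely as an illustration of the idea (this is not Tesla's actual firmware; the class names, fields, and thresholds are all invented for this sketch), a shadow-mode discrepancy trigger could be as simple as comparing the lane prediction made roughly a second ago with the current one and flagging the frame when a confident prediction flips:

```python
from collections import deque
from dataclasses import dataclass

# Hypothetical example only: names, fields, and thresholds are invented
# to illustrate the "compare prediction from 1 s ago vs. now" idea.

@dataclass
class LanePrediction:
    timestamp: float   # seconds
    lane_id: int       # predicted destination turn lane
    confidence: float  # network confidence, 0..1

class ShadowModeTrigger:
    def __init__(self, horizon_s: float = 1.0, min_conf: float = 0.6):
        self.horizon_s = horizon_s
        self.min_conf = min_conf
        self.history: deque[LanePrediction] = deque()

    def update(self, current: LanePrediction) -> bool:
        """Return True when a confident prediction from ~1 s ago disagrees
        with the current confident prediction, i.e. worth uploading."""
        self.history.append(current)
        # Compare against (and drop) predictions older than the horizon.
        while self.history and current.timestamp - self.history[0].timestamp > self.horizon_s:
            old = self.history.popleft()
            if (old.confidence >= self.min_conf
                    and current.confidence >= self.min_conf
                    and old.lane_id != current.lane_id):
                return True  # discrepancy: queue video + Autopilot data
        return False
```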

Non-early-access 2020.40.* versions could have the same set of neural networks available, but the early-access versions additionally have code to turn on the new driving behavior and visualizations, something along the lines of the sketch below. I would guess a good number of fixes in the recent early-access versions are "just" software 1.0 control behavior, e.g., getting into the appropriate lane sooner.
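If that's how it works, the gating could be as mundane as a config flag: the same networks run in every build, but only the flagged builds act on them with the new controller and visualizations. Everything here (function names, the `fsd_beta_enabled` flag) is invented just to illustrate that idea:

```python
# Invented illustration: same networks in every build, new behavior gated.

def run_network(name: str, frame) -> str:
    return f"{name}-prediction"  # stand-in for real NN inference

def legacy_controller(preds) -> str:
    return "lane keep + Navigate on Autopilot"

def city_streets_controller(preds) -> str:
    return "FSD beta city driving + new visualizations"

def plan(frame, config: dict) -> str:
    preds = {n: run_network(n, frame) for n in ("lanes", "objects", "intersections")}
    if config.get("fsd_beta_enabled", False):  # early-access builds flip this on
        return city_streets_controller(preds)
    return legacy_controller(preds)

print(plan(frame=None, config={"fsd_beta_enabled": True}))
```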

Regarding the late disengagement where the FSD beta steered toward the oncoming traffic lane: I've had plenty of cases even with current Autopilot where it will correct itself as it gets closer to the wrong lane. Most of the time it's on relatively empty streets, so I let it do its thing to see if it would correct itself. One could imagine a shadow mode trigger detecting last-second Autopilot corrections (rough sketch below), so avoiding too early of a disengagement could potentially be useful. But then again, maybe that's why I wasn't selected for "safe driver" early access… :p
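A minimal sketch of what "detect a last-second correction" might mean, assuming you had access to steering rate and lateral position in the lane (all signal names and thresholds here are made up, not real Autopilot internals):

```python
# Hypothetical trigger: flag a sharp steering reversal made while the car
# is already close to the lane edge. Thresholds are illustrative only.

def last_second_correction(steering_rate_dps: float,
                           lateral_offset_m: float,
                           lane_half_width_m: float,
                           rate_thresh_dps: float = 45.0,
                           edge_margin_m: float = 0.3) -> bool:
    """Return True when a sharp correction happens near the lane boundary."""
    near_edge = (lane_half_width_m - abs(lateral_offset_m)) < edge_margin_m
    sharp_correction = abs(steering_rate_dps) > rate_thresh_dps
    return near_edge and sharp_correction

# Example: drifted to within 0.2 m of the edge, then a 60 deg/s correction.
print(last_second_correction(steering_rate_dps=-60.0,
                             lateral_offset_m=1.6,
                             lane_half_width_m=1.8))  # True
```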
 
Yeah, you're saying that the purpose of the beta is to determine how safe the beta is when tested by select Tesla owners with a random sampling of skill levels.
That seems contrary to what Tesla is saying, though, and it seems like it could be detrimental to the ultimate goal of achieving driverless operation.

If the only goal is to reach driverless operation, your logic is fine. Since the driver is never involved (and doesn't exist), keep him out of the equation, and use trained testers to monitor the car during development. That's Waymo.

But that is not Tesla's only goal... they also want to ship FSD as a driver assist package (regardless of what Elon tweets). So they need to understand how that package works when supervised by actual drivers.

Let's take a (hypothetical, but not absurd) example. The car can probably safely drive past a parked car with only a few inches of clearance, since it knows far more accurately where it is than a human does. Tesla explains that to the specially trained drivers, who, after a few panic attacks when seeing it happen, settle down and take it for granted, admiring the car. Then they ship the software, and ordinary drivers freak out when they see this "scary" behavior, grab the steering wheel, swerve over and hit another car. And Tesla gets blamed.

So, you need to tune the car to keep average drivers happy, which means you have to test with average drivers.
 
But, aren't 90% of all drivers above average? :)

I certainly am. ;)

What I don't get is WHY the car would be allowed to use up the margin for error... whether on the part of another driver, a pedestrian, or even the FSD system itself.

A car door on a parked car could open just a fraction... or more, by a kid or an elderly person... or heck, anyone not concentrating, and blam!

And honestly, knowing how poorly the car warns me when I'm closer than a foot to any object, I don't trust that FSD is really making a sensible call.

In the UK I believe it is law that you allow 1.5m (around 5') when you pass a cyclist.

Especially during these provisional beta tests on public streets, if the car is capable of such precise control, shouldn't it be precisely leaving a larger margin for error?
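As a toy illustration of what "leaving a larger margin" could mean in code (the function, the per-class numbers, and the idea of clamping a planned clearance are all assumptions for this sketch, not how Autopilot actually plans; the 1.5 m figure is just the UK cyclist guidance mentioned above):

```python
# Toy sketch: never plan a lateral clearance below a per-class minimum,
# even if a tighter pass is geometrically possible. Values are illustrative.

MIN_CLEARANCE_M = {
    "parked_car": 0.75,  # assumed comfort margin, not a Tesla spec
    "cyclist": 1.5,      # roughly the UK passing guidance
    "pedestrian": 2.0,
}

def required_clearance(obj_class: str, planned_clearance_m: float) -> float:
    """Return the clearance to actually use: the planner's value, floored
    at the per-class minimum margin."""
    minimum = MIN_CLEARANCE_M.get(obj_class, 1.0)
    return max(planned_clearance_m, minimum)

print(required_clearance("parked_car", 0.1))  # 0.75, not "a few inches"
```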
 
Great news for Canada & Norway!
https://twitter.com/elonmusk/status/1323360818336092167?s=19

 
If the only goal is to reach driverless operation, your logic is fine. Since the driver is never involved (and doesn't exist), keep him out of the equation, and use trained testers to monitor the car during development. That's Waymo.

But that is not Tesla's only goal... they also want to ship FSD as a driver assist package (regardless of what Elon tweets). So they need to understand how that package works when supervised by actual drivers.
I suspect that is the case, but the second goal seems like it will distract from the first. They are already generating a huge amount of disengagement data. Using free labor to do so seems much better than the Waymo approach, as long as you can ensure that beta testers are vigilant and skilled.
When I see a driver unable to stop the car from randomly turning the steering wheel 90 degrees while going straight, it makes me think that it is not safe for most actual drivers.
Let's take a (hypothetical, but not absurd) example. The car can probably safely drive past a parked car with only a few inches of clearance, since it knows far more accurately where it is than a human does. Tesla explains that to the specially trained drivers, who, after a few panic attacks when seeing it happen, settle down and take it for granted, admiring the car. Then they ship the software, and ordinary drivers freak out when they see this "scary" behavior, grab the steering wheel, swerve over and hit another car. And Tesla gets blamed.
Even if that is the optimal way to drive (I'm skeptical), it makes the system impossible to monitor. If the car hits a parked car, Tesla will not accept responsibility and the owner will face ridicule on this very forum! If Tesla were to accept responsibility, that would make monitoring the system much easier, as there would be things you wouldn't have to worry about. The vehicle needs to drive in a way that gives the user enough time to correct errors.
 

Impressive job handling some stop signs and turns down some fairly narrow residential streets.

Mind-boggling to me that people don't have two hands on the wheel, or at least one hand in a useful position. Holding the bottom of the wheel like that gives you the least ability to correct any errors.

Also the vehicle is turning the wheels before starting from a stop sign (not good practice because you can get rear-ended and pushed into oncoming traffic), see 2:05 or so for an example. I assume it may do the same thing on protected left turns or when turning on unprotected lefts.

Seems to wiggle the steering wheel a lot. It seems like it must be very unpleasant to ride in the car, in current form! Presumably they'll smooth that out at some point.
 