FSD Beta Videos (and questions for FSD Beta drivers)

I am curious though: whose fault would it be if he did override it with the go pedal but still got hit? Or if he had stepped on the brake a fraction of a second late and let the car poke out so much that the oncoming cars couldn't avoid it? Would that still be the driver, or would there actually be occasions where the car is at fault because it puts the driver in a no-win situation? I think these issues need to be resolved before giving more people access to FSD Beta.

The driver is always at fault.
 

Good short drive. Did a really good job on a roundabout where it properly entered without stopping and went around fairly seamlessly. It might have helped that it had a lead car. One disengagement where the car was in a lane that was merging into the right lane. It probably could have merged in, but the car in the adjacent lane that the Tesla needed to merge into decided to be an ass and started accelerating to cut off the opening in the lane, so Chuck had to manually merge over himself.
 
The driver is always at fault.
I doubt that will always be the case.

If FSD, or indeed another passenger of the car, exerts control over the car in a manner that the driver cannot anticipate, the driver might not be found fully responsible. For example, if you're using FSD on a straight road and FSD applies a full-deflection steering wheel rotation for no reason and with no warning, causing a crash, it would not be reasonable to expect the driver to have been driving with a death-grip on the wheel just in case FSD were to do that.

It is going to be tricky law, but FSD cannot just do whatever "it" wants with all responsibility resting on the driver. There remains the possibility of malfunction, product defect, bad programming, or whatever. Nor is the driver able to fully blame FSD. It may depend on the situation.

Here is a somewhat relevant case from Canada where a non-driver caused a crash. The passenger reached over, grabbed the wheel, and caused the crash. She was thus found to be the "driver" at that moment.

[42] I find that (the driver) did nothing wrong and was not negligent in her operation of the vehicle that night. Specifically, she was not impaired; she was not speeding; notwithstanding her novice driver’s licence, she had the proper degree of skill and experience to operate the Jeep; she was attentive and alert; she did not allow the Jeep to wander from its proper course on the highway; and she could not have anticipated that (the front seat passenger) would do something so foolish as to grab the steering wheel and jerk it to the right….

[51] When (the front seat passenger) grabbed the steering wheel, she exerted an effort to control the Jeep’s trajectory. As such, she was, for a brief period of time, “driving” the Jeep by moving the steering wheel, and she was, for an equally brief period of time, “operating” the Jeep by inputting some control over its steering function.


[52] For those reasons, I find that just before the Jeep went off the road, both (the driver) and (the front seat passenger) were driving it. (The front seat passenger's) efforts were unwelcome and unhelpful, not to say outright dangerous, while (the driver's) efforts were blameless.


Where will responsibility lie for an FSD crash? It might depend on whether the driver could or should have been able to react and prevent the accident. Some things are probably beyond the driver's ability to prevent, notwithstanding any attempt by Tesla to claim a disclaimer covers all cases of product defect.
 
I agree. Tesla should only widely release FSD when its failures are somewhat predictable. At this point, it doesn't seem so. We'll see with 10.1.

Elon says that most people aren't aware this technology exists, so wide release will help with that. I don't see the point of FSD publicity. Release it when it's safe and predictable, like current AP.

But then again, what do I know. Tesla has done bold things with AP in the past and gotten away with it.
 
Looks like street signs aren't a strong signal (or don't exist as a feature yet?) for the Lanes network, resulting in a confident path to the right, as it predicted a right-turn-only lane (ignoring the clearly visible sign, the left turn signal, and the left-turn navigation):
turn signs.jpg


The road markings are features for the neural network, but that doesn't help when the paint for both arrows is faded:
faded arrows.jpg


AI Day had multiple mentions of street signs, and the auto-labeling presentation did show signs colored differently, so most likely this is already in the works... for 10.1?
 
Another Lanes network misprediction: it wants to make a left, ignoring the overhead no-left-turn sign. I wonder if incorporating signs will include both allowed and disallowed turns at the same time. I suppose it does make some sense to train both together, as the neural network might accidentally pick up on the black arrow of turn signs without realizing the red circle with a slash actually indicates not-allowed (a rough sketch of such a label scheme is below).

no left.jpg
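To make that concrete, here's a minimal sketch of what a joint allowed/disallowed label scheme could look like. The class names and helper are purely hypothetical; nothing here is Tesla's actual taxonomy:

```python
from enum import Enum

# Hypothetical label scheme: pairing each arrow direction with an
# allowed/prohibited flag forces the network to learn the red
# circle-and-slash, not just the black arrow shape.
class TurnSignClass(Enum):
    LEFT_ALLOWED = 0        # plain black left arrow
    LEFT_PROHIBITED = 1     # left arrow under a red circle with slash
    RIGHT_ALLOWED = 2
    RIGHT_PROHIBITED = 3
    STRAIGHT_ALLOWED = 4
    STRAIGHT_PROHIBITED = 5

def is_turn_legal(sign: TurnSignClass, intended_turn: str) -> bool:
    """True unless the detected sign prohibits the intended turn."""
    return sign.name != f"{intended_turn.upper()}_PROHIBITED"

# e.g. is_turn_legal(TurnSignClass.LEFT_PROHIBITED, "left") -> False
```

If the allowed and prohibited variants were separate training tasks, the arrow shape alone would dominate the signal; training them as sibling classes makes the red annulus the only feature that separates them.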
 
Another Lanes network misprediction: it wants to make a left, ignoring the overhead no-left-turn sign.

I don't think there is any takeaway from that moment due to all the disengagements in a row, and the routing information had the car doing a left up ahead.

The navigation itself should never direct the driver (regardless of human or robot) to do a left where a left isn't legal.
 
I don't think there is any takeaway from that moment due to all the disengagements in a row
The initial disengagements were because it kept wanting to switch into the right lane to "follow route", probably because the Lanes network predicted the left lane as left-turn-only and the right lane as straight & right-turn. This matches up with the behavior of it "committing" to the left turn when it got to the intersection, because it really thought the lane was left-turn-only instead of going straight. Although it's quite odd that the multiple clear road markings for the straight lane and the right-turn-only lane were not strong enough signals. At AI Day, Karpathy described a space-based queue with the slide showing "e.g., Push Every 1 Meter" (sketched below), so it seems like the road features here should have been available as an input. It probably just needs more training data to boost the signal.

push queue.jpg
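For what it's worth, here's a minimal sketch of how a "push every 1 meter" space-based queue might work, assuming odometry provides distance traveled. This is just my reading of the slide, not Tesla's implementation:

```python
from collections import deque

class SpaceBasedQueue:
    """Cache intermediate features keyed by distance traveled rather
    than time, so stopped traffic doesn't flood the cache with
    identical frames. A guess at the mechanism from the AI Day slide."""

    def __init__(self, push_interval_m: float = 1.0, maxlen: int = 100):
        self.push_interval_m = push_interval_m
        self.buffer = deque(maxlen=maxlen)   # oldest entries fall off the back
        self._last_push_odometer = None

    def update(self, odometer_m: float, features) -> None:
        # Push only once per meter of travel; while the car is stationary
        # (e.g., waiting at a light), nothing is pushed, so features such
        # as lane arrows passed 20 m back remain available.
        if (self._last_push_odometer is None
                or odometer_m - self._last_push_odometer >= self.push_interval_m):
            self.buffer.append((odometer_m, features))
            self._last_push_odometer = odometer_m
```

The point of keying on distance rather than time is that those faded lane arrows, crossed well before the stop line, would still be sitting in the cache however long the car waits at the light.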


The navigation itself should never direct the driver (regardless of human or robot) to do a left where a left isn't legal.
It did know there was an upcoming left, and it's hard to tell from the recording, but the system likely understood and displayed "300 ft" -> 200 -> 100, then "Now" when it was actually at the correct place to turn. But yes, potentially it was confused by these two close intersections, thinking this earlier one was the place to turn left.
 
Absurd metrics about setting off forward collision warnings. You mean like when, in some stop-and-go traffic, vision-only can't determine that cars are slowing ahead due to the crappy pixel-magic calculations it uses, so it slams the brakes on and FCW goes off? So I get penalized for their sh$&ty vision system trying to save itself, as a metric used against me? What a joke. It goes on and on with these scenarios where the car is its own worst enemy and works against the very metrics they have decided to use. Easy to avoid? Yeah, I guess "don't use AP and drive like grandma manually" is the answer. The car will sabotage itself and ruin the metrics.

Should be hilarious to see who actually gets the expanded beta, if it happens.
 

I really like the format this guy uses, where he just puts in clips of the interesting parts. This does open him up to possible criticism of cherry-picking, but he doesn't seem to shy away from showing the good or the bad (in this video the car makes a horrible left turn at a three-way intersection he's never had a problem with before, swerving into the opposite lane, fortunately with no oncoming cars). I'm honestly not sure how warranted the praise he has for the last "maneuver" in the video is. He thinks the car waited for the car in front of him to clear the intersection before moving, but I doubt it. I've seen too many other times in other videos where the car follows closely, potentially causing gridlock if the light changed.
 
Ha ha ha. Wow, this is the quality of map data that FSD needs to deal with:
bad map.jpg


The blue map navigation line is drawn from Tesla's aggregated/OSM-based maps, while the gray lines are from Google Maps. The red triangle and FSD visualization show that GPS and cameras agree the street was just passed, but Tesla's map data believes it's still ahead. Estimating the mismatch based on the parking lot aisles off the previous street mapped on Google Maps, that's about a 100-foot difference (a quick sanity check of the math is below).

FSD is trying to work with maps that aren't even precise to 30 meters! (That's 3 orders of magnitude off from centimeter-level accuracy. :p)
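For scale, here's a quick haversine check of what a mismatch like that means in meters. The coordinates below are made up to reproduce a ~30 m (~100 ft) offset, since the real ones aren't visible in the video:

```python
import math

def offset_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two WGS84 points (haversine)."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Hypothetical points ~30 m apart along a street, standing in for the
# GPS fix vs. the mapped intersection position.
gps_fix = (37.39000, -122.06000)
mapped  = (37.39027, -122.06000)   # ~30 m further north
print(f"map offset: {offset_m(*gps_fix, *mapped):.0f} m")   # -> ~30 m
```

A 30 m error versus the roughly centimeter-level precision that HD-map proponents talk about is indeed a factor of a few thousand, i.e., 3+ orders of magnitude.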
 
FSD is trying to work with maps that aren't even precise to 30 meters!
But it was good that it followed what it actually saw vs. what the bad on-board maps told it. Which does tell us that it doesn't really rely on maps for its driving decisions.
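If that's the right reading, the arbitration might look something like the sketch below: defer to vision whenever the map disagrees beyond some tolerance. The 15 m threshold and the function shape are pure assumptions on my part, not anything Tesla has published:

```python
from typing import Optional

# Assumed tolerance for vision/map disagreement, not a known Tesla parameter.
MAX_MAP_TRUST_M = 15.0

def pick_turn_point(vision_dist_m: Optional[float], map_dist_m: float) -> float:
    """Choose the distance to the upcoming turn from vision vs. map data."""
    if vision_dist_m is None:
        return map_dist_m                   # nothing detected yet: fall back to the map
    if abs(vision_dist_m - map_dist_m) > MAX_MAP_TRUST_M:
        return vision_dist_m                # map is way off (here ~30 m): trust the cameras
    return min(vision_dist_m, map_dist_m)   # rough agreement: take the conservative value
```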