It's happening....FSD v9 to be released (to current beta testers)

Part of the problem is that Elon is misusing software development terminology:

Alpha testing (what is happening now) is carried out by testers who are internal employees of the organization. The main goal is to identify the tasks that a typical user might perform and test them. Opening testing to a handful of non-employees does not make it beta testing.

Beta Testing is performed by "real users" of the software application in the "real environment" and can be considered a form of external User Acceptance Testing. It is the final test before shipping a product to customers. Direct feedback from customers is a major advantage of Beta Testing, and it exercises the product in the customer's environment.
 
It always seems they're reacting to events they see outside the car. I guess I'm just assuming that, because I think I would have trouble focusing on both the real world and "the mind of the car" at the same time. I suppose the beta testers may have the innate ability to do that.
I have seen them comment on the path displayed on the screen while stopped but it's not clear how helpful it is to them.
So imagine there was no car display at all. What would the testers do? With no knowledge of the intent or state of the car's awareness, they would have to look everywhere just in case (and probably end up so fatigued in a short time that they would give up). But in fact, they can (and do) glance at the screen as the car (say) stops at a red light, and yes, they can see it intends to (say) turn left into the first left lane, which allows the tester (and, later, everyone) to focus on the car doing that maneuver correctly (and watching for appropriate traffic hazards). Also, the display provides reassurance that (say) the car has seen a pedestrian walking across the crosswalk in front of the car.

I agree that dividing your attention is probably something to be careful about, but in reality we already do it all the time (checking mirrors, speed, and nav maps). My guess is after a short time it gets to be automatic, and a quick glance will tell us "all is ok with the car".

It would be an interesting test (though irresponsible given the current state of the beta) for someone to cover the display and let the car drive. I suspect all the testers would be very uncomfortable with this.
 
So imagine there was no car display at all. What would the testers do?

I think it would be just fine. There are so many cues - steering wheel movement, acceleration, etc., that as long as you are alert, I think not using the screen at all would be fine (there's still a substantial lag on the screen anyway). There is so much time to react if you are alert. The steering wheel is not turning unless you permit it to, and even acceleration can be nearly immediately stopped (less than a second)!

It's certainly nice to see what the car is planning to do (for the planning that you mentioned) and that could "take the edge off," but I don't think it's necessary to observe it as a driver since you have so much input when you're fully connected to the vehicle with your feet and hands and rear end.

Probably more useful ultimately as a passenger in the hypothetical driverless world where the passenger is watching the car's planning, just to provide some feedback and confidence.
 
I think it would be just fine. There are so many cues - steering wheel movement, acceleration, etc., that as long as you are alert, I think not using the screen at all would be fine (there's still a substantial lag on the screen anyway). There is so much time to react if you are alert. The steering wheel is not turning unless you permit it to, and even acceleration can be nearly immediately stopped (less than a second)!
I respectfully disagree. Your observation is in real-time. The "mind of the car" gives you a proactive view of what the car is "thinking" of doing (like the path lines).
 
I respectfully disagree. Your observation is in real-time. The "mind of the car" gives you a proactive view of what the car is "thinking" of doing (like the path lines).
I'm not saying there's no value in it (for exactly the reasons you and others suggest). I'm just saying it's not necessary for safety. It seems to me it is more necessary for reducing anxiety and improving peace of mind - and also understanding (to some extent) "why" the car did what it did.

If the car isn't doing what you want or expect, as the driver of the vehicle, you just intervene immediately. You don't need the visualization for that.
 
I'm not saying there's no value in it (for exactly the reasons you and others suggest). I'm just saying it's not necessary for safety. It seems to me it is more necessary for reducing anxiety and improving peace of mind - and also understanding "why" the car did what it did.
But I am suggesting it is necessary for safety. For the reason stated. Proactive vs reactive.
 
But I am suggesting it is necessary for safety. For the reason stated. Proactive vs reactive.
I guess I will also respectfully disagree. I think it comes down to two questions: whether a display that lags by roughly half a second to a second is, on average, useful for providing critical information about future behavior, and whether occasionally taking your eyes off the road while driving (which can itself cost you critical information) is worth the extra insight you gain into the system that is aiding you.

As an example, I think it's a safety hazard to take your eyes off the road even when you are sitting stationary in line at an intersection (to use your phone, etc.). You lose key information about the behavior of nearby vehicles, drivers, and pedestrians at the intersection, information that may help prevent a collision when you start moving. You need to be constantly building that map of the position and movement of all objects and making predictions about their future movements.
 
I can't think of any human that would stop in that situation. I also don't think the double yellow was crossed.
Agree completely - if the car had stopped dead in the lane even though there was no oncoming traffic, people would howl that it was a dangerous "flaw" in the program. Doing so would be far more unsafe than veering slightly across the center line just to move around an open door, which 99.99% of us would do.
 
I sit back every weekday and watch it drive. The first week I was holding my hands around the steering wheel but not touching it. The next week I watched very closely and sat forward in the seat without hands on the steering wheel. Eventually I reclined back, and now I just let it drive. I also think the car learns your route, because it seems to have gotten better when the lines on the road are very wide. It used to swerve; now it picks the middle.
 
I sit back every weekday and watch it drive. The first week I was holding my hands around the steering wheel but not touching it. The next week I watched very closely and sat forward in the seat without hands on the steering wheel. Eventually I reclined back, and now I just let it drive. I also think the car learns your route, because it seems to have gotten better when the lines on the road are very wide. It used to swerve; now it picks the middle.
As per Tesla:
Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.

Your description appears to follow the pattern of Normalization of Deviance. At each step you have found that the system did not crash you, so you have learned to trust it. From the first week you should have been holding the steering wheel (which means touching it). Note: Normalization of Deviance does not end well.
 
Part of the problem is that Elon is misusing software development terminology:

Alpha testing (what is happening now) is carried out by testers who are internal employees of the organization. The main goal is to identify the tasks that a typical user might perform and test them. Opening testing to a handful of non-employees does not make it beta testing.

Beta Testing is performed by "real users" of the software application in the "real environment" and can be considered a form of external User Acceptance Testing. It is the final test before shipping a product to customers. Direct feedback from customers is a major advantage of Beta Testing, and it exercises the product in the customer's environment.


Ehhhhh there's a lot of abuse of these terms.

As far as I'm concerned, Alpha is incomplete: substantially missing features, crash-prone, lots of temp assets, etc. Beta should theoretically be able to do everything within the design spec, but is likely buggy.

Beta would frequently have several phases of testing: starting with internal QA, then opening to a small private beta group (which would usually be NDA'd so hard that if they posted a screenshot we'd take half of their teeth at random, their favorite chair, and the contents of their fridge; the idea of acting as an unpaid marketing team, like Tesla has their testers do, would never be taken seriously), and then maaaaaaybe having an open beta if we need to stress-test things with real-world users hitting servers.

I do agree that this is not a beta in any traditional sense, though. This would be a mid-late Alpha of the autosteer on city streets DLC, with unpaid contractors :p
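
To make the gating concrete, here's a throwaway sketch of the pattern I mean (made-up names, purely illustrative, and obviously nothing to do with how Tesla actually pushes builds): each phase is just a wider answer to "who gets the build?"

from enum import Enum

class Phase(Enum):
    INTERNAL_QA = 1    # alpha / internal testing
    PRIVATE_BETA = 2   # small NDA'd cohort
    OPEN_BETA = 3      # anyone who opts in

CURRENT_PHASE = Phase.PRIVATE_BETA
EMPLOYEES = {"qa_alice", "qa_bob"}       # hypothetical internal QA accounts
PRIVATE_BETA_USERS = {"tester_123"}      # hypothetical early-access cohort

def build_enabled(user_id: str) -> bool:
    """True if this user should get the test build in the current phase."""
    if CURRENT_PHASE is Phase.INTERNAL_QA:
        return user_id in EMPLOYEES
    if CURRENT_PHASE is Phase.PRIVATE_BETA:
        return user_id in EMPLOYEES or user_id in PRIVATE_BETA_USERS
    return True  # open beta: everyone who opts in

print(build_enabled("tester_123"))    # True during the private beta
print(build_enabled("random_owner"))  # False until the open beta phase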
 
I sit back every weekday and watch it drive. The first week I was holding my hands around the steering wheel but not touching it. The next week I watched very closely and sat forward in the seat without hands on the steering wheel. Eventually I reclined back, and now I just let it drive. I also think the car learns your route, because it seems to have gotten better when the lines on the road are very wide. It used to swerve; now it picks the middle.
The car does not learn... at least not in the sense of AI/NN learning; that training is done entirely at the factory. Some nav systems (though I don't think this is true of Tesla) do try to remember route preferences if you divert from the nav route, but otherwise there is no learning going on.
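
To put it another way (a rough sketch of the general pattern, not anything resembling Tesla's actual stack): the network in the car runs with frozen weights, so nothing it sees on your commute changes its behavior. Only a new set of weights delivered in a software update does.

import torch
import torch.nn as nn

# Stand-in for the perception/planning network shipped with a firmware build.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
# model.load_state_dict(torch.load("weights_from_the_factory.pt"))  # hypothetical checkpoint name
model.eval()                        # inference mode only

frame_features = torch.randn(1, 16)     # placeholder for per-frame camera features
with torch.no_grad():                   # no gradients, no optimizer, no on-car learning
    plan = model(frame_features)
# Training (gradient descent over fleet data) happens offline; the car's behavior
# only changes when new weights arrive in a software update.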
 
The rationale here (I think) is that (at least in beta) it's more important to understand what the car is intending to do, since it may be intending to do something stupid. Sure, the map tells you where you might need to go, but not driving into a concrete wall is probably a bit more important.

I think the point of "mind of car" is to serve as a reality check, so the driver (at least in the early stages) can determine if the car's idea of the world is in fact correct. It serves as feedback on what the car sees and how it will react. For example, in some of the videos I've seen from the few who have it, the "snake" (the line in front of the car that shows where it plans to go) has led them to disable the current FSD session since they could see the car was planning to do the wrong thing (make a left turn from the wrong lane in one case).
My comment (as was at least one other's) was for people who do *not* have FSD but have the v3 computer. I do not want that junk polluting my screen.