Welcome to Tesla Motors Club

Autopilot - What's a Valid Road Line to Follow?

It would make things so much easier.

Bro, your horse image is cool. You know we could automate a Jockey... way easier than a car.

Full Self Riding - FSR (not Force Sensing Resistor)
  • L/R Whips, Reins and Kick control
  • 110 lbs cyclic motion (stride balancing)
  • "Ya - Ya" Horse Headphones
  • 3 Cameras (for lane changes)
  • No emergency braking needed!
  • Biometrics (heart rate, temperature, fluids)
  • Speedometer, Accelerometer
  • Hoof audio and motion performance analysis (4)
Done!
 
Or that they don't understand human learning (HL) and don't consider it as part of the overall safety equation.

If a tire sensor tells me my air is low, I don't need to know HOW it knows. Machine knowledge runs deep, and we thought reverse-engineering assembly language was hard (notice I didn't use AL). We still have no clue how our brains work; we only know that they do. See my point?

If you're in EAP (on the freeway of course) and someone cuts you off, should you:
  1. Hit the brakes?
  2. Let off the accelerator and change lanes?
  3. Let EAP perform the safest maneuver?
What if I told you that hitting the brakes disengaged EAP (but you didn't know that)? Would that change your response?

Then what if only the front cameras actually worked, and EAP would not (yet) know if a car was in your blind spot?
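To make the cut-off scenario concrete: if the driver doesn't know that braking disengages EAP, option 1 silently hands control back to them. Here's a toy sketch of that handoff logic — purely illustrative, not Tesla's actual state machine, and all names are made up:

```python
# Toy model: who is driving after a given driver input while EAP is engaged?
# The point is that "brake" silently returns control to the human,
# which matters only if the human knows it happens.

def autopilot_state(engaged, driver_action):
    """Return 'autopilot' or 'manual' after a driver input (hypothetical)."""
    if not engaged:
        return "manual"
    if driver_action == "brake":
        return "manual"  # braking disengages -- the driver may not realize this
    if driver_action == "steer_override":
        return "manual"  # wheel torque also hands control back
    return "autopilot"   # e.g. lifting off the accelerator leaves EAP engaged

print(autopilot_state(True, "brake"))  # manual -- the human is now driving
```

So option 1 isn't just "slow down"; it changes who is in control, which is exactly the kind of thing training would cover.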

See what I'm saying? Tesla is not considering the total safety equation; it keeps the human side ignorant, especially in these risky situations. But then again, "who needs training?" It's just an added cost. Eventually, yes, that will be true, once the human response no longer matters. But during the migration, while control is shared between human and machine, I completely disagree!
Not that I think they don't care, but they are definitely a "throw out this feature and see what happens" kind of company. They don't seem to have a real guiding principle. If their guideline were "teach the computer, and teach the human to work with the computer," we might see that sort of feedback.
 

Ya, we hear that loud and clear from S owners on the V9 split-screen thing. I don't know the details; I have an M3 here. I just don't like that approach when it comes to Autopilot. How did Pilot to Nav get as far as beta before they said "that's really hard"? Really?
 
The owner's manual warns that Autosteer can struggle with cracks and similar markings in the road.
 

Thank you for making my point. But it still has trouble? Then the algorithm is still flawed and isn't actually looking for what we see as a valid line. Or maybe it's a plain lack of processing power. It's a very simple, quite repeatable shape, so they obviously aren't setting the bar very high on line criteria.

And to my other point, the car also knows where the other cars are (and their distance). So why would a line suddenly move 3 ft to the right in an instant when the cars didn't? You have to look at both, IMO. And when there is no line, or a line appears where you wouldn't expect one on the road, consider it but don't conclude without further analysis of the circumstances. The car knows the usable road surface: find the center and stay right of it. No road and no line? Slow down or stop.
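The cross-check above is easy to sketch: gate a new lane-line estimate on whether the tracked cars moved the same way. This is a minimal illustration with made-up names and thresholds, not anyone's production logic:

```python
# Hypothetical sanity check: a real lane line can't teleport sideways
# between frames unless everything else in the scene shifted with it
# (i.e., the camera/estimate moved, not the world).

def plausible_line_offset(prev_offset_ft, new_offset_ft, car_shifts_ft,
                          dt_s, max_lateral_rate_ft_s=2.0):
    """Accept a new lane-line lateral offset only if the jump is physically
    plausible, or if the tracked cars appeared to jump the same amount."""
    jump = new_offset_ft - prev_offset_ft
    if abs(jump) <= max_lateral_rate_ft_s * dt_s:
        return True  # small, physically plausible drift
    # Large jump: trust it only if every tracked car shifted the same way,
    # which would mean our own reference frame moved, not the line.
    return all(abs(shift - jump) < 0.5 for shift in car_shifts_ft)

# A 3 ft jump in one 50 ms frame while neighboring cars held position:
print(plausible_line_offset(0.0, 3.0, car_shifts_ft=[0.0, 0.1], dt_s=0.05))
# -> False: distrust the "line", keep the previous estimate
```

A low-confidence result here wouldn't mean "swerve"; it would mean "hold the last good estimate and look harder."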

Folks, I'm not new to AI (least minimums, etc.), but I'm no expert either. I just see myopic vision here, not taking other things into account. What about a second brain overseeing the first with more general rules — experience either trained in directly as rules or learned through supervised review of pass/fail cases? "That's a line, that's not..." We're talking about a repeating rectangle; what am I missing? A college kid could write Python code to find the lines on that road... seriously. Maybe the algorithm needs to be flexible enough to analyze deeper as confidence drops: "study" it further. But the code needs to be there, and the system has to keep up with all the other tasks. Interrupts and high frame rates get in the way, so they have to balance it, is my thinking.
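On the "repeating rectangle" point: a dashed line sampled along the direction of travel is just a periodic on/off signal, and autocorrelation finds that period — the peak height even doubles as the confidence score that could trigger a deeper pass. A minimal sketch, assuming a pre-extracted binary strip of pixels where the line should be (all names illustrative):

```python
import numpy as np

def dash_confidence(strip):
    """strip: 1D array of 0/1 samples taken along the road where a dashed
    line is expected. Returns (period, confidence), where confidence is the
    normalized autocorrelation peak in roughly 0..1."""
    s = strip - strip.mean()
    ac = np.correlate(s, s, mode="full")[len(s) - 1:]  # one-sided autocorr
    ac = ac / (ac[0] + 1e-9)                           # normalize by lag 0
    lag = int(np.argmax(ac[2:]) + 2)                   # first strong repeat
    return lag, float(ac[lag])

# A clean dashed line: 6 samples of paint, 6 of gap, repeated 8 times.
dashes = np.tile([1] * 6 + [0] * 6, 8).astype(float)
period, conf = dash_confidence(dashes)
print(period, conf)  # period 12; confidence well above random clutter

# A tar strip or crack would give a weak, unstructured peak instead,
# and that low score is exactly when the deeper "study it" code should run.
```

This is the cheap first pass; the point isn't that it's production-grade, but that the repeating shape really is easy to score, so a confidence-gated second look is computationally realistic.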

They'll get there eventually; I'm just sayin' this looks simple. Maybe that V9.0 comment will solve this. We'll see soon (I hope).