
TACC/Autopilot discussion moved from general thread

Totally agree. There's no reason to consider FSD at this point, but it would be crazy not to buy EAP considering where we're likely to be in just a few years.
The reason I bought FSD is that I want to watch Tesla's baby driving AI grow up. People who wait until it works won't get to see that. It will (possibly) be fascinating, well worth the price of admission. Or it might happen so fast it's not really visible on our time scale. At this point we have AlphaZero learning to be better than any human at Go in four days, and at chess in four hours. So how long can self-driving take? Remember, that's the training time, so we're looking at far less than the interval between two software updates to my car.

So the real question is: if we apply similar technology and try to write the rules of the "driving game", just how much more complicated than chess are those rules? How does the training time scale with the complexity of the rules? And how long did it take AlphaZero to get better than the average human, rather than any human?

My expectation is that FSD will just suddenly be working one day, better than human, and it will just be a matter of changing regulations and deploying it, perhaps after a few months in shadow mode. As Elon has pointed out, if it works reasonably well it would be a terrible thing not to deploy it as soon as possible.
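To make that question concrete: AlphaZero-style self-play needs the game's rules packaged behind a tiny interface. Here's a minimal sketch in Python; the class and method names are my own illustrative choices, not from any real AlphaZero codebase:

```python
from abc import ABC, abstractmethod

class Game(ABC):
    """Perfect-information game: chess and Go fit this interface exactly."""

    @abstractmethod
    def legal_actions(self, state):
        """Return the finite, enumerable set of moves available from `state`."""

    @abstractmethod
    def next_state(self, state, action):
        """Deterministic transition: the rules fully define the result."""

    @abstractmethod
    def outcome(self, state):
        """Exact win/loss/draw signal at terminal states."""
```

"Writing the rules of the driving game" means filling in those three methods for driving, where the action space is continuous, transitions involve physics and other agents, and the outcome signal is sparse and noisy. That gap, more than raw rule count, is where the chess analogy gets strained.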
 
So the real question is: if we apply similar technology and try to write the rules of the "driving game", just how much more complicated than chess are those rules?
I'd say a lot more complicated, bounded only by the laws of physics. The possible unexpected events in driving are endless; no deer ever jumps across a game board during play.
 
No mention of the NTSB/NHTSA investigation opened after a Tesla with Autopilot engaged crashed into a fire truck?
If these incidents continue it’s only a matter of time before someone gets hurt really badly. Sooner or later someone will be standing at the rear of the stopped vehicle (like a fireman grabbing a tool to fix the car) and be crushed between the Tesla and the stopped vehicle.
 
If these incidents continue it’s only a matter of time before someone gets hurt really badly. Sooner or later someone will be standing at the rear of the stopped vehicle (like a fireman grabbing a tool to fix the car) and be crushed between the Tesla and the stopped vehicle.
Not really due to electric cars or Autopilot, though. Emergency personnel are always in harm's way; incidents probably happen every single day. It's a people thing.
 
I'll just remind people that I said this problem was a lot harder than most people thought. Nobody's close to full self-driving. However, at the moment the Cruise exec seems to have the best *attitude* (Tesla has a bad attitude), which means Cruise will probably get there first. A good attitude requires proper respect for the ludicrous complexity of the environment encountered by an automobile, and a dedication to ferreting out all the myriad corner cases. Once the problem is *well-defined*, then it can be solved. Tesla has the largest dataset but I'm pretty sure they aren't analyzing it properly because their attitude is wrong.
 
I'll just remind people that I said this problem was a lot harder than most people thought.
Yes, over and over.

Nobody's close to full self-driving.
Yes, that does seem to be your opinion. Others disagree. Or perhaps it's just a failure to define terms.

Once the problem is *well-defined*, then it can be solved. Tesla has the largest dataset but I'm pretty sure they aren't analyzing it properly because their attitude is wrong.
Somehow people drive, despite it being an ill-defined problem (according to you). AIs are now doing things that people do, by training. They seem to learn much the way people do: do it over and over and improve over time. All you need is a problem suited to fast iteration, and a way to judge whether a decision led to a better or worse outcome. There is no reason to think AIs haven't developed to the point where self-driving is a solvable problem. If it is, then it will be solved surprisingly quickly.

The evidence is that even fairly stupid humans can drive acceptably, while it takes geniuses and years of training to play go and chess well. So on some axis, driving is a much simpler problem. I don't see any reason why we should think that it won't be solvable for AIs. If they can do well at reading comprehension, they can do well at driving.

But here we are, stupid humans reasoning by analogy. Elon's superpower is to reason from first principles and come up with better, faster, impossible solutions. So I'm willing to believe him when he says it is a "solved problem" and have some confidence that the solution will be deployed "soon".
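The "do it over and over and judge the outcome" loop described above can be sketched in a few lines. This is a toy perturb-and-compare hill climber, purely illustrative; the function names and the `simulate` callback are assumptions, not anyone's actual training pipeline:

```python
import random

def improve(policy, simulate, episodes=10_000, noise=0.1):
    """Hill-climb a list of parameters by perturb-and-compare."""
    best_score = simulate(policy)
    for _ in range(episodes):
        # Try a slightly perturbed version of the current policy...
        candidate = [p + random.gauss(0, noise) for p in policy]
        score = simulate(candidate)
        # ...and keep it only if the outcome improved.
        if score > best_score:
            policy, best_score = candidate, score
    return policy
```

The catch for driving is the `simulate` argument: board games hand you a fast, cheap, perfectly faithful simulator for free, and the road does not.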
 
The driver of the Tesla is my dad's friend. He said that he was behind a pickup truck with AP engaged. The pickup truck suddenly swerved into the right lane because of the fire truck parked ahead. Because the pickup truck was too high to see over, he didn't have enough time to react. He hit the fire truck at 65 mph, and the steering column was pushed two feet inward toward him. Luckily, he wasn't hurt. He fully acknowledges that he should've been paying more attention and isn't blaming Tesla. The whole thing was pretty unfortunate considering he bought the car fairly recently (blacked it out too).
Tesla allegedly on Autopilot hits firetruck with 65mph • r/teslamotors

Sounds to me like he was way too close to the pickup truck for that speed. Whether AP was engaged is still to be proven, as it normally should have reacted.

We've heard a lot of drivers blame AP at first, only for it later to be proven not the case. We'll see.

I have a Model S without AP, and the other day I did a lane change to pass a vehicle going maybe 10 kph below the speed limit. I got about a quarter of the way past when the steering was yanked in the direction of the vehicle, seemingly because the car thought I was going to hit the raised curb on the side of the lane away from the vehicle I was passing. I immediately pulled hard against it, keeping my car straight in its lane and avoiding what I fear would have been a collision. I don't know whether a bird, some oversized bug, or a blade of grass from the raised curb fooled the system, but it was spooky, and it's why you need to have your wits about you even in a car with emergency braking. Maybe it would have corrected again without colliding, but the amount of force I needed to apply makes me doubt that. The front of the car got within 20 cm or so of the other vehicle.

People say that Tesla's AP is flawed by not using LIDAR, but I feel that if people can drive cars safely with two eyes, then eight cameras should work with the right and correctly trained AI.

I put this accident down to some sort of mistake by the AI, compounded by the driver not leaving a safe following distance for the speed. It's possible to set the AEB to an unsafe level; maybe that should be revised so that people are protected from themselves.
 
I have a Model S without AP, and the other day I did a lane change to pass a vehicle going maybe 10 kph below the speed limit. [...] The front of the car got within 20 cm or so of the other vehicle.
Disagreed, because you said you have a Model S without Autopilot, yet you claim that the car took corrective action on the steering wheel.
 
I have a Model S without AP, and the other day I did a lane change to pass a vehicle going maybe 10 kph below the speed limit. [...] The front of the car got within 20 cm or so of the other vehicle.
Personally I think you DID hit the raised curb, and that's what yanked the steering wheel.
 
Anybody know when TACC will respond to the curvature of the road? Sane drivers usually slow down going into a curve and speed up exiting the curve. But not Autopilot, it just plows ahead at the same speed regardless of passengers hurling their lunch.
 
Anybody know when TACC will respond to the curvature of the road? Sane drivers usually slow down going into a curve and speed up exiting the curve. But not Autopilot, it just plows ahead at the same speed regardless of passengers hurling their lunch.
Not sure what you mean. Are you talking about major highways, or maybe a mountain pass? What I've mostly noticed is that it does not recognize a speed limit change. For example, when switching freeways you'll see a slower speed limit posted entering the interchange, and this is not recognized by the IC. Of course, this only happens when you're in the lane that goes into the freeway interchange. I would like these new speed limits to be honored, which I think will be required for on-ramp-to-off-ramp support in EAP.
 
Anybody know when TACC will respond to the curvature of the road? Sane drivers usually slow down going into a curve and speed up exiting the curve. But not Autopilot, it just plows ahead at the same speed regardless of passengers hurling their lunch.
I have a 2015 Model S with AP1. It often slows down going into curves but not always. Sometimes it's downright amazing how well it handles a curve by slowing down going into it and then speeding up on the way out.
 
TACC and autopilot are not the same thing. TACC goes at the set speed unless there is slower traffic in front of you. Autopilot will indeed slow down for curves... sometimes too much. My car doesn't have either, but I've experimented with loaners to try to understand the behavior.
 
Well, in my experience, AP1 is not nearly as responsive to curves as I would like, but I mostly drive on surface streets. I know AP is not intended for surface streets, but that is what I drive. I do wish that Tesla would make progress on that kind of driving. It's been a couple of years since I've noticed any real improvement, but I guess they got tangled up with AP2 issues.
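For what it's worth, the curve behavior people are describing boils down to capping speed so lateral acceleration (v²/r) stays comfortable. A rough sketch; the 2 m/s² comfort limit and all the names here are my own assumptions, not Tesla's actual parameters:

```python
import math

COMFORT_LAT_ACCEL = 2.0  # m/s^2; assumed comfort limit, roughly what passengers tolerate

def max_curve_speed(radius_m: float) -> float:
    """Speed (m/s) at which lateral acceleration v^2/r hits the comfort limit."""
    return math.sqrt(COMFORT_LAT_ACCEL * radius_m)

# Example: a 150 m radius curve gives sqrt(2.0 * 150) ~= 17.3 m/s ~= 62 km/h.
```

Tighter curve, lower cap: halve the radius and the comfortable speed drops by a factor of about 1.4, which matches the "slow in, speed up on the way out" behavior AP1 owners report.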
 
I was just looking at files that had been saved by Sentry Mode, and found that some files from my left front repeater camera are too corrupted to play at all, while others play fine at first and then disintegrate into a pixelated mess. I tried to embed a copy here, but I get a Security Error message.
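One way to triage clips like that is to have ffmpeg decode each file and report any errors it hits. A sketch, assuming ffmpeg is on the PATH, and using a placeholder folder name rather than the actual directory layout on the drive:

```python
import pathlib
import subprocess

def is_playable(clip: pathlib.Path) -> bool:
    """Decode the whole clip to a null sink; any decode error means damage."""
    result = subprocess.run(
        ["ffmpeg", "-v", "error", "-i", str(clip), "-f", "null", "-"],
        capture_output=True, text=True)
    return result.returncode == 0 and not result.stderr

# "SentryClips" is a placeholder -- point this at wherever the files actually live.
for clip in sorted(pathlib.Path("SentryClips").glob("*.mp4")):
    print(clip.name, "OK" if is_playable(clip) else "CORRUPTED")
```

Partially damaged files (the ones that start fine and then pixelate) will usually show a nonzero exit code or a burst of decode errors partway through, which at least tells you which clips are worth trying to salvage.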