Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

AP1 scare

Usually a beta is followed by a final release... since they abandoned further development on AP1, does that mean it will always be in "beta"? Can I get my money back since it's clear it will never be 'finished'?

Crazy things happen even in final release software... however, putting it in a production car (that can kill you), charging $5,000 for it and perpetually calling it "beta" is... bullshit.

Tesla uses the "beta" label to limit their liability.
Tesla will never admit AP1 is abandoned, simply because that would suggest liability for never releasing the non-beta, production version. If this were non-beta, you could go after such an obvious flaw as cars running into clearly visible stationary objects. You think AP beta is BS? What about FSD: what stage is that in, pre-alpha, dreamware, vaporware, slideware?
 
So it seems this is a known issue - but not exactly well known.

If I have it right, basically the hardware (AP1 at least) can't detect a stopped car in the path of travel soon enough at speed to panic-stop if needed. Hence the need to stay engaged and vigilant.

Acceptable trade off for the "convenience" of AP? - still trying to figure that out for myself now ...
 
So it seems this is a known issue - but not exactly well known.

If I have it right, basically the hardware (AP1 at least) can't detect a stopped car in the path of travel soon enough at speed to panic-stop if needed. Hence the need to stay engaged and vigilant.

Acceptable trade off for the "convenience" of AP? - still trying to figure that out for myself now ...

To be clear, this is a problem for *all* adaptive cruise controls.

Actually, AP has a better chance than almost all other adaptive cruise systems; the only one that seems to be consistently better is Subaru's stereo-camera-based EyeSight.

Most ACC will not stop for a car they never saw moving - Tesla might, if the camera recognizes it as a car (which usually works, but as you saw isn't 100%.)

I don't see it as a serious problem, as long as the driver is attentive and knows it can happen. We do certainly need better systems training for drivers these days...
 
How could you expect anything else? That's why you still need to monitor what is going on.

Well yeah, but the problem is that the users are in the dark about the program and its parameters. Instead we are given a feature with limited explanation, and users discover the shortcomings on their own, sometimes with horrible results. Many folks think of AI and autonomous features as intelligent, but the reality is that the features are only as intelligent as the scenarios the programmers correctly predicted. In this case, it appears the programmers didn't correctly plan for a leading car to change lanes with a stopped (or slowed) vehicle ahead, and for the system to slow down to avoid impact.

We've become accustomed to using complex features that we don't understand enough. You're absolutely correct in highlighting the importance of monitoring what is going on, but that's not exactly how folks are using it. So far we've had someone asleep, another guy watching Harry Potter, and another guy who was passed out drunk...and those were just the ones that made headlines. My concern is that some folks will not use the feature responsibly and that creates a hazard for everyone out there. Be safe out there.
 
It seems radar should “see” an object in the path whether or not it is moving. White objects do seem to represent a challenge. In that long-ago fatal crash in Florida, the one in which a tractor-trailer rig made a left turn across the path of the S, the trailer was white, so it wasn’t seen by the car. I understand fixed objects are a problem. When one comes up over a hill with an overpass in the near distance, that would be seen as an obstacle across the road. GPS helps there; those overpasses don’t move. When one travels in the late afternoon, when tree shadows are long across the road, the camera sees the dark patches on the road and can brake to avoid these shadow objects. That’s better now than previously.

I’m sure all the programming and the interface with the GPS to identify overpasses is challenging.

I know my software is perpetually in beta. These issues need to be resolved before they even think about going to release versions, let alone rolling out FSD.

It seems they might approach the creeping-along-in-traffic case by identifying that the car’s speed is well below the posted limit, then using a different algorithm, maybe a lower speed setting that treats any object in the path as a possible obstruction.

That won’t help the parked fire engine hit at full speed issue, though. The GPS should help there. If there’s an overpass, the computer should be aware of it and avoid seeing it as a collision threat. I guess the problem there would be if there’s a parked fire engine under an overpass.

I think Tesla has a long way to go. I’m glad I’m able to use the autopilot beta versions while Tesla works out these problems, it makes the car a lot more interesting to drive. I have high hopes for the video game engines, they’ve been programmed from the beginning to predict and deal with colliding objects. Granted the stakes are a little higher when I’m on one of those potential colliding objects.

Anyway, I think my S with 2.0 hardware will be in beta for a very long time, maybe forever. Even if it goes from beta into release versions, I’ll treat it as still in beta. I can’t speak for the rest of you lot, but I’m far too valuable to meet my grisly end due to a programming error.
 
It seems radar should “see” an object in the path whether or not it is moving.

Radar does see stationary objects just fine... that's not the issue. The issue is that most of these stationary objects get filtered out so your car isn't slamming on the brakes every time it sees an overhead sign, overpass, etc. The problem is when a legitimately stopped vehicle gets filtered out (i.e., mistaken for one of these other stationary objects that should get filtered). GPS-based "whitelists", vision models, etc. all can help... but perfection is really, really hard. If the car correctly categorizes an object even 99.99% of the time, that means 1 out of every 10,000 objects it sees gets miscategorized... might not seem like a problem, until that one object is the stopped car you crashed into ;-)
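The arithmetic in that last sentence is easy to sanity-check. A tiny back-of-envelope sketch (the objects-per-mile figure is a made-up illustrative assumption, not a Tesla number):

```python
# Back-of-envelope check of the 99.99% figure above.
# objects_per_mile is a made-up illustrative number, not measured data.
accuracy = 0.9999
objects_per_mile = 50
miles_driven = 100

objects_seen = objects_per_mile * miles_driven        # 5,000 objects
expected_misses = objects_seen * (1 - accuracy)       # ~0.5 miscategorized
print(f"~{expected_misses:.1f} miscategorized objects per {miles_driven} miles")
```

At those assumed rates that's roughly one miscategorized object every couple hundred miles, and it only takes one of them being a stopped car.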
 
Well it's not perfect but that's why you gotta pay attention. This sort of thing happens with pretty much any active cruise control. I won't even let the car accelerate hard when the car in front changes lanes. I will wait for a second and then disengage immediately or adjust the max speed because you gotta be always one step ahead. Glad you didn't risk it all and get into another AP crash. That would be bad for all of us ;)

I absolutely agree. The active cruise control on my BMW i3, for instance, will specifically not see Dodge Chargers at night, I assume because the tail light is one single shape instead of two. My MS AP1 won’t see cars merging close in front of me at a 45-degree angle.

Thank you for sharing constructively, rather than posting an angry rant :)
 
AP2 pretty much behaves the same way from my experience, it's definitely hit or miss and in these situations needs to be monitored. As others have said, the problem is that it uses radar for object detection.

Some cars have a laser sensor (lidar) on the front, which very accurately detects objects in front of the car, even if stationary.
 
Radar does see stationary objects just fine... that's not the issue. The issue is that most of these stationary objects get filtered out so your car isn't slamming on the brakes every time it sees an overhead sign, overpass, etc. The problem is when a legitimately stopped vehicle gets filtered out (i.e., mistaken for one of these other stationary objects that should get filtered). GPS-based "whitelists", vision models, etc. all can help... but perfection is really, really hard. If the car correctly categorizes an object even 99.99% of the time, that means 1 out of every 10,000 objects it sees gets miscategorized... might not seem like a problem, until that one object is the stopped car you crashed into ;-)

The word “see” has a number of meanings. By “see” I was referring to the process of sensing and then appropriately processing the information. Perhaps I should have used the word “notice”. I’m in agreement with you, the car needs to get it right, and if I’m in the car, it needs to get it right every time.
 
It seems radar should “see” an object in the path whether or not it is moving. White objects do seem to represent a challenge. In that long-ago fatal crash in Florida, the one in which a tractor-trailer rig made a left turn across the path of the S, the trailer was white, so it wasn’t seen by the car. I understand fixed objects are a problem. When one comes up over a hill with an overpass in the near distance, that would be seen as an obstacle across the road. GPS helps there; those overpasses don’t move. When one travels in the late afternoon, when tree shadows are long across the road, the camera sees the dark patches on the road and can brake to avoid these shadow objects. That’s better now than previously.

You're not understanding how this works. There are reasons it doesn't see like you're thinking, which I'll try to explain here. What you're describing is much more typical of something like LIDAR, where you're projecting a laser along a single line, getting a range, then moving to a new line.

Automotive radar is generally continuous-wave radar, in Tesla's case chirped continuous wave. What that means is that the transmitter is putting out a single signal that covers the entire area the radar sees, the whole time. It sees things by Doppler effect and direction finding: objects moving at a different speed than the car reflect the signal at a different frequency than the one transmitted, with the frequency difference proportional to the speed difference. Using multiple receivers and the difference in timing as the shifted signal arrives at each of them, it gets a sense of where the object is relative to it.

The chirping gives the car distance - 20 times per second, the car does a frequency hop, then times how long that hop takes to come back on the doppler frequency. Both the distance and direction are of limited resolution, based on how precisely the car can measure the timing involved.
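To make the two measurements concrete, here's a minimal sketch of the math just described, using simplified textbook formulas and an assumed 77 GHz carrier (typical for automotive radar, but an assumption here, and certainly not Tesla's actual code):

```python
# Simplified textbook radar math for the scheme described above.
# The carrier frequency is an assumption (77 GHz is typical for automotive radar).
C = 3.0e8           # speed of light, m/s
F_CARRIER = 77e9    # assumed carrier frequency, Hz

def relative_speed(doppler_shift_hz):
    """Relative speed in m/s from a measured Doppler shift:
    f_d = 2 * v * f_c / c  =>  v = f_d * c / (2 * f_c)."""
    return doppler_shift_hz * C / (2 * F_CARRIER)

def target_range(round_trip_s):
    """Range in m from the chirp's round-trip time: R = c * t / 2."""
    return C * round_trip_s / 2

# A return shifted by ~5.13 kHz is closing at about 10 m/s,
# and a chirp echo arriving 1 microsecond later puts it about 150 m away.
v = relative_speed(5133.3)
r = target_range(1e-6)
```

The limited resolution the post mentions falls out of these same formulas: how finely you can measure the frequency shift and the echo timing caps how finely you can know speed and distance.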

So the initial approach of all automotive radar was just to throw out anything that wasn't moving (anything whose Doppler shift corresponded to the car's own speed). That's why they couldn't be used to come to a complete stop and start up again. Some of those first-generation designs are still in use, like what Ford offers in the Fusion.

Second generation continues to throw out everything that's stationary, but once the radar locks on to something in front that is moving, it remembers it and keeps track of the distance with the chirps. That's where most cars with ACC are today, able to do full stop-and-go flawlessly as long as the car in front was moving and in range when it got in front.

Tesla added another layer - the car does some sort of sensor fusion (not sure exactly how it's implemented) and compares things the radar sees with things the camera neural net thinks are vehicles - and so if the camera NN recognizes a car as a car, TACC will start responding to it and tracking it with the chirps even though it is stopped and was never seen to move.

Now you've got a soda can pointed at the car, a stopped car, and a big road sign next to the car. So the radar bounces off of all of them, with a doppler shift that says it's stationary. But it comes back as one return, and the signal is mixed in with all the cracks in the road and overpasses and such - all on the same frequency, but returning the chirp with different timing. Breaking them apart to realize that one of them is a stopped car on the road requires a whole lot of processing, which has to be done in very little time (twenty chirps per second coming through...) Unless the camera NN recognizes it as a car, all three of the above designs will likely hit it.
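The three filtering strategies described above can be caricatured in a few lines (the field names and logic are my own simplification for illustration, not anyone's real code):

```python
# Toy caricature of the three ACC filtering generations described above.
# Field names and logic are illustrative simplifications only.
from dataclasses import dataclass

@dataclass
class Return:
    ground_speed: float     # speed over ground inferred from Doppler; 0 == stationary
    was_tracked: bool       # radar previously locked on while it was moving
    camera_says_car: bool   # vision neural net classified it as a vehicle

def gen1_brakes_for(r: Return) -> bool:
    # First generation: throw out anything that isn't moving.
    return r.ground_speed != 0.0

def gen2_brakes_for(r: Return) -> bool:
    # Second generation: also keep responding to a target it saw moving earlier.
    return r.ground_speed != 0.0 or r.was_tracked

def tesla_brakes_for(r: Return) -> bool:
    # Tesla's extra layer: also trust stationary objects the camera NN calls a car.
    return gen2_brakes_for(r) or r.camera_says_car

# A stopped car the radar never saw move and the camera failed to recognize:
stopped = Return(ground_speed=0.0, was_tracked=False, camera_says_car=False)
# None of the three designs would brake for it, which is the failure mode in this thread.
```

Note how every path back to "brake" for a never-seen-moving object runs through the camera; if the vision net misses the car, all three generations treat it like a road sign.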

One approach Tesla said they were taking that can mitigate that is the radar whitelist. Basically, the cars all make a record of the stationary returns they see as they're driving, with as much information about location and signal intensity as the car can put together. Those get reported to the mothership, and it stitches them into map tiles of what a car should see on a given road in a given lane.

Then the car would download tiles for the area around it, and if the car sees something different than the tile says, it's either a stopped car or the aforementioned aluminum can (which has a nice concave bottom that gives a radar return far beyond its size). If the resolution is high enough, it can serve as a third reference to back up the cameras and GPS for autonomous driving in the future: that return is 5.2 degrees to my left at 200 feet, so by the map I'm six inches into the right lane.
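A toy sketch of that tile-comparison idea (the data shapes and tolerance are my own guesses; Tesla never published the format):

```python
# Toy sketch of comparing live radar returns against a whitelist map tile.
# Data shapes and the matching tolerance are guesses for illustration.
def unexpected_returns(observed, tile, tolerance=1.0):
    """Flag (bearing_deg, range_m) returns that have no whitelist match."""
    def on_map(obs):
        return any(abs(obs[0] - w[0]) <= tolerance and
                   abs(obs[1] - w[1]) <= tolerance
                   for w in tile)
    return {obs for obs in observed if not on_map(obs)}

# Hypothetical tile for this stretch of road: a known sign and a known overpass.
tile = {(5.2, 61.0), (-12.0, 120.0)}
# What the radar sees right now: both known objects plus one extra return.
seen = {(5.2, 61.0), (-12.0, 120.0), (0.1, 45.0)}
flagged = unexpected_returns(seen, tile)
# The extra (0.1, 45.0) return isn't on the map: possible stopped car (or soda can).
```

The same comparison run in reverse (things on the tile the car *doesn't* see) is where the false-positive problem from removed signs would show up.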

This was supposed to be rolling out as part of 8.1 back in the day, but I haven't heard anything about it in a year or two, so I'm not sure what the status is. One challenge is how you deal with false positives caused by changes to the road signs or other nearby objects. I guess the car could pop a red-hands "take over immediately" warning the first couple of times a car sees it, then add it to the map?
 
....You're absolutely correct in highlighting the importance of monitoring what is going on, but that's not exactly how folks are using it. So far we've had someone asleep, another guy watching Harry Potter, and another guy who was passed out drunk...and those were just the ones that made headlines. My concern is that some folks will not use the feature responsibly and that creates a hazard for everyone out there. Be safe out there.


Well, there is no solution to "stupid", is there?

That's like saying a GT3 is "bad/dangerous" because some reckless "stupid" guy drives 180 mph on a 3-lane road, crashes, and dies. As long as the autopilot doesn't suddenly try to make a rapid turn out of nowhere, pointing the car at the center divider (something that basically doesn't make sense at all and is very dangerous), I don't see any issue with the current system if it is used by a driver who pays at least some attention to what the vehicle he is in charge of is doing. I know you are not trying to defend the autopilot haters, but it drives me nuts when people are the main problem and instead blame it on technology they don't understand.

In 3 years I haven't had a close call with my AP1 and I drive most of the time on AP. There were some situations where I had to take over, but I saw them coming seconds before it happened.... and I am even someone who drives hands off.
 
So far we've had someone asleep, another guy watching Harry Potter, and another guy who was passed out drunk....

That Harry Potter part isn’t correct. That comment was made by the truck driver who made the turn in front of the Tesla, causing the accident. No recording of Harry Potter was found in the car. That story was debunked by the NTSB.

The NTSB report is available online. It is worth reading. It’s an eye-opener to see just how much information the car collects as it is driven.
 
I personally can't stand how quickly AP tries to accelerate in that situation you were in. I wish it did a better job of taking into account the multiple slow vehicles in the adjacent lane and distance to the vehicle in front of you, to have a slower acceleration profile.
I agree that there is room for improvement in this area. Generally, I think AP1 adjusts its speed rather abruptly, which is not comfortable for passengers, whether it's accelerating in the above situation or during a lane change, or decelerating when traffic slows down. All could be done more gradually.
 
Well, there is no solution to "stupid", is there?

That's like saying a GT3 is "bad/dangerous" because some reckless "stupid" guy drives 180 mph on a 3-lane road, crashes, and dies. As long as the autopilot doesn't suddenly try to make a rapid turn out of nowhere, pointing the car at the center divider (something that basically doesn't make sense at all and is very dangerous), I don't see any issue with the current system if it is used by a driver who pays at least some attention to what the vehicle he is in charge of is doing. I know you are not trying to defend the autopilot haters, but it drives me nuts when people are the main problem and instead blame it on technology they don't understand.

In 3 years I haven't had a close call with my AP1 and I drive most of the time on AP. There were some situations where I had to take over, but I saw them coming seconds before it happened.... and I am even someone who drives hands off.

I'm glad you're responsible and have taken the time to understand how the system works. I don't think your analogy works, because it implies the user is merely being reckless. How about instead of the GT3, you use the 930 Turbo as an example? It took folks a lot of scary incidents to understand that turbo lag in a RR with a short wheelbase can produce unpredictable oversteer. Understanding the design can help the user operate the car in a safer manner. In the AP analogy, I'm saying the system would be safer if folks knew how the system works and the situations when it doesn't, many of which have been identified here. Forum-folk generally seek more knowledge, but the average AP user probably doesn't know half of the stuff that the folks here know.
 
I think err on the side of caution with AP1 or any autopilot. I replaced my 2013 S with a 2015 S with AP1 and read a lot about the system before actually using it. I think a good rule of thumb is to keep your hands on the wheel at all times. This system is less worry on a freeway, but in rain, corners, secondary roads, snow, and stopped traffic, stay alert and ready to take over. I drive 20K a year and absolutely love the ease of use this gives me on freeways and long trips. The trick is to learn from others' mistakes. Thank you to the original poster on this, CDSmith.
 
I think err on the side of caution with AP1 or any autopilot. I replaced my 2013 S with a 2015 S with AP1 and read a lot about the system before actually using it. I think a good rule of thumb is to keep your hands on the wheel at all times. This system is less worry on a freeway, but in rain, corners, secondary roads, snow, and stopped traffic, stay alert and ready to take over. I drive 20K a year and absolutely love the ease of use this gives me on freeways and long trips. The trick is to learn from others' mistakes. Thank you to the original poster on this, CDSmith.

Mjmiron, how did your purchase and delivery go?
 
I think a good rule of thumb is to keep your hands on the wheel at all times.

The problem is, keeping your hands on the wheel and paying attention doesn't count... you have to be mindful to torque the wheel every 20 seconds for it to register. Part of my daily commute is a 20-mile straight stretch of freeway, and the car is a little too good at keeping a straight path ('riding on rails') for there to be any steering torque applied. After they increased the nags to 20 seconds, I find it's less stressful to simply DRIVE myself with just TACC.
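The behavior described boils down to a timer that only measured torque resets; a toy model (the 20-second interval is from this thread, the rest is my own sketch, not Tesla's implementation):

```python
# Toy model of the steering-torque nag timer described above.
# The 20-second interval is from this thread; the rest is illustrative.
class NagTimer:
    def __init__(self, interval_s=20.0):
        self.interval_s = interval_s
        self.since_torque = 0.0

    def tick(self, dt, torque_detected):
        """Advance the clock; return True if the nag should fire.
        Only measured wheel *torque* resets it; hands merely resting
        (as on a long straight stretch) do not register."""
        if torque_detected:
            self.since_torque = 0.0
        else:
            self.since_torque += dt
        return self.since_torque >= self.interval_s
```

The complaint in the post maps directly onto the model: on a straight road `torque_detected` stays False even with hands on the wheel, so the nag fires regardless of actual attention.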

As long as it works, I'm fine with whatever driver attention mechanism Tesla wants to throw at me... if they want to use hand detection; use a sensor that can actually detect hands... and works to lessen the drive burden vs. increasing it.

AP wasn't designed around steering torque as a driver-attention measure. It is a kludge implemented to throw a bone to NHTSA's investigations of AP-related deaths. I find it particularly offensive when Tesla puts out statements after a fatal accident that their logs show "the driver's hands weren't on the steering wheel for N seconds", knowing that they can't state that with any real accuracy.

They could use the cabin camera in the 3 for eye detection; however, that would be admitting the S/X implementations are flawed and should be retrofitted.
 
Scary, but another reason to be aware of what’s going on with your car. I usually disengage AP when a car moves from in front of me at low speeds because of the jerky acceleration.
My scary moment was passing a highway offramp (one I had passed hundreds of times on AP) when it decided to wildly swerve to take it. It probably would have hit the barrier had I not taken over quickly.