EAP HW2 - my experience thus far... [DJ Harry]

Not really, if they read the manual and follow the MANY warnings about the limitations of the systems in use - systems which in many cases far exceed the competition's, though a few might be a little behind. Those claiming other cars do way better should read the manuals of those vehicles: they contain very similar warnings, because their manufacturers also don't believe they've covered all corner cases!

I have a 2017 Chevrolet Volt which has the following "Driver Confidence" packages

-Side Blind Zone alert - I trust it
-Rear Cross Traffic alert - I trust it
-Forward Collision Alert - NOPE
-Lane Keep Assist - NOPE
-Low-speed Forward Automatic Braking - NOPE
-Following Distance Indicator - Sort of
-Intellibeam automatic high beam assist - NOPE

and will Chevrolet ever ever ever be able to fix the features I don't trust? NOPE

I'm glad it was only a lease.
 
I have AP 1.0... But I also think an acceleration setting would be helpful...

Congrats on being one of the first 1,000 to get to try AP 2.0. Many of the people here probably would have paid an unreasonable sum to be in that position!
AP1 owner here .. at least in FW 7.x, there was a setting for Autopilot. IIRC it was called something like "accelerate for passing." I turned it off to hypermile on a 500-mile trip. Turning it off slowed the rate of acceleration while passing, which is what you seem to be asking for. Is it no longer present with AP2?
 
They were referring to an issue where, when a car in front of you moves out of the lane, the Tesla accelerates too quickly to catch up to the next vehicle or to reach max speed - not necessarily when passing.
 
Correct. And here is the response I got from Tesla, so they are at least acknowledging it as a problem.

For your first item that makes you nervous regarding accelerating too quickly we can certainly submit your feedback to hopefully have that improved in a future update.
 
So just to clarify - if AP1 used TACC on a freeway with no one in front and a car stopped up ahead, it would not stop for that car when it came into range?

On EAP I haven't experienced the scenario yet as the range is supposedly doubled and I have only tried it with traffic.

If Tesla saw that car when it was moving, before it stopped, AP1 can detect it 100% of the time. If the car was already at a complete stop when the Tesla saw it for the first time, AP can detect it in most cases, but not always.
 
AP1 owner here .. at least in FW 7.x, there was a setting for Autopilot. IIRC it was called something like "accelerate for passing." I turned it off to hypermile on a 500-mile trip. Turning it off slowed the rate of acceleration while passing, which is what you seem to be asking for. Is it no longer present with AP2?

The only thing that option does is allow the car to accelerate in the lane behind someone as it starts to move left to pass them when you hit the turn signal - I don't believe it has any effect on the rate of acceleration when your lane is finally empty.
 
That seems like a really dangerous way to "discover" corner cases; especially since most of the situations discovered in this manner will be things that have already been discovered by Tesla (and perhaps discussed in these forums) but just not publicized to owners in a way that they will understand and seek to avoid.

I didn't say this was either the primary way Tesla was identifying potential issues or a desirable outcome - it's just what happens when people become excessively lazy/arrogant/stupid and ignore both the owner's manual warnings and the traffic around them.
 
Sigh. Would you guys please go read the other threads where this is discussed ad nauseam? This challenge isn't unique to Tesla.

When using radar, it is very difficult to distinguish between a stationary car in front of you at a distance and a fixed object that may be on the side of, or above/below, the road. Imagine the road turns right up ahead, and on the left side is a fixed object. To the car, before it understands the road turns, that object could be a stopped car or something on the side of the road. Obviously we don't want the car needlessly slowing down or stopping for something on the side of the road, so AP tries to use the camera for more context. But the radar works farther out than the camera.

The way Tesla AP normally determines that the object ahead is in fact a car is to track it while it's moving. We know the speed at which we're traveling. If the closing rate exactly matches our speed, it's possibly a fixed object. But if the closing rate is ANYTHING else, it's a moving object, and AP will tag it as such. And if it later stops, we still understand that it's a car and that we should slow down.

The net result is if a car in front of you is moving, even just a little - 1mph will do - when it comes into range of your radar, you're good. If the car in front of you is already stationary when it comes into range, AP must use other sensor data for context, and it may make a decision late or a poor one.

This works just fine for highway traffic as well as stop and go traffic during rush hour. Where you get into trouble is more rural areas with hills and turns and infrequent stops where it is more likely for you to come around a corner/hill at speed to stopped cars at a light.
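The closing-rate logic described above can be sketched in a few lines. This is a toy illustration of the idea only - the function names, the noise tolerance, and the numbers are all invented, and the actual tracker is certainly far more sophisticated:

```python
def classify_return(ego_speed_mps, closing_rate_mps, tolerance=0.25):
    """Label a radar return as possibly fixed or moving.

    ego_speed_mps: our own speed; closing_rate_mps: how fast the range to
    the target is shrinking. tolerance absorbs sensor noise (assumed value).
    """
    if abs(closing_rate_mps - ego_speed_mps) <= tolerance:
        return "possibly_fixed"   # closing at exactly our speed -> could be a sign/bridge
    return "moving"               # any other rate -> it has motion of its own


def track(ego_speed_mps, closing_rates):
    """Once a target has ever been seen moving, keep treating it as a
    vehicle even if it later stops - mirroring the 'track it while it's
    moving' rule described above."""
    seen_moving = False
    labels = []
    for rate in closing_rates:
        label = classify_return(ego_speed_mps, rate)
        seen_moving = seen_moving or (label == "moving")
        labels.append("vehicle" if seen_moving else label)
    return labels


# A car ahead creeping slightly (closing 0.5 m/s slower than us), then stopping:
print(track(30.0, [29.5, 30.0, 30.0]))  # ['vehicle', 'vehicle', 'vehicle']
# A car already stopped when first seen: indistinguishable from a fixed object.
print(track(30.0, [30.0, 30.0]))        # ['possibly_fixed', 'possibly_fixed']
```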
Andrew,

I really appreciate your helpful message.

But I always return to the "if humans can do it, so can the computer" conclusion. I don't care what we have that the computers aren't programmed with: we can figure out what we have and program the computers with it, or with something at least as good. For instance, there are lots of tools:
  • 3D via converting time dimension into spatial dimension (depth)
  • 3D via multiple inputs (even across input types: radar + camera)
  • memory (big data (maps, detailed, as well as prior drives)) of where the road is, so that a stationary object in it is known to probably be in it (these are probabilities)
  • Using more cues from all sensor input (plus memory, though humans do this pretty well without memory; it merely helps them) to determine where the road goes. I think perhaps Tesla hasn't fully programmed the concept of where the roads go into the car's analysis. I think they are still growing their AI capability, and eventually the AI will figure this out for itself, but for now Tesla is stuck trying to reproduce old-fashioned Mobileye thinking that tries to pre-determine all the possible outcomes in hand-written code (a rote-memory approach that never works in real life).
  • Brake lights on the stationary object (bright red!)
  • Heat emanating from the stationary objects, indicating they are active yet not going, if infra-red cameras are added (this would help with animal detection)
  • If backscatter x-ray detectors are added, then skeletal features could help with next point. (Helpful with animal detection, such as deer in forests, like I deal with every day.)
  • Use all sensor input to determine if someone is in the object (especially a vehicle). If a person is in a car, probably they're doing something. Could be head, even arm.
I have to go to work, but I thought of at least 6 things that humans do that computers could do even better that could distinguish a stationary object, plus another 2 that with added hardware could do even better. Where there's a will there's a way.

So, you did a good job of describing the current challenges, which is good, but it doesn't fit the customers' understanding: Tesla could solve these, and most likely will, so the wonderment is why they haven't already.
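The "these are probabilities" point in the list above can be illustrated with a toy cue-fusion sketch. Everything here - the cue names, the likelihood ratios, the prior - is invented for illustration; it just shows how several weak cues can combine into a confident "that's a stopped vehicle in my path" call:

```python
import math

def fuse(prior, likelihood_ratios):
    """Fuse independent cues in log-odds space (naive Bayes style).

    prior: P(object is a stopped vehicle in our path) before any cues.
    likelihood_ratios: P(cue | vehicle) / P(cue | not vehicle), one per cue.
    """
    log_odds = math.log(prior / (1 - prior))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)   # independent cues add in log-odds space
    odds = math.exp(log_odds)
    return odds / (1 + odds)

# Hypothetical cues: map says the object sits in the mapped lane (LR 5),
# camera sees bright red brake lights (LR 8), stereo depth agrees (LR 3).
p = fuse(prior=0.05, likelihood_ratios=[5, 8, 3])
print(round(p, 3))  # -> 0.863
```

Each cue on its own is weak, but together they push a 5% prior up past 85% - which is the spirit of combining map memory, brake lights, and depth cues rather than relying on radar alone.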
 

Tesla is actively pursuing several of these things - they announced temporal smoothing with radar point clouds and the whitelist map a few months ago.

Being doable isn't the same as being easy, however. The tasks you describe involve a lot of complicated, safety critical code. I have confidence Tesla will do it well eventually, but I don't find it at all surprising that they haven't managed to yet.
 
@Ulmo My intent was not to suggest that the problem isn't solvable. Many of the solutions you propose I believe Tesla is working on. Direct access to the radar data, as well as having 3 cameras with different focal lengths, I believe helps with most of your suggested list. What I find confusing is two-fold: 1) why people are surprised by the current behavior - AP1 has had this challenge since launch, and it's been out now for 15 months; and 2) why people believe AP2 should have already fixed the issue when it is currently nowhere close to parity with AP1, much less improved.

Sorry if my post came off snarky (is that even a word?)
 
So just to clarify - if AP1 used TACC on a freeway with no one in front and a car stopped up ahead, it would not stop for that car when it came into range?

On EAP I haven't experienced the scenario yet as the range is supposedly doubled and I have only tried it with traffic.

That appears to be correct in most cases. I've been in the situation once and I didn't feel like it was going to stop so I hit the brakes.
 
Now, the EAP experience that I have loved so far:
  1. EAP disengaged when one of my cameras got covered by slush (it rained here). I cleaned all the cameras and it was back up again. I like that it errs on the side of caution if one of the sensors is not at 100%.
Does this not concern anybody? I understand we're a long way off from fully autonomous driving, but what happens when a sensor(s) gets obscured by the elements or a plastic bag or something? How can the car continue to function without user input if any of its eyes are blinded? Build in redundant sensors and *hope* they aren't all obscured?

Maybe we need to bring these back ;)

 
I would think that EAP and FSD capability will respond to these scenarios the same way a human driver does when a bag blows against the windshield blocking his/her view: pull over as safely as possible and deal with it. We already know that the cameras have heating elements and that most of the sensors are located where things should generally not "collect" under most driving conditions. If an obscuration takes place, it appears the system already recognizes it and informs the driver that it can no longer provide service until the item is cleared. Snow or dirt buildup is a real issue even for human drivers (when I lived in Chicago, I can remember times when I had to get out of the car to clear the windshield because the wipers/defroster simply couldn't keep up and ice/snow started to build up on the windshield).

I would expect this system to be no different.
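The "recognize the obscuration and inform the driver" behavior described above could be sketched like the following. The camera names and the all-or-nothing policy are assumptions for illustration, not the actual design:

```python
# Cameras assumed required for the assist feature (invented names).
REQUIRED = {"main_forward", "narrow_forward", "wide_forward"}

def autopilot_available(camera_health):
    """Decide whether the assist feature may stay engaged.

    camera_health: dict mapping camera name -> True if its feed is usable.
    Any missing or degraded required camera forces a disengage, erring on
    the side of caution.
    """
    blocked = sorted(name for name in REQUIRED
                     if not camera_health.get(name, False))
    if blocked:
        return False, f"Disengaging: blocked cameras {blocked}"
    return True, "All required cameras healthy"

ok, msg = autopilot_available(
    {"main_forward": True, "narrow_forward": False, "wide_forward": True})
print(ok, "-", msg)  # False - Disengaging: blocked cameras ['narrow_forward']
```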
 
@Ulmo My intent was not to suggest that the problem isn't solvable. Many of the solutions you propose I believe Tesla is working on. Direct access to the radar data, as well as having 3 cameras with different focal lengths, I believe helps with most of your suggested list. What I find confusing is two-fold: 1) why people are surprised by the current behavior - AP1 has had this challenge since launch, and it's been out now for 15 months; and 2) why people believe AP2 should have already fixed the issue when it is currently nowhere close to parity with AP1, much less improved.

Sorry if my post came off snarky (is that even a word?)

To answer #2 in a tongue in cheek manner...

When Elon Hype tells you that the new 'super computers' can power the Matrix, make Skynet look like a kiddie calculator, and be 40 times more powerful than the previous system; when the cars cost up to $150k; when the website says EAP will be ready in December; plus Tesla hype in general, etc. - the expectations become really high.

Remember that being a part of TMC since 2012, having 5000 posts, owning both an S and an X you know far more than the average joe walking into a Tesla showroom and placing an order. You've seen the evolution from lack of parking sensors to the rollout of AP1, the maturation of AP1 and its limitations.

When you've never owned a Tesla and take delivery of an AP2 car, you are not going to be a happy camper, as it has less automation than my Chevy Volt, comparing Day 1 to Day 1. :)
 
Does this not concern anybody? I understand we're a long way off from fully autonomous driving, but what happens when a sensor(s) gets obscured by the elements or a plastic bag or something? How can the car continue to function without user input if any of its eyes are blinded? Build in redundant sensors and *hope* they aren't all obscured?

Maybe we need to bring these back ;)


The forward-looking camera trio is cleverly placed within the arc of the main windshield wipers, so no extra wipers are needed.

By the time Tesla releases autonomous driving, I'm expecting the car to have a triple redundant understanding of its path - high precision GPS map, radar map tiles, and cameras reading the actual environment and comparing to recent passes through the area by other Teslas. That should give it decent stability in the face of temporary loss of any of the senses.

Situational awareness to the rear sides is somewhat problematic with a failed camera, but the rear three have substantial overlaps, and the ultrasound can possibly fill in the gaps near the car that the failed camera normally sees.
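The triple-redundancy idea above can be sketched as a simple agreement check among independent localization sources. The source names, units, and agreement threshold here are invented for illustration:

```python
def fused_offset(estimates, agreement_m=0.5):
    """Fuse redundant estimates of the car's lateral offset from lane center.

    estimates: dict mapping source name -> offset in meters, or None if
    that source is currently lost (e.g. a blinded camera).
    Returns a fused offset if at least two live sources agree within
    agreement_m; otherwise None, meaning control should go back to the driver.
    """
    live = {s: v for s, v in estimates.items() if v is not None}
    for a in live:
        for b in live:
            if a < b and abs(live[a] - live[b]) <= agreement_m:
                return round((live[a] + live[b]) / 2, 3)
    return None  # no two sources agree: degrade gracefully

# Camera blinded by a plastic bag; GPS map and radar map still agree:
print(fused_offset({"gps_map": 0.1, "radar_map": 0.2, "camera": None}))  # 0.15
```

The design choice here is that losing any single sense is survivable, which is exactly the stability claim in the post above.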
 
If Tesla saw that car when it was moving, before it stopped, AP1 can detect it 100% of the time. If the car was already at a complete stop when the Tesla saw it for the first time, AP can detect it in most cases, but not always.

@Matias @u00mem9 So I had an opportunity to do a little experiment today with EAP. I set my TACC speed to 25 mph getting off a freeway, with no cars in front to track. As I approached a stopped car at a red light, TACC slowed down and stopped. Now, if my speed had been higher, it might have felt like it would not stop, as it starts braking much later than I would have. I have posted this query to Tesla to confirm, as this seems fairly crucial to support - including, for example, a stopped car on the freeway on a blind curve.
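One reason higher approach speeds matter so much in this experiment: stopping distance grows with the square of speed. A quick back-of-the-envelope check (the 3 m/s² comfortable-deceleration figure is an assumption, not a spec):

```python
def stopping_distance_m(speed_mph, decel_mps2=3.0):
    """Distance to brake to a stop from speed_mph at constant deceleration,
    using d = v^2 / (2a)."""
    v = speed_mph * 0.44704  # mph -> m/s
    return v * v / (2 * decel_mps2)

for mph in (25, 45, 65):
    print(f"{mph} mph -> {stopping_distance_m(mph):.1f} m to stop")
```

From 25 mph the car needs only about 21 m, but from 65 mph it needs roughly 6.8 times that distance, so a detection that comes "in range" comfortably early at exit-ramp speed can be far too late at freeway speed.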
 
My background is 28 years in the military and the tech industry. It includes operating and working with some of the most sophisticated automated weapons sensor and control systems on the planet, as well as with DARPA and US National Laboratory programs on the cutting edge of research and technology, in my role as Chief, Concepts and Experimentation of a Defense Agency that funded most of those programs in the United States, plus post-government employment at a tech company doing state-of-the-art cyber work. That said, I am 100% certain that several people on this forum are far more experienced and knowledgeable than I am (Bruce (@bmah) is one of them).

Apologies in advance for an off-topic post.

Um, you flatter me, @drklain! I probably have less experience than you...my background is in computer networking (PhD CS UC Berkeley 1996) and at various times I've been a computer scientist, system/network manager, and software engineer, both public sector and private sector. I don't have any experience in vehicular control systems. I've built / maintained embedded Linux / BSD systems that are in some ways similar to what runs on the Model S/X CID, but in very different application domains. One of the reasons I'm aware of the similarity is that at a prior job, I used to work with a few people who were / are deeply involved at Tesla Motors...unfortunately I don't have any insider knowledge and no desire to jeopardize good relationships trying to get some either.

I try to distinguish between what I know to be true versus what I think is true. This doesn't always succeed, so hopefully I didn't give the appearance of being an authority on Teslas or autonomous vehicles. If so, I apologize for misleading. I defer to people such as wk057, Ingineer, and yes even green1 for information about the nuances of Tesla behavior and innards, even if I don't always agree with their opinions.

(Somewhat on-topic, while driving my AP1 car to work today under manual control, I saw a flatbed truck stopped on the shoulder of a very twisty road exactly where the road curves to the left. It was a little surprising to hear/see the collision alarm sound, but I guess it made sense...if I didn't turn to follow the road, my path would indeed have intersected the middle of the truck.)

Cheers,

Bruce.
 