Welcome to Tesla Motors Club

Accident while on EAP...

Where did you quote me from?

For the life of me, I can't remember typing that.

What post number is it?

"<<<Insert Fake thing you did not say.>>>"

Unfortunately, it's easy to make it sound as though people have said something they haven't. The poster should go back and edit the post. I accidentally did this myself at some point today and went back to correct it.
 
Sorry, I was quoting post #78, @mswlogo, but your name got attached to the quote by mistake, and the system doesn't allow me to edit it now.

Switching from NoA to AP has no nag. It just happens when you drive in conditions (bad weather, for example) or locations that aren't supported. All NoA is, is AP with lane suggestions. But it doesn't mean all that much. It will still steer and control speed. And if you get nagged on Autosteer, it will still run ACC. Not sure what you mean by the speed becoming 0.

Since I don't often ignore the nag: if you break out of AP (or NoA) with the steering wheel, is that the same as ignoring the nag?

I think the car was still steering and the driver's foot was touching the throttle (which won't cancel Autosteer). Just a hunch.

We all agree NoA / AP has issues. But this was not such an example.
 
Have a look at the video.


The lane markings appear to be clear. It does not seem to match the case you described (a widening lane, which I encounter daily on my drive home), though it is possible the car got confused.

I'm very interested in your opinion, and those of others in the forum.

Here is the narrative of what we think happened, after speaking in more detail with the driver this afternoon, and reviewing the footage in slow-motion.

- Model X was driving in the center lane and passing an adjacent car positioned in the left lane.

- The adjacent vehicle was starting to drift slightly into the center lane.

- As the Model X was passing this vehicle, it also drifted slightly over the lane marker to the left as if it had an intention to occupy the lane.

- The Model X did not appear to be fully aware of the adjacent vehicle, or its distance.

- At this point, the two vehicles collided - the rear left wheel of the Tesla came into contact with the right front wheel of the adjacent vehicle.

- The driver corrected for a moment after the impact by briefly steering right, then manually took the Tesla into the left lane and stopped.

Tesla gathered all of the data, escalated it internally, made some calls back to the driver, and promised to be in touch within 7-21 days. There was no additional meaningful follow-up from Tesla other than to say that their AP team reviewed the footage, is currently prioritizing other AP-related investigations, and would eventually respond with more detail. There has been weekly communication since then (initiated by the driver) but no answers. I think we are hoping to better understand AP's intention just prior to, and during, the impact.

The driver took immediate responsibility for the event and settled the damage. He did not end up involving insurance and was able to repair the damage to his model X for relatively little cost. Fortunately, no one was injured in the event.

Here are the images post impact:


[Attachments 393749 and 393750: post-impact photos]
How did your friend get Tesla to look at his data and escalate it internally? I've contacted Customer support and all I got back was sorry, glad everyone is OK, reach out to your local service center for repairs... I want them to look at the data and hopefully find out if there is a software issue or not.
 

He presented the issue as a failure of Autopilot and the cause of the accident. He said Tesla took the matter seriously and escalated it immediately to the Autopilot team. He emailed a detailed description, including video and Google Maps data. He followed up with a phone call, and then the Tesla team got engaged. That said, they still have not gotten back to him with a conclusion, nor have they offered to help cover any costs. You might want to try again.
 
Navigation on Autopilot will stop your car at the end of the route if it is not already following a lead car and there's no intervention from the driver:

It would prompt:

"Autopilot navigation complete
Press Accelerator to resume"



This video is insane! I am sort of surprised we did not have any hot Infiniti-on-Tesla action take place. Personally, I would save such a test for 3AM when there is no traffic.
 
I don't think this is true. Human stereopsis is useless beyond 10 meters. Your brain determines distance by looking at many other cues.
https://human-factors.arc.nasa.gov/publications/AIAA.2011.DepthPerceptionCueCntrl.pdf

There is some interesting research being applied in this area, "monocular depth estimation". I've been watching some of it this evening. It's important to realize it's still an estimate. Have a look: Papers With Code : Monocular Depth Estimation
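To see why it's only an estimate, here is a minimal sketch using the pinhole-camera model (all numbers and names below are hypothetical, not from any Tesla system): recovering distance from a single image requires assuming the object's real-world size.

```python
# Why monocular depth is an estimate: with a pinhole-camera model,
#   distance = focal_length_px * real_height_m / image_height_px.
# A single image only gives image_height_px; real_height_m must be
# assumed (e.g. "cars are roughly 1.5 m tall"), so the result is only
# as good as that assumption.

def pinhole_distance(focal_px: float, real_height_m: float, image_height_px: float) -> float:
    """Distance to an object of known real height from its height in pixels."""
    return focal_px * real_height_m / image_height_px

# Hypothetical numbers: 1000 px focal length, assumed 1.5 m car height,
# and the car spans 50 px in the image.
print(pinhole_distance(1000.0, 1.5, 50.0))  # 30.0 (meters)
# If the car is actually 1.8 m tall, the true distance is 36 m -- a 20% error
# from a perfectly plausible size assumption.
```

The neural-net approaches effectively learn these size priors from data, but the underlying ambiguity never fully goes away.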
 
It is very difficult (and therefore inaccurate) to predict distance from a single visual image source, like that from a single rear camera.
This is true; however, we know that in the V9 (?) software, one of the ways they're processing the image data is to run the neural network on 2 successive frames instead of just one. This way you get a rate of change of size/distance from processing a single pair of images. I think this was one of the conclusions drawn by @verygreen. So the system is capable of extracting more than you may initially be giving it credit for.
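A back-of-the-envelope sketch of what a pair of frames buys you (the function names and numbers are illustrative, not Tesla's actual pipeline): if an object's real size is constant, its apparent size scales inversely with distance, so the size change between frames gives a relative distance change and even a time-to-contact estimate.

```python
# Two-frame sketch: apparent size ~ 1/distance for an object of fixed real
# size, so comparing sizes across frames yields the relative distance change,
# and the expansion rate yields time-to-contact (tau = size / (d size / dt)).
# Assumes the object is approaching (size_curr_px > size_prev_px).

def relative_distance(size_prev_px: float, size_curr_px: float) -> float:
    """Ratio d_curr / d_prev from apparent sizes in two successive frames."""
    return size_prev_px / size_curr_px

def time_to_contact_s(size_prev_px: float, size_curr_px: float, dt_s: float) -> float:
    """Seconds until contact if the current closing rate holds."""
    expansion_rate = (size_curr_px - size_prev_px) / dt_s  # pixels per second
    return size_curr_px / expansion_rate

# Hypothetical: a lead car grows from 100 px to 110 px across a 0.1 s frame gap.
print(relative_distance(100.0, 110.0))       # ~0.909 -> distance shrank ~9%
print(time_to_contact_s(100.0, 110.0, 0.1))  # ~1.1 s until contact at this rate
```

Notably, the time-to-contact estimate needs no size assumption at all, which is why a two-frame pipeline can recover useful range-rate information a single frame cannot.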
 
The human brain is an amazing thing :D
I think Elon's argument that FSD can be done with only cameras because humans can do it with a single eye has put Tesla on a very challenging path.
 

It might stop if you reach the end of the route.

But most often it drops out of NoA and transitions to AP. It will pop a very small message, but I would not call it a nag (it does not vibrate the wheel, flash the screen, or make a noise ahead of time). You can easily miss it.
It will give a gentle jingle of some sort after it drops to AP.

It did this to me tonight. 99% of the time it transitions to AP, because NoA rarely ever stops/ends at your destination.

There are cases where AP ends and it says take the wheel, etc. It's usually AP that "ends"; NoA usually transitions to AP long before that.
Sometimes it goes from NoA to nothing.
 
I experienced this before too, but I slammed the brakes myself. My experience is always with white cars, though. I also once experienced a small blue hatchback that was invisible to the side radar and camera. The car almost auto-lane-changed into it.

Most of the time it's the front camera and radar not seeing white cars. Just today a white Model 3 was completely invisible to the radar and camera. I almost rear-ended him when he changed lanes in front of me.
 

I don't think Radar is color sensitive.
 

Maybe it's so reflective that the radar doesn't see it. I don't know. I always see this problem with nice shiny white cars in the morning, with the sun behind me. It's hard to show on camera because the car is doing 1 to 4 mph, but it doesn't try to slow down or stop for white cars. Sometimes it'll even try to speed up. It's very subtle, but I have hit the brake to cancel AP. I don't really want to test it out.
 
TeslaVision is probably so good now they have started ignoring the radar. They don’t even need it any more, it’s like a child’s toy! This is because of the huge leaps forward in the neural net training.

Unfortunately there are some minor edge cases.
I can't tell if you're being sarcastic, but I can tell that Tesla still relies on radar input and will immediately abort Autopilot if it experiences a reduced-radar condition. It will also not allow AP to re-engage without it.
 

I was being sarcastic.

This whole conversation makes me a little sad. In this case it was probably driver override of AP that caused the problem, but AP has other known problems, and I feel like this FSD thing is really far off at this point. While it's fine (essential?) to be working on it, Tesla should be more focused on promoting their cars, because they are awesome just as they are. They're so good you would not want a computer to have all the fun driving them! That seems like a better strategy to me. It's not like they are making tons of money off of FSD, so why hype it so much when it does not actually exist in a workable form, and likely won't for quite some time?
My feeling is that no one will be reading a book nonchalantly in a Model 3 while it drives itself for at least 4-5 years. Well, there may be, but accomplishing this without reduced life expectancy is unlikely in that timeframe.
 

I would not have bought the car if not for FSD. And I know my friend bought it for the same reason. And I didn’t expect it to work for 4-5 years and I know it will have limitations when it does.
 