WARNING: I rear-ended someone today while using Auto Pilot in my brand new P90D!

Yes, the reading issue has been covered multiple times. Electricfan says he will continue to read when he wants to, but at the same time he is upset that Tesla released Autopilot while telling drivers they must be attentive at all times, saying that isn't safe.

Well, I'm really upset that Tesla released Autopilot in such a poor state. I am honestly shocked it's so bad. But maybe I'm crazy. One thing is for sure: time will tell. If AP is truly dangerous, then somebody is going to die in an AP-involved accident eventually, and when that happens the members of this forum don't get to pass judgement. It will be a court, with a judge and jury. Also passing judgement will be the masses, most of whom don't drive Teslas. If fires that didn't hurt anybody were a threat to Tesla, what do you think an AP-related death will do? I sure hope I'm wrong, or that we never find out because Tesla releases 7.2 very quickly and the car becomes truly safe to drive on AP.

But asking people to let AP drive while keeping two hands on the wheel, ready to take over at any moment to catch an AP screw-up, is just stupid in my opinion. Nobody, not even the boy scouts on here who follow directions so religiously, can pay attention 100% of the time. So if you enable AP and try your best to be attentive, but the car makes a suicidal dive just at the moment you get distracted (by one of a million things: a billboard, a car accident, a sunset), anybody can be killed by AP. Tesla should not have released it if they knew it could kill somebody whose attention wanders at just the wrong moment. How is that not common sense? How can anybody think releasing a feature that might malfunction and kill someone is reasonable or socially acceptable? Like calling it "Beta" makes that ok? Really?
 
How can anybody think releasing a feature that might malfunction and kill someone is reasonable or socially acceptable? Like calling it "Beta" makes that ok? Really?

How can any company, dealership, or private citizen sell a product (a motorized vehicle) to teenagers, pregnant women, families, and everyone else of adult age all over the world in the first place, when it can have a mechanical failure that leads to serious injury or death (AND THEY HAVE), and it requires the 'driver' to be 100% attentive at all times? *Gobsmacked!*

If I were you, I'd be all over the worldwide banning of automobiles, motorcycles, 18-wheelers, snowmobiles, jet skis, etc... Then you'd not have to be concerned about remaining 100% attentive with AP features engaged in any of those death machines. I'd suggest horses, except that they are even more dangerous because they have free will. We need to get back to walking everywhere. It's the safest transportation mode to get from point A to point B - well, except for the Cellphone Zombies; they're pretty dangerous to themselves and others because they aren't...wait for it...paying 100% attention when driving their bodies forward. Better ban walking too.
 
Well, I'm really upset that Tesla released Autopilot in such a poor state. I am honestly shocked it's so bad. But maybe I'm crazy.
And I am amazed that you are shocked, "...shocked to find that gambling is going on in here!"* (Gambling with people's lives, apparently.)
Yes, Tesla's V1 beta AP is not yet perfect. I would also point out that human drivers are very far from perfect, and cause hundreds of thousands of deaths and serious injuries worldwide every year and yet they are allowed to drive after passing laughably minimal driving tests (at least in the US).
So now you have concluded that Tesla's AP is dangerous, after repeatedly acknowledging that you have used it on public roads while completely ignoring Tesla's instructions and deliberately ignoring the road around you by reading a book while the car drove itself.
That is what shocks me.

*Obviously that is not a quote from Electricfan, but from the film "Casablanca".
 
I'm wondering if the white/black stripes had something to do with it, and caused AP to get confused about the lanes, as ankitmishra suggested above. You can see in the video the car did drift to the left right before it dove to the right.

Fascinating video. But even if the car got confused about where the lines were - why would it actually appear to dive into the truck? That is just super weird. Wouldn't the radar and ultrasonics have told it there was an object there it was about to hit? EDIT: n/m I read more posts that say the ultrasonics turn off at a certain speed, and other posts that hypothesize that the "brain" is getting overwhelmed with sensory input and thus the radar seems to fail.


The pickup truck has running boards and a stylized groove along the bottom of the doors. Perhaps AutoSteer picked up on those cues and interpreted those as lane markings?

As to why the ultrasonic sensors didn't pick up the truck: I don't believe their range is long enough to be of much use at the closing speed in the video. The point where you took over control is probably at the limit of their useful range.
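To put rough numbers on that intuition, here's a minimal back-of-the-envelope sketch. Both figures (a roughly 5 m useful ultrasonic range and a 20 mph closing speed) are assumptions picked for illustration, not Tesla specs:

```python
# Illustrative only: assumed ultrasonic range and closing speed, not Tesla specs.
ultrasonic_range_m = 5.0        # assumed useful detection range of the sensors
closing_speed_mph = 20.0        # assumed closing speed toward the truck

closing_speed_mps = closing_speed_mph * 0.44704      # mph -> m/s
warning_time_s = ultrasonic_range_m / closing_speed_mps

print(f"Closing speed: {closing_speed_mps:.1f} m/s")
print(f"Time from first detection to contact: {warning_time_s:.2f} s")
# ~0.56 s, which is less than a typical driver reaction time, never mind stopping time.
```

Even with generous assumptions, the sensors would only "see" the truck a fraction of a second before contact, so at highway closing speeds they're really a parking aid, not a collision sensor.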

So the million-dollar question is: would the car have seen the truck in time and braked suddenly to avoid hitting it? Yes, I realize we can't test this possibility. I do wonder if Tesla is secretly banging up test mules on private tracks in scenarios like this.


If these kinds of "corner cases" can only be learned by the neural networks via trial and error over time, then it is certainly plausible that Tesla is building an unassailable lead in the race to autonomous driving, and that they might arrive at the goal several years sooner than anyone else in the industry, because nobody else has nearly the fleet size and Tesla has had a big head start in "school."

I do wonder if this is why they pushed Autopilot out on the bleeding edge: because Musk is taking a calculated risk that nobody will die using Autopilot while the fleet learns its way to 99.9999% error-free behavior, and he knows the only way to get there is to put a big fleet on the road and let it get to learnin', so to speak.
 
So now you have concluded that Tesla's AP is dangerous, after repeatedly acknowledging that you have used it on public roads while completely ignoring Tesla's instructions and deliberately ignoring the road around you by reading a book while the car drove itself.
That is what shocks me.

I don't know what my allegedly inappropriate use of AP has to do with Tesla releasing what they released.

Anyway, I contacted Tesla service this morning and asked them to look at my car. They responded and are pulling the log files. I'll report back if they find anything interesting.

One update: this morning as I was driving in on Beltway 8, the car just decided to drive on the shoulder for a while, all by itself. I'll post the video, although it's not as exciting as the near-miss with the truck.

For the record, since everybody seems to get so heated up on the subject, I no longer read while on AP. The car used to work better; the incidents with the truck and the shoulder-cruising this morning are new, at least for me. But while it's messed up I obviously don't trust it. AP isn't helping me at all right now; it's much more stressful to use it than not.

Which means I wasted a lot of money upgrading my 2013 to a 2015, which might make some people on here happy, based on some of the unfriendly posts I've seen.

 
AP does what it does. It's not perfect, but it will get better. ABS, seat belts, rear-view mirrors, etc. aren't perfect either, but we use these tools within their limits. AP is still beta and it's getting better. I would never go back to a car without AP.

I share the frustration that seemingly obvious errors (getting too close on the right side, not distinguishing lanes from shoulders) ought to have been programmed out, and I suspect they will be very soon. But meanwhile, it is so much better to have help with (but not full delegation of) most micro-adjustments while driving on highways and other appropriate roads.
 
I just took my first long road trip in my 70D. Before that, I'd experimented with AP but not used it for an extended trip. Thus, I don't know if my observation is specific to 7.1 vs. 7.0.

But dang, that thing sure wanted to get up close and personal with trucks as I passed them. Maybe it's because I normally give a wide berth when passing, but it FELT like the car was magnetically attracted to trucks one lane over to the right. I took control a few times. Many other times, I let it go and it was fine (though uncomfortably close). I would much prefer it if AP gave extra space when passing in the left lane.

Even with that and a reminder to hold the wheel every three minutes, AP performed well overall and made the trip a lot more pleasant. I did not mind keeping a hand on the wheel almost all the time, and occasionally giving it a little "tug" to reassure the car I was still there. I guess I basically used it as intended, including lots of lane changes, and it was fine except for cutting it a little close on the right side.
 
I don't know what my allegedly inappropriate use of AP has to do with Tesla releasing what they released.

I've seen a huge change in your posts since this thread started on Jan 13. It's only been three weeks but your posts have become more logical and more willing to consider the other side of the conversation.

On the other hand, I've seen others that are still stuck giving you the same response they gave you weeks ago, when it was pile-on-Electricfan day.

I really don't see any reason to be stuck in the past. You've provided new insights and video evidence and we should just go forward with this new input...
 
I don't know what my allegedly inappropriate use of AP has to do with Tesla releasing what they released.

Anyway, I contacted Tesla service this morning and asked them to look at my car. They responded and are pulling the log files. I'll report back if they find anything interesting.

One update: this morning as I was driving in on Beltway 8, the car just decided to drive on the shoulder for a while, all by itself. I'll post the video, although it's not as exciting as the near-miss with the truck.

For the record, since everybody seems to get so heated up on the subject, I no longer read while on AP. The car used to work better; the incidents with the truck and the shoulder-cruising this morning are new, at least for me. But while it's messed up I obviously don't trust it. AP isn't helping me at all right now; it's much more stressful to use it than not.

Which means I wasted a lot of money upgrading my 2013 to a 2015, which might make some people on here happy, based on some of the unfriendly posts I've seen.


You can hear the two-tone beep again in this one as well (right before the 11-second mark), same as the previous one. So you were on AP and it started veering over the yellow line?

Is the glass in front of the cameras nice and clean???
 
I don't know what my allegedly inappropriate use of AP has to do with Tesla releasing what they released.
I've seen a huge change in your posts since this thread started on Jan 13. It's only been three weeks but your posts have become more logical and more willing to consider the other side of the conversation.
"more logical" -- you state that while quoting him where he clearly still believes reading was "appropriate use of AP" since he states "allegedly inappropriate use of AP". I'm looking forward to using AP but following Tesla VERY clear instructions on how to use it. Hands on wheel and you are responsible for the vehicles driving. Every odd scenario needs to be reported but none from looking at the logs timestamps based on an ambulance report documented time for the non-Tesla driver and their family/friends ... if you know what I mean.
 
I think it is the sound of the car going over some minor imperfection in the road. The car goes over two strips of imperfections beginning at 0:14. At the 0:05 imperfection, the car was recovering from a curve. Do minor speed breakers/bumps cause problems with AP? This is my observation from the video; I can't find any other event.
Also, what is that black space on the other side of the road, the one with the incoming traffic? It is at the same location as the anomaly.
 
I have glitches every single day in my car: the calendar not working, the radio stuck on certain channels, the screen blanking out randomly, the f'ing navigation system taking 30-60 seconds to tell me where the heck I am, the map half full of grid with no streets; I can go on and on. For anyone to think or say, like Tesla service apparently did here, that Autopilot is 'flawless' is complete BS. It is not a good system yet. Don't trust it, but you can use it as long as you keep your hands on the wheel and a foot ready to brake. Funny that other scenarios seem to tell a story that the car will absolutely brake to the point of skidding to avoid a collision... My opinion after reading this is that Autopilot probably failed to do its 'intended' job, which is to monitor everything around it and react for you. But it is still in beta and can't be trusted.

I've been in the HOV lane going along just fine on AP, and when there is an 'exit' from the HOV lane and no more stripe on the right side, the car jerks violently to the right even though the left stripe continues just as it had been... huge problem... don't trust it.
 
I can't answer questions about the shoulder incident because I left the video file at work.

About the truck incident: I noted something new this evening when I was creating a video clip to send to Tesla. (I heard back from service; somebody in CA was copied and they wanted details on the problems, so I sent them the video - not the crappy YouTube version but the actual dashcam footage, which is much better - and if anybody wants it and knows a way I can send it to you, or a place I can upload it for you, please PM me.) By the way, this is a great little video cutter that will let you take a few minutes out of a longer video file, and it's totally free: Free Video Cutter - Free download and software reviews - CNET Download.com

What I noticed tonight solves the puzzle, I believe. Below is a picture of two skid marks that are in the middle of my lane, and appear right before my car dives toward the truck. I think AP "calculated" that the lane was suddenly ten feet to the right based on these skid marks, and that's why it abruptly dove to the right. We'll see if Tesla agrees. I'll post back if they share something with me about it.

Here's the pic of the skid marks that I think caused my near-miss with the truck.

truck_incident.jpg
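Purely as a thought experiment (this is not Tesla's actual vision algorithm), here's a toy sketch of how a strong spurious line feature in the middle of the lane could drag a naive lane-centre estimate toward it. The positions and confidence weights are made up:

```python
# Toy model: lane centre as a confidence-weighted average of detected line positions.
# Positions are lateral offsets in metres from the car; weights are hypothetical
# "line confidence" scores. Not Tesla's algorithm; illustration only.

def lane_center(candidates):
    """Weighted average position of the candidate lane lines."""
    total_weight = sum(w for _, w in candidates)
    return sum(pos * w for pos, w in candidates) / total_weight

# Clean markings: only the real left/right lane lines are detected.
clean = [(-1.8, 1.0), (+1.8, 1.0)]
print(f"Centre with clean markings: {lane_center(clean):+.2f} m")           # +0.00 m

# Skid marks: a dark streak in the lane is picked up as a strong extra "line".
with_skid_mark = [(-1.8, 1.0), (+1.8, 1.0), (+0.5, 2.0)]
print(f"Centre with spurious mark:  {lane_center(with_skid_mark):+.2f} m")  # pulled right
```

In this toy model the estimate shifts about a quarter of a metre to the right from one extra high-contrast feature; a real system weighting things differently could be pulled much harder, which would at least be consistent with the abrupt dive in the video.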
 
I have glitches every single day in my car: the calendar not working, the radio stuck on certain channels, the screen blanking out randomly, the f'ing navigation system taking 30-60 seconds to tell me where the heck I am, the map half full of grid with no streets; I can go on and on. For anyone to think or say, like Tesla service apparently did here, that Autopilot is 'flawless' is complete BS. It is not a good system yet. Don't trust it, but you can use it as long as you keep your hands on the wheel and a foot ready to brake. Funny that other scenarios seem to tell a story that the car will absolutely brake to the point of skidding to avoid a collision... My opinion after reading this is that Autopilot probably failed to do its 'intended' job, which is to monitor everything around it and react for you. But it is still in beta and can't be trusted.

I've been in the HOV lane going along just fine on AP, and when there is an 'exit' from the HOV lane and no more stripe on the right side, the car jerks violently to the right even though the left stripe continues just as it had been... huge problem... don't trust it.

I call BS. I don't know what your car seems to be doing or what you might be doing as the operator, but I drive mine daily and my calendar works, the radio is never stuck, the screens never blank out randomly, and my navigation system calculates every route I've given it to date within seconds and always knows where my car is, with a full map and streets, etc.

The car doesn't jerk violently in any direction; that's absurd. You are free to not trust whatever it is you aren't trusting, but while AP isn't perfect and there are things it really needs to improve upon, such as taking sharp/sharp-ish freeway curves (101 north through San Rafael comes to mind), I find it to be quite good at what it does. Naturally, I don't ask it to do things I know it's not capable of doing, so that may skew my results some...

Jeff

One more thing to note as it pertains to exits on the freeway: in 7.0 the car would drift towards the "open" side of the lane, i.e., where the white line breaks for the exit, but I never had an issue with it trying to take the exit. In 7.1 I have noticed it's much more stable in that scenario and continues straight on.
 
Playing armchair engineer... there are other possibilities. I'm sure the data show that AEB suddenly kicked in at 40 MPH due to the rapidly decreasing distance between you and the car in front. However, there are alternative explanations for why TACC saw that.

Imagine this scenario:

  1. You are Car T moving at 60 MPH. Car B is directly in front of you moving at 60 MPH, and Car C is directly in front of her, slowing down from 60 MPH to come to a stop up ahead.
  2. Car B sees Car C slowing down and moves to the right lane to pass Car C, slowing to 50 MPH, then 40 MPH.
  3. Car T is tracking Car B as she moves right and slows, so Car T slows as well. Car B is still moving faster than Car C.
  4. Car B passes Car C, which is now moving at 20 MPH. Car T, moving at 40 MPH (because it was tracking Car B), switches to tracking Car C.
  5. Car C is moving significantly slower than Car T.
  6. Car T applies AEB as it calculates that Car C is on a collision path.

I'm guessing the data Tesla sees is that the tracked car suddenly slowed, thus AEB was applied. What it probably doesn't show is that Mobileye had just switched which car it was tracking, which I've seen occur quite often when using the Mobileye-based ACC in my BMW. I use ACC almost every day, even though it does something "wrong" at least once a day and I need to correct it. It's still a useful feature that I prefer over regular cruise control.
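To make the late-target-switch idea concrete, here's a minimal kinematic sketch. All speeds and gaps are invented for illustration; nothing here comes from Tesla's logs:

```python
# Rough kinematics for the target-switch scenario described above.
# All numbers are illustrative assumptions, not values from any logs.

MPH_TO_MPS = 0.44704

def required_decel(own_mph, lead_mph, gap_m):
    """Constant deceleration (m/s^2) needed to match the lead car's speed
    before the gap closes, assuming the lead car holds its current speed."""
    v_rel = (own_mph - lead_mph) * MPH_TO_MPS
    return v_rel ** 2 / (2 * gap_m)

# While TACC is tracking Car B: small speed difference, comfortable gap.
print(f"Tracking Car B:    {required_decel(60, 50, 40):.2f} m/s^2")   # ~0.25

# After the late switch to Car C: big speed difference, little gap left.
print(f"Switched to Car C: {required_decel(40, 20, 15):.2f} m/s^2")   # ~2.66
```

Under these made-up numbers the required deceleration jumps by an order of magnitude at the moment of the switch, and since Car C is itself still slowing, the real requirement would be higher still, which is exactly the kind of sudden demand that would look in the logs like "the tracked car suddenly slowed" and trip AEB.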

So have that drink, but mull over the possibility you may not be as wrong as you think.

First, I would like to thank the OP for sharing his experience and provoking a useful discussion.

Second, I would like to add my support for Woof's theory. I am also an engineer, and I also currently have a Mobileye-based ACC (the same as used in the Tesla) in my BMW i3. I had a nearly identical situation to the OP's and had to stand on the brakes to avoid rear-ending the stopped car in front of me. Bottom line: the ACC was tracking the car that passed the slowing car and did not switch to the one right in front of me until it was too late. This is a very repeatable failure mode that I am now very sensitive to. The OP's memory is not shot, and Tesla's engineers are reporting the truth: the TACC was simply tracking the wrong car until it was too late. The bug is very reproducible on my BMW when the highway makes even the slightest bend (a right bend if the car leaving my lane to pass goes to the left, or vice versa).