
3 day old import P85D crashed while using TACC

Fair enough.

But that isn't "Lane Keeping."
Agreed. Currently "Lane Assist" simply gives the driver feedback when the car comes too close to a lane marker and the turn signal isn't on.
"Lane Keeping" would be automatic steering to keep the car in a lane, and that will be part of the upcoming Autopilot release.

That's my understanding, anyway.
 
The camera is used for the lane warning and the headlight dimming. The assertion was that the camera is only used for speed limit signs. It is probably also used alongside the radar for the ACC, though there's no clear source for that. Perhaps the fact that TACC **sometimes does** pick up stopped cars it wasn't previously tracking while they moved is a result of the camera assisting the radar. But as the OP's situation shows, it still isn't perfect. And since even a camera can never tell whether you are going to stop behind the stopped car or drive around it (perhaps following the other car it was recently tracking), it may never be perfect until the car controls the steering as well and thus knows which option you'll take.
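The moving-versus-stationary distinction described above can be sketched in a few lines. This is a toy illustration only (the function name, fields, and threshold are my own assumptions, not Tesla's actual logic): a radar-based system treats any moving return as a valid target, but reacts to a stationary return only if it was locked onto while it was still moving.

```python
# Toy sketch (not Tesla's actual code) of why a radar-based ACC
# ignores stationary objects it never tracked while they were moving.

def should_brake_for(target, tracked_ids):
    """Decide whether ACC reacts to a radar return.

    target: dict with 'id' and absolute 'speed' in m/s (assumed shape)
    tracked_ids: set of object ids the system locked onto while moving
    """
    if target["speed"] > 0.5:          # moving vehicle: always a valid target
        return True
    # Stationary return: react only if we watched it come to a stop.
    # Otherwise it is indistinguishable from signs, bridges, parked cars.
    return target["id"] in tracked_ids
```

A car the system followed down to a stop passes the second check; a car that was already parked when first seen never does, which matches the behavior discussed in this thread.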
 
Agreed. Currently "Lane Assist" simply gives the driver feedback when the car comes too close to a lane marker and the turn signal isn't on.
"Lane Keeping" would be automatic steering to keep the car in a lane, and that will be part of the upcoming Autopilot release.

That's my understanding, anyway.
And, for the record, I've only had Lane Assist for a short period and I can definitely say that if Lane Keeping is "only" as evolved as Lane Assist in detecting the "proper place to be" then the feature isn't ready yet.

Part of the problem is poorly painted (and unpainted) highways/freeways that apparently are more frequent than I would like in my neck of the woods.

I would much prefer if Lane Assist "doesn't have good data" it would tell me rather than "enforcing"/"complaining" about its conclusions with bad data.
 
[Addressing the part in bold above.] Not necessarily. There are ambiguous situations due to the fact that the human driver is still controlling the steering. Consider the case of a fork in the road (or lane split or exit ramp, etc.), which is apparently the situation the OP encountered. Maybe there is a straight ahead branch of the fork with stopped traffic and a slight right branch with no traffic. You have TACC enabled and are following a vehicle that TACC has locked onto. That vehicle bears right at the fork. The car has no way of knowing whether you plan to follow the moving car or take the other branch with the stopped traffic. If it's the former, slamming on the brakes would be a bad idea. If it's the latter, the car may not have enough distance to avoid a collision, even if Automatic Emergency Braking kicks in (which it should at some point when it detects the collision is unavoidable--unless there isn't even enough time for that). There are any number of scenarios where AEB will not be able to avoid a collision (different fork angles, speeds, following distances, road/tire conditions, etc.).
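The "may not have enough distance" point is easy to quantify with basic kinematics. A rough sketch, where the reaction delay and deceleration figures are illustrative assumptions, not measured values for any car:

```python
# Rough stopping-distance estimate: travel during the system's reaction
# time plus braking distance v^2 / (2a). All figures are illustrative.

def stopping_distance(speed_mps, reaction_s=0.3, decel_mps2=8.0):
    """Distance (m) needed to stop from speed_mps, assuming a short
    reaction delay and hard dry-road braking (~8 m/s^2)."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)
```

At highway speed (25 m/s, about 56 mph) this works out to roughly 45-50 m even with hard braking; if the fork reveals a stopped car closer than that, no automatic braking system can avoid the collision, only mitigate it.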

Smarter features incorporating more sensor inputs will be added over time. More/better sensors will be added to get even smarter functionality. Mobileye EyeQ and NVIDIA Drive PX have some impressive demos. The driver will still be responsible for the safe operation of the vehicle and understanding how the driver assistance features work.

I wonder if the car could somehow alert you to such an "ambiguous situation".
I don't want my car whining at me all the time, but it would be nice if the car warned that "hey, I think I'm making the right choice, but I'm not 99% certain."


While I agree that the OP's accident was "the driver's fault", I don't think it's fair to say that the driver's behavior was particularly far beyond what should be expected of drivers. If TACC worked well in all situations for several hours, I can see how a driver would come to expect that it knew what it was doing. With the way Tesla is advertising its "Autopilot" features, I think this is an even more reasonable assumption, even with the caveat that "the autopilot features are progressively enabled over time".
My 2012 Model S doesn't have TACC so I have no experience with it, but until this thread, I was unaware of the car's inability to handle this situation. On the contrary, the promotional videos are all about how well TACC and MobilEye work and how aware they are of everything going on. I fully expect that computers will be better drivers than teenagers by the time my 3 yr old gets a car. I guess we're not there yet -- or at least the hybrid between the human steering and the computer accelerating has some dangerous loopholes.

I feel bad for the OP.

I relate to their excitement over getting a Tesla, even if they don't live in a "supported" country.
I understand how Tesla's "autopilot" propaganda could make someone feel like the car darn well should know enough not to ram a car stopped at a red light.
and I agree that TMC as a community is often unreasonably harsh to some people who are upset at Tesla Motors.

Yes, the human failed in this case. But the computer also failed.
At least with coffee, most people know from experience that coffee is served hot, often above 180°F (it should be brewed at around 200°F).
Many Tesla drivers have no experience with the limitations of TACC and are unaware that it can fail when the car in front turns at a red light.

Good luck to the OP. I hope they continue to be Tesla fans.
 
Yes, the human failed in this case. But the computer also failed.
At least with coffee, most people know from experience that coffee is served hot, often above 180°F (it should be brewed at around 200°F).
Many Tesla drivers have no experience with the limitations of TACC and are unaware that it can fail when the car in front turns at a red light.

Good luck to the OP. I hope they continue to be Tesla fans.

How are you attributing failure to the computer?

An example of what you seem to be saying would be entering an equation into a calculator, but adding a + where you needed a -, and then partially blaming the calculator. Obviously that would be misplaced, because the calculator can only compute based on your input.
 
I don't have a lot of experience with TACC, but from my memory in a loaner, it would be really nice if I knew what the computer was thinking. Maybe another Tesla app for the touchscreen that showed my car in the middle, and what the TACC system is interpreting around it. Show a green highlighted car in front of mine if it thinks it is following a moving object. Show a red highlighted car if it sees a stopped object, and show nothing if it sees nothing.

I understand that I should hit the brakes the second I am uncomfortable, but at least for testing purposes in the early phase, a little understanding of what the car is "seeing" would be wonderfully helpful.
 
How are you attributing failure to the computer?

An example of what you seem to be saying would be entering an equation into a calculator, but adding a + where you needed a -, and then partially blaming the calculator. Obviously that would be misplaced, because the calculator can only compute based on your input.

The computer's job was to recognize the speed of things in front of it. It failed to recognize a stationary car as something in front of it. It failed.

I think the + vs. - is a different scenario. That would be me hitting the accelerator in my 2012 car hoping the car would know that I really meant brakes.

This is more of a case of me typing "throw down the gantlet" on my phone, and it failing to autocorrect my "spelling" mistake. I expect my phone will fix spelling errors. It often won't correct spelling errors when the erroneously spelled word happens to be a valid English word. While the computer is doing what it's been told to do and is working within its limitations, it is a failure to do what I wanted it to do and what I expected it to do.

The operator of the car had an expectation that the computer would identify the car in front of his as something that should not be driven into. It failed to do this.

- - - Updated - - -

I don't have a lot of experience with TACC, but from my memory in a loaner, it would be really nice if I knew what the computer was thinking. Maybe another Tesla app for the touchscreen that showed my car in the middle, and what the TACC system is interpreting around it. Show a green highlighted car in front of mine if it thinks it is following a moving object. Show a red highlighted car if it sees a stopped object, and show nothing if it sees nothing.

I understand that I should hit the brakes the second I am uncomfortable, but at least for testing purposes in the early phase, a little understanding of what the car is "seeing" would be wonderfully helpful.

Agreed.
 
Well, this is where there is a disconnect. This wasn't the computer's job in this situation; it specifically ignores stationary objects. So not only did it not fail, it worked 100% correctly as designed.

And working 100% as designed was a failure.

- - - Updated - - -

Well, this is where there is a disconnect. This wasn't the computer's job in this situation; it specifically ignores stationary objects. So not only did it not fail, it worked 100% correctly as designed.

Incidentally, is it supposed to ignore stationary objects? Most luxury cars have cruise control that works just fine in stop-and-go traffic. Does Tesla's not work in that circumstance?
 
Incidentally, is it supposed to ignore stationary objects? Most luxury cars have cruise control that works just fine in stop-and-go traffic. Does Tesla's not work in that circumstance?
That's the wrong characterization. While a lot of these systems are advertised as working with stop-and-go traffic, they are not designed to detect stationary vehicles, and they will react the same way to a stopped vehicle they did not track, or to a moving vehicle that braked but was lost from tracking because it drifted out of radar range (around a bend, or by not being in the middle of the lane).

Here's Mercedes' DISTRONIC PLUS which is advertised to work in stop-and-go traffic:
The DISTRONIC PLUS regulates only the distance between your vehicle and those directly ahead of it. It may not register stationary objects in the road, e.g.:
- a stopped vehicle in a traffic jam
- a disabled vehicle
- an oncoming vehicle
...
Obstructions and stationary vehicles
DISTRONIC PLUS does not brake for obstacles or stationary vehicles. If, for example, the detected vehicle turns a corner and reveals an obstacle or stationary vehicle, DISTRONIC PLUS will not brake for these.
http://www.m-sedan.com/tips_for_driving_with_distronic_plus-4465.html

Same with BMW's "Active Cruise Control with Stop & Go function":
The system does not decelerate when a stationary obstacle is located in the same lane, e.g., a vehicle at a red traffic light or at the end of traffic congestion
...
Unexpected lane change
If a vehicle ahead of you unexpectedly moves into another lane from behind a stopped vehicle, you yourself must react, as the system does not react to stopped vehicles.
http://www.bavarianmw.com/guide-2677.html

Those are the exact same situations where people complained that Tesla's TACC "failed" where other brands supposedly would not. It all comes from a misunderstanding of what ACC in general is supposed to accomplish. All it does is follow the car immediately in front of you at a set distance (accelerating and moderately braking to maintain that distance). Anything that deviates from or interferes with that is not what the system is designed to handle (like the oft-quoted situation where the car you were following leaves the lane, there's a stopped vehicle in front, and the driver wants to stay in the same lane).
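The "follow at a set distance, accelerating and moderately braking" behavior described above is essentially a feedback controller on the gap to the lead car. A minimal sketch, with assumed gains and limits (no manufacturer's real values):

```python
# Minimal sketch of the core ACC behavior: hold a set gap to the tracked
# lead car, with only moderate acceleration and braking.
# Gains and limits are illustrative assumptions, not real tuning values.

def acc_command(gap_m, rel_speed_mps, set_gap_m=40.0):
    """Return a longitudinal acceleration command in m/s^2.

    gap_m: current distance to the lead vehicle
    rel_speed_mps: lead speed minus ego speed (negative = closing)
    """
    K_GAP, K_SPEED = 0.1, 0.6          # assumed proportional gains
    accel = K_GAP * (gap_m - set_gap_m) + K_SPEED * rel_speed_mps
    # ACC only brakes moderately; emergencies are left to AEB / the driver.
    return max(-3.0, min(2.0, accel))
```

Note the clamp at the end: even when the controller "wants" far harder braking (a stopped car revealed at close range), a system designed this way will only decelerate moderately, which is exactly why the manuals quoted above tell the driver to intervene.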
 
I don't have a lot of experience with TACC, but from my memory in a loaner, it would be really nice if I knew what the computer was thinking. Maybe another Tesla APP for the touchscreen that showed my car in the middle, and what the TACC system is interpreting around it. Show a green highlighted car in front of mine if it thinks it is following a moving object. Show a red highlighted car if it sees a stopped object, and show nothing if it sees nothing.

Nice idea for a gadget but I think it's better for the driver to keep their eyes on the road and simply apply the brakes shortly after they'd expected the TACC system to have done so.

There is a lot of safety margin and predictability in the system, in that TACC, if it brakes at all, will brake early.

I think that's also what went wrong here and almost went wrong in the video posted a few pages back. The driver must have thought: "It'll brake ... It'll brake!!! ... I'm still confident it'll brake!?! ... I'm sure it'll brake?!? ... I still think it'll brake??? ... $€£$#%!!! ... ... ... Darn! guess I should have applied the brakes after all. Hmmm, perhaps I can sue Tesla for not implementing a system that's smarter than I am?".

Bottom line, all it takes is to intervene when the car doesn't slow down or brake when you think it's a good idea to. This is the basic concept behind the "driver assist" features.
 
I spent the last couple days thinking about this, and I have to give this a fair assessment-- there was driver error, but there was also Tesla error. I agree with AR's comment that TACC should recognize a stationary object and react. I also see Elon talking up the autopilot features and setting the context so that the customer base expects the car to see something in front of it and not cruise control itself into it.

I wouldn't have placed any responsibility on Tesla if this were a dumb cruise control system. But the way Elon and other press communications talk up TACC and autopilot, Tesla has a lot of work to do.

I have lowered my trust level in my TACC a lot since this post ... can't help but think about it when I drive.
 
I wouldn't have placed any responsibility on Tesla if this were a dumb cruise control system. But the way Elon and other press communications talk up TACC and autopilot, Tesla has a lot of work to do.

Many of the features that Musk has been "talking up", as you put it, haven't been released yet. Musk has generally been referring to the complete suite of auto-pilot features, not the subset of them that we currently have.

You still should place no responsibility for this on Tesla because, quite simply, Tesla was not in any way responsible for what happened.
 
I spent the last couple days thinking about this, and I have to give this a fair assessment-- there was driver error, but there was also Tesla error. I agree with AR's comment that TACC should recognize a stationary object and react. I also see Elon talking up the autopilot features and setting the context so that the customer base expects the car to see something in front of it and not cruise control itself into it.

How can there be a Tesla error if the manual specifically warns against this situation?

Of course, it would be nice if the car correctly locked onto stationary objects in the path traveled (i.e. not a tree that's dead ahead on a winding road). However, the technology simply isn't quite there yet. It'll take time. Some argue that this makes it too early for manufacturers to release the technology (Tesla is not the only one with the feature), but I personally think it's fine, as it's a "dumb" "assist" feature anyway. Like CC, TACC relies on driver intervention if the traffic situation requires it, for whatever reason.

I wouldn't have placed any responsibility on Tesla if this were a dumb cruise control system. But the way Elon and other press communications talk up TACC and autopilot, Tesla has a lot of work to do.

It *is* a dumb CC system. It's called TACC and is part of a suite of features Tesla markets as "autopilot". You can argue that none of the features actually "pilots" the vehicle, but then CC doesn't really "control" the vehicle does it?

I have lowered my trust level in my TACC a lot since this post ... can't help but think about it when I drive.

My approach from the start was to put a low trust in the assist features and be ready to intervene at all times. Perhaps that explains why I was (and still am) so impressed with the convenience of the system. Regular CC required me to intervene all the time. With today's traffic, actually so often that I found myself to almost have stopped using it. TACC changed that. It really adds convenience in my experience.
 
It would be prudent for TM to disable most or all of this 'assist' stuff for *new* buyers until the first car-on following 3,000 miles. Liability exposure effectively removed.

Then they would present an on-screen interactive tutorial about these features (with quizzes!). If one chose 'I'll deal with this later' to postpone the tutorial then the features would be delayed further.
 
It would be prudent for TM to disable most or all of this 'assist' stuff for *new* buyers until the first car-on following 3,000 miles. Liability exposure effectively removed.

Then they would present an on-screen interactive tutorial about these features (with quizzes!). If one chose 'I'll deal with this later' to postpone the tutorial then the features would be delayed further.

Is that per driver profile? Something like this doesn't make much sense to me, at least. Tesla simply needs to do a better job of explaining things at delivery. Of course, in this case the car was in an unsupported market, so I would guess there was no official delivery.
 
The computer's job was to recognize the speed of things in front of it. It failed to recognize a stationary car as something in front of it. It failed.

TACC recognized the stationary object in front of it. It simply predicted that the driver would drive around it. That is especially true if, as in the blue Prius video, the car being tracked did drive around the parked car. TACC would make the reasonable assumption, true most of the time, that the driver would follow the tracked moving car rather than drive up behind the parked car.

If TACC had previously locked onto that car and saw it go from moving to stationary, and you were following it, it would detect that it had stopped and stop behind it, no problem. If instead it sees two lanes, one with a previously locked car that is still moving and another with a stopped car, it will assume you will follow the moving car rather than aim for the stopped one.
Lesson: don't drive up behind stationary cars that TACC wasn't tracking while they came to a stop.
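That lesson can be restated as a tiny bit of track-history bookkeeping. A toy model, with names and thresholds that are my own assumptions rather than anything from the real implementation:

```python
# Toy model of the behavior described above (assumed, not Tesla's code):
# the system stops for a car it watched come to a stop, but ignores a
# car that was already stationary when first seen.

class TaccTracker:
    def __init__(self):
        self.locked = set()            # ids locked onto while moving

    def observe(self, car_id, speed_mps):
        if speed_mps > 0.5:            # lock on only while the car moves
            self.locked.add(car_id)

    def will_stop_behind(self, car_id):
        # Reacts to a now-stationary car only if it was tracked moving.
        return car_id in self.locked
```

A car followed down to a red light (observations with decreasing speed) ends up in `locked` and gets stopped for; a car first seen parked never enters the set, which is exactly the OP's situation.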