Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Let's look at this video of a Tesla crashing while on Autopilot

AP1 has a problem with stationary object detection. Certainly the barrier fits with that. The radar used to pretty much ignore everything stationary as clutter, a tough technical problem. It's supposed to be getting better, but I guess not yet.
The ultrasonics work out to 12 ft or 18 ft, not enough warning there to do much. That's why they reduce collision severity, and can potentially come to a stop at low speeds, but can't avoid a collision.
Hopefully the camera recognition will improve enough in the future to help with this case.
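
The clutter-rejection behavior described above can be sketched in a few lines. This is a hypothetical illustration, not Tesla's actual radar code: a target whose range-rate exactly cancels the ego vehicle's speed is stationary in the world frame, and a naive filter discards it along with overpasses and signs — which is how a stopped barrier gets ignored.

```python
# Hypothetical sketch of radar clutter rejection (not Tesla's implementation).
# Each target is (range_m, range_rate_mps); range_rate is negative when closing.

def filter_clutter(targets, ego_speed_mps, tolerance_mps=1.0):
    """Keep only targets moving relative to the ground (head-on geometry)."""
    moving = []
    for rng, range_rate in targets:
        # A world-stationary object closes at exactly -ego_speed.
        ground_speed = range_rate + ego_speed_mps
        if abs(ground_speed) > tolerance_mps:
            moving.append((rng, range_rate))
    return moving

# A stopped barrier ahead closes at -ego_speed, so it is filtered out,
# while a slower-moving lead car survives the filter:
targets = [(80.0, -30.0),   # stationary barrier, ego at 30 m/s
           (40.0, -5.0)]    # lead car moving at 25 m/s
print(filter_clutter(targets, ego_speed_mps=30.0))  # → [(40.0, -5.0)]
```

The trade-off is visible directly: loosening `tolerance_mps` to catch barriers would also flood the tracker with every sign and bridge along the road.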
 
AP1 has a problem with stationary object detection. Certainly the barrier fits with that. The radar used to pretty much ignore everything stationary as clutter, a tough technical problem. It's supposed to be getting better, but I guess not yet.
The ultrasonics work out to 12 ft or 18 ft, not enough warning there to do much. That's why they reduce collision severity, and can potentially come to a stop at low speeds, but can't avoid a collision.
Hopefully the camera recognition will improve enough in the future to help with this case.

Radar is almost useless here; I've told many Tesla fans about this, and about the advantage lidar has over it.
Ultrasonics are useless in situations other than parking.

The reason AP1 doesn't see stationary objects/cars is that it doesn't do object recognition.
Neither does AP2's EAP. This is why this scenario will continue to repeat itself.
 
AP1 has a problem with stationary object detection. Certainly the barrier fits with that. The radar used to pretty much ignore everything stationary as clutter, a tough technical problem. It's supposed to be getting better, but I guess not yet.
The ultrasonics work out to 12 ft or 18 ft, not enough warning there to do much. That's why they reduce collision severity, and can potentially come to a stop at low speeds, but can't avoid a collision.
Hopefully the camera recognition will improve enough in the future to help with this case.

AP1 camera recognition will not get any better than it currently is. The camera and its software are Mobileye's proprietary platform, and Tesla has no access to them. Mobileye and Tesla had an ugly divorce, and they no longer cooperate at all.
 
Apparently you don't drive much in Dallas where some of these construction zones and lane alterations seemingly come out of nowhere. It's very easy to see how this could happen and not just on hwy 121 where this incident took place and construction was marked (poorly). Might want to drive a mile on someone else's route before casting stones.
True. Construction is everywhere in Dallas.

The problem is that the individual is supposed to be driving the car. The AP system, like that in Honda, Mercedes, etc, is a lane assist. It's a partnership, likely what makes it safer. It would be interesting what the driver was doing, it appears that they were not paying attention to the road. We need a driver cam to go along with the dash cam.
 
Apparently you don't drive much in Dallas where some of these construction zones and lane alterations seemingly come out of nowhere. It's very easy to see how this could happen and not just on hwy 121 where this incident took place and construction was marked (poorly). Might want to drive a mile on someone else's route before casting stones.

I don't disagree that we cast stones too often.

However, as K5ing posted, there was a construction sign warning that a curve was coming, with a 45 mph speed limit, 17 seconds before the incident. It even told the driver which way the curve would go.

That seems pretty well marked to me.

I would suggest never using AutoSteer in Dallas at all if lane alterations and construction changes truly come out of nowhere with no notification whatsoever.
 
The car should be programmed to see the construction sign and then warn the driver, or make the driver take over. The camera should be able to see colors, and construction signs are all bright orange, IIRC. Should be a relatively simple fix.
 
Tesla Autopilot crash caught on dashcam shows how not to use the system

I mean, we can see that this isn't marked very well at all. The driver with the dashcam nearly crashes too.

But this is the one occasion where I would think Autopilot should handle things quite well? The cameras and radar should clearly detect that the road is narrowing and steer the Tesla to the right to stay in the lane?

Am I mistaken here? The only conflict I can spot is with the road markings, but IMHO a solid object you are about to crash into should still be avoided?

Putting aside the fact that the driver should not have had AP engaged in a construction zone: since AP2's ultrasonic sensors have twice the range, and the cars do slow down based on radar alone (without camera confirmation), perhaps the new AP2 would have handled it better. On the other hand, the ultrasonics and radar may be blind to obstacles at such an angle.

This is an interesting corner case that I would love Tesla to explain how HW2 is supposed to handle in the future (i.e., which sensors the system would rely on).

One thing many people noted is that the car stayed well within the lane after the crash, instead of bouncing off and probably colliding with the car in the other lane. Considering that the airbags deployed, it's rather likely that Autopilot was still steering the car after the crash.
 
The car should be programmed to see the construction sign and then warn the driver, or make the driver take over. The camera should be able to see colors, and construction signs are all bright orange, IIRC. Should be a relatively simple fix.
From what I know, the Mobileye camera in AP1 is a monochrome camera, with the only color channel it captures being red. I believe AP1 only uses sign recognition for speed limits, not to the extent of being able to respond to a construction sign (even if it recognizes one).
 
From what I know, the Mobileye camera in AP1 is a monochrome camera, with the only color channel it captures being red. I believe AP1 only uses sign recognition for speed limits, not to the extent of being able to respond to a construction sign (even if it recognizes one).

You don't need color to recognize road signs. The color of a sign is irrelevant.
I'm glad you replied to this thread, because you're the main person I've told how nearly useless radar is, and how almost any configuration of lidar is better.
 
You don't need color to recognize road signs. The color of a sign is irrelevant.
True, but considering that construction signs come in many forms with various wording, the easiest way to do it is to narrow down to the orange and yellow colors first, and then decide whether it's a sign of interest. Since the AP1 camera can't do that, it would take a lot more processing power to figure out what each road sign says.
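
The color-gating idea described above can be sketched with a plain hue test. This is a hypothetical illustration (the thresholds are assumptions, not anything from Mobileye or Tesla): flag pixels whose hue falls in the orange/yellow band used by US construction signage, so only those regions need the expensive "what does this sign say?" step — exactly the shortcut a monochrome camera cannot take.

```python
import colorsys

# Hypothetical color gate for construction-sign candidates.
# Thresholds are illustrative assumptions, not a real system's values.

def is_construction_color(r, g, b):
    """True if an RGB pixel (0-255 per channel) sits in the orange/yellow band."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    hue_deg = h * 360
    # Orange/yellow hues, reasonably saturated and bright.
    return 15 <= hue_deg <= 65 and s > 0.5 and v > 0.4

print(is_construction_color(255, 140, 0))    # construction orange → True
print(is_construction_color(128, 128, 128))  # gray pavement → False
```

A real pipeline would run this over image regions and pass surviving blobs to a classifier; the point is only that color can cheaply prune candidates.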

LMGTFY
 
Tesla Autopilot crash caught on dashcam shows how not to use the system

I mean we can see that this isn't marked very well whatsoever. The driver with the dashcam is almost crashing too.

But this is the one occasion where I would think that the autopilot should be handling this quite well? The cameras and radar should be clearly detecting that the road is narrowing and steer the tesla to the right to keep in lane?

am i mistaken here? The only thing I can spot that there will be a conflict with the road markings, but imho a solid object which you are a bout to crash into should still be avoided?

IMHO, totally the driver's fault. He didn't regain control of the car in circumstances that required it. In such road conditions one always has to exercise caution, and this guy didn't. Then he blamed AP, which is always the outcome when a driver doesn't want to take any responsibility for his own mistakes. Regrettably, that's the way of the world today.
 
Apparently you don't drive much in Dallas where some of these construction zones and lane alterations seemingly come out of nowhere. It's very easy to see how this could happen and not just on hwy 121 where this incident took place and construction was marked (poorly). Might want to drive a mile on someone else's route before casting stones.
Nonsense!
I know that road well; it has been under reconstruction for a long time. The zones, while changing, are well marked.
 
Nonsense!
I know that road well; it has been under reconstruction for a long time. The zones, while changing, are well marked.
I relayed my experience on a different road. I was paying attention and the "driver assistance" nearly took me into barrels in a poorly marked area. It wasn't a clearly marked construction area in my experience, so I wasn't engaging AP1 contrary to its intended use. I've read the manuals and warnings. Not sure why I would lie about it, and I can't understand how this is a controversial take.

If you take issue with my comments and subsequently question my reading comprehension, then I hope your day/life improves.
 
I don't understand why these videos are interesting or newsworthy. If the driver had been paying attention, wouldn't the accident have been averted? How is this any different from any other driver-fault accident in any other vehicle?

Autopilot or not, pay attention to the road, dummies.

I didn't post this so we can argue about who is at fault and how you should drive through construction zones. The driver is 100% at fault, by default.

I simply want to have a technical debate on the capabilities of AP.
 
I'm really getting perturbed at the constant shift of blame on these Tesla accidents. Every time one of these AP-engaged accidents occurs we go through the same things:
  • AP isn't designed for this or that ...
  • Manual says to do this and the driver wasn't ...
  • Hardware isn't up to that task ...
  • Not meant to be autonomous ...
  • It's AP's fault for luring the driver into a false sense of security ...
  • It's Elon's fault for releasing stuff too soon ...
  • It's the government's fault for not stopping the whole thing ...

Every last Tom, Dick, and Harry is blamed except the driver who crashed his own car into the frakking wall.

Do we blame the cell phones for distracted driving accidents? Or the radio? Do we blame the gun for the accidental shooting? The table saw for the lost hand?

So many people preach personal responsibility above all else in this country yet can't accept that as of right now, no matter what car you're in, you still have to drive.
 
True, but considering that construction signs come in many forms with various wording, the easiest way to do it is to narrow down to the orange and yellow colors first, and then decide whether it's a sign of interest. Since the AP1 camera can't do that, it would take a lot more processing power to figure out what each road sign says.

LMGTFY
"the easiest way.."

You lost me there; that's not how sign recognition using deep neural networks works.
You don't need color, plain and simple, nor does color help, plain and simple.
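
The claim above — that recognition can run on luminance alone — can be illustrated with a minimal sketch. This is hypothetical demonstration code, not anything from a real perception stack: convert RGB to grayscale and score a patch against a (made-up) learned template; the sign's shape and legend carry the signal, not its hue.

```python
# Minimal sketch: sign matching on luminance alone (illustrative, not a real stack).

def to_grayscale(rgb_pixels):
    """ITU-R BT.601 luma conversion, one value per (r, g, b) pixel."""
    return [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in rgb_pixels]

def correlate(patch, template):
    """Plain dot-product similarity between a grayscale patch and a template."""
    return sum(p * t for p, t in zip(patch, template))

# An orange sign and the same sign rendered in gray produce the same
# bright/dark pattern after luma conversion, so matching is color-blind:
orange_sign = [(255, 140, 0), (0, 0, 0), (255, 140, 0)]
gray_sign = [(158, 158, 158), (0, 0, 0), (158, 158, 158)]
template = [1.0, -1.0, 1.0]  # hypothetical "bright-dark-bright" template
print(correlate(to_grayscale(orange_sign), template) > 0)  # → True
print(correlate(to_grayscale(gray_sign), template) > 0)    # → True
```

A deep network does the same thing with learned filters instead of a hand-written template, which is why a monochrome camera is no obstacle to sign recognition as such.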
 
AP1 is not designed to handle this; there is nothing mysterious about it. AP1 follows lines, and here the line leads into the wall.

I'm sometimes critical of Tesla, but IMHO this is clearly outside AP1's promised capabilities.

I had similar experience (posted at link below), absent the K-rail, at a transition from two lanes to one where double yellow lines were ignored.

AP1 driving over double yellow merge lines on 405

Costa Mesa service center told me today that the logs identified that the "deviation of lines was too steep for AP to react," so it just didn't. Apparently the camera, or the lane-marker recognition algorithm, is not looking very far ahead.
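
The "too steep to react" behavior the service center described can be sketched as a simple angle check. Everything here is an assumption for illustration — the threshold, the point format, and the give-up behavior are guesses, not logged internals: if the detected lane line bends away faster than an assumed steering limit allows, the system stops tracking it rather than swerve.

```python
import math

# Hypothetical reconstruction of a "deviation too steep" check.
# MAX_DEVIATION_DEG is an assumed limit; the real threshold is not public.
MAX_DEVIATION_DEG = 6.0

def can_follow(lane_points):
    """lane_points: [(longitudinal_m, lateral_m), ...] ahead of the car.

    Returns False if any segment of the detected lane line deviates
    laterally at more than MAX_DEVIATION_DEG from straight ahead.
    """
    for (x0, y0), (x1, y1) in zip(lane_points, lane_points[1:]):
        angle = math.degrees(math.atan2(y1 - y0, x1 - x0))
        if abs(angle) > MAX_DEVIATION_DEG:
            return False  # too steep: stop tracking instead of swerving
    return True

print(can_follow([(0, 0), (10, 0.5), (20, 1.0)]))  # gentle merge → True
print(can_follow([(0, 0), (10, 3.0)]))             # abrupt lane shift → False
```

Under this (assumed) logic, a gradual taper gets followed while a sudden double-yellow jog simply gets ignored — consistent with the car continuing straight rather than tracking the merge.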
 