
Model X Crash on US-101 (Mountain View, CA)

Here's another reason to be extra cautious with the new "wide-lane" support in AP2 on 2018.10.4: Lane correction when the vehicle is literally about to split two lanes at highway speeds is extremely quick. It felt more like an avoidance maneuver—as if it wanted to complete the lane change before the lane divider actually started on the pavement. I had both hands on the steering wheel (as usual) when driving, and it was still surprising how rapidly Autosteer moved into the right lane.

This happened at the lane split on I-280 South headed for the I-880 North/US-17 South exit, just past the Winchester Blvd exit (images are copyright Google, from Google Street View; they were not taken by me, and they are not from today):

View attachment 291377

The Google Street View vehicle is one lane to the left; the lane to its right is the one that splits into two lanes. With 2018.10.4, the wide-lane support kept my Model X centered in the ever-widening "lane" until maybe 30-50 feet (a very rough estimate) before the new lane divider started, at which point Autosteer moved rapidly into the right-hand lane (I don't know why the right lane was chosen over the left lane in this case):

View attachment 291378

I don't have a dash cam, and I don't care to have the publicity that might come with posting a video, but I want folks to be aware of this behavior. I'm considering reporting this to Tesla since it happens so quickly—it's basically the opposite of a smooth lane change.

Just wanted to follow up to note that the behavior of 2018.12 is much improved at this location.

The MX moves to the right-hand lane well before the new center lane marker appears now, and it's much less jarring (if only because it's traveling a shorter distance).
 
My single camera, ThinkWare Dashcam F770 has forward collision warning and lane keep warning.
Of course, I disabled it for my Model S, but for the few days I tried it, it was pretty effective.

I got cut off last night at a merge.
I was driving 55 mph and just had time to slam on my brakes.

I am using a Blackvue 650 dashcam.
I wonder what kind of warning, if any, the ThinkWare F770 would have provided.
And whether I would have had time to react?

In its two recent blog posts about this accident, Tesla claimed that Autopilot can "reduce crash rates by as much as 40%," as reported by the Department of Transportation. Quality Control Systems has sued to obtain the supporting data and methodology for that claim.

This is where AP excels: any active anti-collision system will certainly react faster than a person.

Potentially yes. His hands were off the steering wheel for six seconds.
I doubt they were folded neatly in his lap.
A distraction of some type occurred and held his attention for that six seconds.
iPhone?
Believe it or not, there are fewer than two seconds between the first and last pictures below.

Driving is a serious business.
All the attention Tesla's AP (a Level 2 system) has gotten lately reminds us that AP is fun (and relaxing) to drive with, but at the same time you must be ready to take control.
I hope you and your kid enjoy the car as much as it was designed to be enjoyed.
AP or not, driving without keeping your hands on the steering wheel in a congested area should not be allowed.

Freeway Merge Cut Off 01 .jpg Freeway Merge Cut Off 02 .jpg Freeway Merge Cut Off 03 .jpg Freeway Merge Cut Off 04 .jpg

Since rearview cameras are now mandatory, it would not be too complicated
or costly to also include a front camera.

Having both cameras record the last 5 minutes in a continuous loop
(only 5 minutes, for privacy reasons) would help determine the cause of accidents.
 
Agree & I'll go a step further:

Turning the steering wheel enough to override the autosteer is as simple as turning the wheel as you would if autosteer was not engaged.

It's not some special way to disengage that requires learning - your normal steering will override autosteer if there is a difference between the two. You'll feel a small 'bump' under your hands, but it is not some effort to wrestle it back. It is simply normal steering (which is why I've said at other times that I'm more likely to inadvertently disengage than get a nag).
This is so true. Many times I've been using AP with one hand on the wheel and, because I'm, say, on a curve where the truck next to me is drifting close, I cut out of AP without consciously intending to, simply because I'm reflexively pressuring the wheel to move the car away from the truck. It's very natural even with sport mode steering, at least in AP1.
 
The AP 2.0 Radar can’t discriminate targets. The car is not supposed to hit other cars, so it works.

I believe the 2.5 radar can tell the difference between people & vehicles & tell the vehicle. Then it’s up to Tesla to display people instead of cars.

Pretty sure the radar does see it all. But it takes the software (the neural net) driving the cameras to positively identify the object and process the data and react accordingly. Labeling objects for recognition is a laborious task.

Pretty sure I remember reading somewhere that the AP1 fatal crash was partly due to the hard-coded Mobileye software. It was not trained to identify, label, and react to the side of a semi trailer. The radar and camera did see it; the software just didn't know what to do with the data.
 
Honda uses the increasing distance between tail lights to process closing speed
(frame to frame vector on features known to be on the same object).

For that to work on these barriers with radiuses on the front corners or little background contrast, assembly of pixels into a single object needs to be very fast.

Seems like if you use the pixel vectors from two frames you could do it. With enough pixels... more pixels might be more important than higher frame rate.

I have not figured it out yet.

Slept on it.

Since the cameras can be modeled as if they are on the top of a stick, rolling on a ground plane, and the fact that cars are wider than they are tall, it is easier to use the change in lateral distance between tail lights to determine if you are closing on the car in front of you.

These concrete things are typically taller than they are wide and sometimes rounded like a post to allow cars to deflect. They also have ^-shaped marks in the center, if any marks besides the top edge exist.

This means you need to measure the change in height, or change in vertical spacing between the top tip of the top ^ and the top tip of the bottom ^. As you get closer you will have to account for the changing view angle that will actually make the distance you are measuring get smaller as you get closer.

The change in height from the ground intersection at the base to the top edge should be measurable if the frames are as far apart in time as the two snapshots that chickens and other prey birds take as they move their heads when they walk... 180 ms should be enough. You need a lot of pixels...

The car does not need expensive sensors to avoid hitting barrels, barriers and bridge abutments.
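The taillight-spacing idea above reduces to the classic time-to-contact relation: if the pixel spacing between two features on the same object is w, then tau = w / (dw/dt), with no need to know the real width, the focal length, or the absolute range. A minimal Python sketch of that relation, purely as an illustration of the geometry discussed here, not anything from a production stack:

```python
import math

def time_to_contact(w_prev: float, w_curr: float, dt: float) -> float:
    """Estimate time-to-contact (seconds) from the apparent spacing of
    two features on one object (e.g. a car's tail lights) in two frames
    taken dt seconds apart.

    Under a pinhole-camera model, pixel spacing w is inversely
    proportional to range: w = f * W / Z.  Differentiating gives
    tau = w / (dw/dt), which cancels out the real width W, the focal
    length f, and the absolute range Z.
    """
    dw = w_curr - w_prev
    if dw <= 0:
        return math.inf  # spacing not growing: not closing on the object
    return w_curr * dt / dw
```

For example, tail lights 100 px apart in one frame and 104 px apart 180 ms later give roughly 4.7 s to contact. The same math applies to the vertical spacing of the chevron tips on a barrier, subject to the view-angle caveat noted above.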
 
I got cut off last night at a merge.
I was driving 55 mph and just had time to slam on my brakes.

I am using a Blackvue 650 dashcam.
I wonder what kind of warning, if any, the ThinkWare F770 would have provided.
And whether I would have had time to react?


This is where AP excels: any active anti-collision system will certainly react faster than a person.


Believe it or not, there are fewer than two seconds between the first and last pictures below.


Using AP or not, driving without keeping your hands on the steering wheel in a congested area should not be allowed.

View attachment 292407 View attachment 292408 View attachment 292409 View attachment 292410

Since rearview cameras are now mandatory, it would not be too complicated
or costly to also include a front camera.

Having both cameras record the last 5 minutes in a continuous loop
(only 5 minutes, for privacy reasons) would help determine the cause of accidents.
A very insightful post. Some participants in the forum seem to believe that experience with AP supports hands-off-the-wheel use while AP is engaged. My experience has been just the opposite: hands must remain on the steering wheel when AP is engaged. There is a lag between the time AP takes an action and the time the driver reacts to that action. That lag is increased if the hands are off the wheel, and further increased if the hands are engaged with a phone or other device. Add to that the potential to be distracted by what your hands are doing, such that you fail to recognize that AP has made a poor choice, and you are seconds from disaster.

When I first started using AP I was apprehensive. As I became more familiar with it and gained confidence I found it to be relaxing in that I could let the car deal with the stop-and-go of commuting. But I’ve always kept my hands on the wheel. The few times that AP has made a poor choice while engaged have reinforced to me the importance of doing so.
 
Great discussion.. Phew, 108 pages.. finally got to the end.

My $0.02 on the blame game:
If CALTRANS had replaced the crash absorber on the highway, might the driver have survived?
Possibly.
If CALTRANS had repainted the faded lane lines, could the current software have avoided the accident?
Probably.
If the Tesla software had been more mature, might it have been able to avoid getting into the gore zone?
I would hope so.
But most importantly, if the driver had been paying attention, could this have all been avoided?
Almost certainly YES!

I can't fault Tesla for putting out the 2nd blog post. People need to be reminded to stay vigilant, and it's not like they're
going to admit "yeah, AP drove him into a barrier."

I'm working on an auto steering system similar to AP1 and while these LDW/LKA cameras do a decent job detecting lanes,
they can be fooled. I've seen my system lurch (momentarily) in the direction of diagonal tire skid marks that may be more
prominent than the normal sub-optimal lane markings. Not a big deal if I'm watching the road and can grab the wheel with
short notice... I prefer to have my hands off the wheel but I'm always watching the road ahead and ready to take control.
I've set up my system so moderate user torque on the wheel doesn't disengage it but rather backs it off for 3 seconds, with a small
beep to let you know it's not in control and when it will resume control.
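The override behavior described above, where moderate torque temporarily yields rather than disengaging, amounts to a small state machine. A hypothetical sketch; the thresholds are made-up illustrative values (only the 3-second window comes from the post), not real tuning:

```python
# Moderate driver torque backs autosteer off for a few seconds and then
# resumes; a hard tug disengages entirely.  Torque thresholds below are
# assumed values for illustration only.

DISENGAGE_NM = 3.0   # hard takeover torque: disengage completely
OVERRIDE_NM = 0.8    # moderate torque: yield temporarily
BACKOFF_S = 3.0      # how long to yield after the last override torque

class AutosteerOverride:
    def __init__(self) -> None:
        self.engaged = True
        self.backoff_until = 0.0

    def update(self, driver_torque_nm: float, now: float) -> str:
        """Decide what the steering controller does this control cycle."""
        if not self.engaged:
            return "manual"
        if abs(driver_torque_nm) >= DISENGAGE_NM:
            self.engaged = False            # like a hard tug on the wheel
            return "manual"
        if abs(driver_torque_nm) >= OVERRIDE_NM:
            self.backoff_until = now + BACKOFF_S  # restart the 3 s window
            return "backoff"                # beep once here in a real system
        if now < self.backoff_until:
            return "backoff"                # still yielding to the driver
        return "steer"                      # resume automatic control
```

The design point is that "backoff" keeps the system armed, so a brief correction by the driver doesn't require manually re-engaging, unlike a hard disengage.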

IMG_1833.jpg



As for hands-off-wheel detection, as I mentioned on another thread: fortunately the EPS systems I've worked with have very precise
user torque outputs, and with a bit of software it's not too hard to differentiate between true hands on the wheel and the phantom torque
induced when the auto steering turns the wheel. People shouldn't get nags when their hands are on the wheel!
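One way to do the differentiation described above is to model the torque the autosteer motion itself induces and threshold the residual. A hypothetical sketch; the phantom-torque model (inertia plus friction terms driven by the commanded steering motion) and every constant are assumptions for illustration, not values from any real EPS calibration:

```python
# Separate real driver torque from the "phantom" torque the EPS sensor
# reports when autosteer itself turns the wheel.  All constants assumed.

WHEEL_INERTIA = 0.04   # N*m per (rad/s^2), assumed
WHEEL_FRICTION = 0.12  # N*m per (rad/s), assumed
HANDS_ON_NM = 0.25     # residual torque counted as a real hand on the wheel

def hands_on_wheel(measured_nm: float, steer_rate_rps: float,
                   steer_accel_rps2: float) -> bool:
    """True if measured torque exceeds what autosteer alone would induce."""
    phantom = WHEEL_INERTIA * steer_accel_rps2 + WHEEL_FRICTION * steer_rate_rps
    residual = measured_nm - phantom
    return abs(residual) >= HANDS_ON_NM
```

With a model like this, a hand resting on the wheel registers even while the wheel is being turned by the system, which is exactly the case where naive thresholding produces false nags.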

Finally, I too would like to believe Tesla can achieve FSD with a mostly vision based system. I'm sure that as the NN system
advances and more data from the other cameras is incorporated, the system will get much better. We humans don't just look
at the lane lines but also consider the relative positions of other moving vehicles around us and other fixed landmarks. Not sure
if the system will ever get close enough to 100% such that a person can completely tune out though..
Since that will likely be the case for a while,
PLEASE KEEP YOUR EYES ON THE ROAD!
 
Pretty sure the radar does see it all. But it takes the software (the neural net) driving the cameras to positively identify the object and process the data and react accordingly. Labeling objects for recognition is a laborious task.

Pretty sure I remember reading somewhere that the AP1 fatal crash was partly due to the hard-coded Mobileye software. It was not trained to identify, label, and react to the side of a semi trailer. The radar and camera did see it; the software just didn't know what to do with the data.

I may not have been clear. The AP 2.5 radar (Continental) can do target recognition on its own & report the results to the vehicle. It runs a more sophisticated radar waveform than the AP 2.0 (Bosch) radar. No NN update will be needed other than what’s needed to decide what to do with the data.
 
I may not have been clear. The AP 2.5 radar (Continental) can do target recognition on its own & report the results to the vehicle. It runs a more sophisticated radar waveform than the AP 2.0 (Bosch) radar. No NN update will be needed other than what’s needed to decide what to do with the data.

Assuming that Tesla uses that feature. (Since it sounded like they preferred to run the radar in raw mode and do all of the processing themselves.)
 
First, I don’t know what radar is in the AP 1.0 cars. I know what’s in the 2.0 and 2.5 cars. Second (perhaps dumb) question, does the AP 1 driver display actually show a person instead of a vehicle when a pedestrian is detected?

Yes, it shows a red person in the IC, but it's not 100% reliable at speeds above 25 mph, I think, based on YouTube testing. Kman was the YouTuber I recall revealing it.
 
Great discussion.. Phew, 108 pages.. finally got to the end.

My $0.02 on the blame game:
If CALTRANS had replaced the crash absorber on the highway, might the driver have survived?
Possibly.
If CALTRANS had repainted the faded lane lines, could the current software have avoided the accident?
Probably.
If the Tesla software had been more mature, might it have been able to avoid getting into the gore zone?
I would hope so.
But most importantly, if the driver had been paying attention, could this have all been avoided?
Almost certainly YES!

I can't fault Tesla for putting out the 2nd blog post. People need to be reminded to stay vigilant, and it's not like they're
going to admit "yeah, AP drove him into a barrier."

If Automatic Emergency Braking...
I see you skipped that one.

Pay attention all you want, some accidents are unavoidable, which is why automakers have AEB and Front Collision warnings.
What do you do when those systems fail to perform?
 
Great discussion.. Phew, 108 pages.. finally got to the end.

My $0.02 on the blame game:
If CALTRANS had replaced the crash absorber on the highway, might the driver have survived?
Possibly.
If CALTRANS had repainted the faded lane lines, could the current software have avoided the accident?
Probably.
If the Tesla software had been more mature, might it have been able to avoid getting into the gore zone?
I would hope so.
But most importantly, if the driver had been paying attention, could this have all been avoided?
Almost certainly YES!

I can't fault Tesla for putting out the 2nd blog post. People need to be reminded to stay vigilant, and it's not like they're
going to admit "yeah, AP drove him into a barrier."

I'm working on an auto steering system similar to AP1 and while these LDW/LKA cameras do a decent job detecting lanes,
they can be fooled. I've seen my system lurch (momentarily) in the direction of diagonal tire skid marks that may be more
prominent than the normal sub-optimal lane markings. Not a big deal if I'm watching the road and can grab the wheel with
short notice... I prefer to have my hands off the wheel but I'm always watching the road ahead and ready to take control.
I've set up my system so moderate user torque on the wheel doesn't disengage it but rather backs it off for 3 seconds, with a small
beep to let you know it's not in control and when it will resume control.

View attachment 292505


As for hands-off-wheel detection, as I mentioned on another thread: fortunately the EPS systems I've worked with have very precise
user torque outputs, and with a bit of software it's not too hard to differentiate between true hands on the wheel and the phantom torque
induced when the auto steering turns the wheel. People shouldn't get nags when their hands are on the wheel!

Finally, I too would like to believe Tesla can achieve FSD with a mostly vision based system. I'm sure that as the NN system
advances and more data from the other cameras is incorporated, the system will get much better. We humans don't just look
at the lane lines but also consider the relative positions of other moving vehicles around us and other fixed landmarks. Not sure
if the system will ever get close enough to 100% such that a person can completely tune out though..
Since that will likely be the case for a while,
PLEASE KEEP YOUR EYES ON THE ROAD!
Let's say FSD = AP + Human Monitoring. So human monitoring and intervention is a crucial part of the safe-driving requirement.
Since your work involves this area, here are my two cents: currently, AP is heavily dependent on external man-made references such as lane markings. However, lane marking does not guarantee road safety because it is a static reference (rule-based, or predetermined). To reach the next level of driving safety (a non-conflict environment), the self-driving AI must work with real-time, true spatial topologies. In this sense, each object (moving or stationary) could be represented as a topological space with associated energy vectors. The AI neural net could then reliably compute and predict all the objects within the environment, and the task of monitoring could be transferred to the AI software, within the limits of the vehicle's sensors. In other words, the FSD car could safely operate not only in normal but also in nearly chaotic environments without human intervention. Finally, once FSD becomes reality, driving safety will be less dependent on rule-based factors such as lane markings or tire skid marks. To take this idea to the extreme, with all cars on the road equipped with FSD capability, one can imagine the "lane markings" being dynamically determined at any given moment.
 
Pay attention all you want, some accidents are unavoidable, which is why automakers have AEB and Front Collision warnings.
What do you do when those systems fail to perform?


AEB and FCW are specifically for the cases when the driver is not paying attention (other than the radar look-ahead system, which provides data not available to the driver; even in those cases, though, a greater following distance renders the system redundant). If a driver hits a stopped or stopping object with a moving one, either the driver was going too fast or following too closely (for their level of reaction time).

The unavoidable crashes involve such things as oncoming traffic veering into the driver's lane, or cross traffic running a light. AEB/FCW can't stop other cars from colliding with them. (Although AEB did help in Phoenix when a car with no lights was going the wrong way on a virtually empty freeway at night.)

AEB and FCW are designed to help catch some of the former situations; they are not designed to replace driver attentiveness or safe driving practices. If they fail to detect a situation, it is no worse than the original situation the driver was in.

If a driver wants to rely on AEB/FCW as a replacement for a safe speed, safe following distance, or safe level of attention and it fails to compensate for their choice, that's 100% the driver's decision/ consequence.
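The "too fast or following too closely" point can be made concrete with textbook stopping-distance arithmetic: the gap needed to stop behind a stationary obstacle is reaction distance plus braking distance. A quick sketch; the reaction time and deceleration are generic textbook assumptions, not measured values:

```python
def min_following_gap_m(speed_mps: float, reaction_s: float = 1.5,
                        decel_mps2: float = 7.0) -> float:
    """Gap (meters) needed to stop before a *stationary* obstacle ahead.

    reaction_s and decel_mps2 are typical textbook values: ~1.5 s driver
    reaction time and ~0.7 g braking on dry pavement.
    """
    reaction = speed_mps * reaction_s      # distance covered before braking
    braking = speed_mps ** 2 / (2.0 * decel_mps2)  # v^2 / (2a)
    return reaction + braking
```

At 55 mph (about 24.6 m/s) this comes to roughly 80 m, which is why being cut off at merge speed, as described earlier in the thread, leaves so little margin even for an attentive driver.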
 
This is so true. Many times I've been using AP with one hand on the wheel and, because I'm, say, on a curve where the truck next to me is drifting close, I cut out of AP without consciously intending to, simply because I'm reflexively pressuring the wheel to move the car away from the truck. It's very natural even with sport mode steering, at least in AP1.

This is the exact situation in which the Nissan implementation of Mobileye has it all over Tesla's AP IMHO. With Tesla, AutoPilot is an all-or-nothing deal. Either AP is steering or you are, but never both. In fact, some physical force is required to wrest control of the car away from AutoPilot. And, once you have taken over, AP is disabled permanently until you manually reactivate it.

With Nissan's ProPilot Assist, the steering is perfectly natural and both the driver and ProPilot Assist can be steering at the same time without totally disengaging ProPilot Assist. It's more like a Driver's Ed car with two steering wheels where both can make steering adjustments as necessary to avoid a problem. You can even change lanes manually, and ProPilot Assist will reactivate as soon as it has clear lane markers again.
 
If Automatic Emergency Braking...
I see you skipped that one.

Pay attention all you want, some accidents are unavoidable, which is why automakers have AEB and Front Collision warnings.
What do you do when those systems fail to perform?

Crash I suppose :/

Yeah, good point. Since my project is only concerned with steering, I'm not very focused on AEB, but I agree it's a critical component of a solution. Most of the current systems do OK at relatively low speeds, but the situation here, detecting something approaching at 60+ mph while not reacting to a ton of false positives, is an enormous task.
 
Let's say FSD = AP + Human Monitoring. So human monitoring and intervention is a crucial part of the safe-driving requirement.
Hmmm.. yeah, I suppose it's hard to claim FSD if a human is still required in the loop. The Uber AZ accident is very concerning in this regard. I do agree, though, that when the Tesla system is fully able to track the other objects in its environment, as you say, the EAP experience will be greatly improved.