Welcome to Tesla Motors Club

Model X Crash on US-101 (Mountain View, CA)

c) Running commentary. I encourage everyone to try this if they want to be a safer driver. (Make sure you are on your own in the car to start ;) but with practice you sort of think the commentary as you go.)

That's funny - I didn't know that was a driver technique. I do that when I am driving in tight city street situations in the LA area. It's Mad Max country on the side streets down there - I have seen 8 Series BMWs jump sidewalks to "get ahead". My commentary often involves invoking the Force on people, as in "This is not the lane you are looking for" when someone looks like they are about to dart in front of me. But yah, driving next to a row of parked cars with an eagle eye out for people and sudden doors, saying "no doors, no doors, no doors" to myself.

There is a reason my dash cam audio is muted, LOL.
 
That's funny - I didn't know that was a driver technique.

Yep it's taught as part of the IAM driving course. (Advanced driving and riding courses | IAM RoadSmart). It is not mandatory for the pass, but doing so is strongly encouraged.

Here it is being demonstrated in regular day to day driving:

Statistically speaking, drivers who take advanced qualifications are far less likely to be involved in car accidents. (The stats are something like 66% less likely, and those accidents are generally less severe.)

If safety is the main concern, one option is to throw tech at the problem; the other is to address the weakest part of the system, and that's the drivers. Doing that means more training. The two are complementary.

As for expletive based commentary... yep I'm just as guilty as the next person when I get cut up by someone not checking their mirrors ;)
 
a) Not having a set lane-centering position. I.e. the car shouldn't sit consistently in the center (or slightly offset) of the running lane. Rather, it should move to the left or right of the imaginary center line between lane markers/boundaries in order to gain maximum visibility. This is particularly applicable to single-lane roads with curves and "crossable" markings. It is perfectly acceptable/legal to move across into what would ordinarily be the opposite travel direction, when no other cars are coming, in order to increase your line of sight.

Needless to say, AP isn't designed to handle the kinds of roads shown in your videos. And these are going to be especially difficult for autonomous, whenever that comes. Don't see many roads like these in the US. Not sure how often your cross-into-the-wrong-side-of-road strategy would be appropriate here.
 
Needless to say, AP isn't designed to handle the kinds of roads shown in your videos. And these are going to be especially difficult for autonomous, whenever that comes. Don't see many roads like these in the US. Not sure how often your cross-into-the-wrong-side-of-road strategy would be appropriate here.

"Dancing the lane" is more common among MC riders here than car drivers. The idea is to keep oncoming drivers alert so they don't make a left in front of you and then say, "I didn't see you!" It's kind of like driving with your headlights on all the time. Improves your chances.
 

“We believe in transparency, so an agreement that prevents public release of information for over a year is unacceptable.”


I find that quote humorous on so many levels. Quote from a related article:
"Tesla has declined to say how long drivers can now use Autopilot between visual or audible warnings to have a hand on the wheel. It’s also refused to comment on how many alerts can be ignored before the system disengages, what version of Autopilot software was in Huang’s Model X, or when the car was built."
Tesla Criticized for Blaming Autopilot Death on Model X Driver

Perhaps Tesla knew it was about to get removed from the investigation as a participating party by the NTSB, so they withdrew before that could happen.

"You can't fire me, because I quit first!"
 
“We believe in transparency, so an agreement that prevents public release of information for over a year is unacceptable.”
I find that quote humorous on so many levels. Quote from a related article:
"Tesla has declined to say how long drivers can now use Autopilot between visual or audible warnings to have a hand on the wheel. It’s also refused to comment on how many alerts can be ignored before the system disengages, what version of Autopilot software was in Huang’s Model X, or when the car was built."

Why is that funny? Tesla has already said those times should be zero. The driver should always have their hands on the wheel and be paying attention, just like in every other non-FSD car. I'd find it strange for them to give guidelines on how long you're allowed to not do what you're supposed to.
 
Why is that funny? Tesla has already said those times should be zero. The driver should always have their hands on the wheel and be paying attention, just like in every other non-FSD car. I'd find it strange for them to give guidelines on how long you're allowed to not do what you're supposed to.

They say they are withdrawing from the investigation because they don't like how they can't comment on the crash, but then they have refused to publicly divulge a bunch of other info related to the crash when asked. They are talking out of both sides of their mouth.
 
They say they are withdrawing from the investigation because they don't like how they can't comment on the crash, but then they have refused to publicly divulge a bunch of other info related to the crash when asked. They are talking out of both sides of their mouth.
Seems like it varies for the first two questions (based on my reading here), and for the last one (also based on my reading here) he probably had 2018.10.x, as that was when they added wide-lane centering (also based on the videos posted of that area here).
 
They say they are withdrawing from the investigation because they don't like how they can't comment on the crash, but then they have refused to publicly divulge a bunch of other info related to the crash when asked. They are talking out of both sides of their mouth.

I'll grant you that SW version would be interesting to know. However, given the changes that occur version to version, relying on any particular behavior is not something to promote.
The timeout data is like asking the police how fast you can go above the speed limit without getting a ticket. You shouldn't be doing that, so they aren't going to give you an answer.
 
Why is that funny? Tesla has already said those times should be zero. The driver should always have their hands on the wheel and be paying attention, just like in every other non-FSD car. I'd find it strange for them to give guidelines on how long you're allowed to not do what you're supposed to.

I don't know how many times it needs to be pointed out:

Tesla detecting hands not on the wheel DOES NOT MEAN that the hands are not on the wheel.
My hands are always on the wheel when I am using AP2 (and I use it almost all the time), and my Model X gives a spurious warning every 1.5 minutes or so.

Many drivers hold the steering wheel with a light touch. Tesla is probably doing this trick to skirt blame and fool the non-Tesla-owning public.
 
...My hands are always on the wheel when I am using AP2 (and I use it almost all the time), and my Model X gives a spurious warning every 1.5 minutes or so. ...

That means the protocol is not being observed by the driver!

Even though human eyes would register the hands as indeed being on the wheel, the automation system says that is not the correct way, and it does not register them as hands on the wheel!

A light touch on the wheel may not allow the human to detect when it is time to take over the task.

The correct way is: apply constant small torque to the wheel so the human has continuous feedback on the automation status, and it will not issue any more "spurious warnings".
 
I don't know how many times it needs to be pointed out:

Tesla detecting hands not on the wheel DOES NOT MEAN that the hands are not on the wheel.
My hands are always on the wheel when I am using AP2 (and I use it almost all the time), and my Model X gives a spurious warning every 1.5 minutes or so.

Many drivers hold the steering wheel with a light touch. Tesla is probably doing this trick to skirt blame and fool the non-Tesla-owning public.

Well, it didn't need pointing out this time. I'm well aware (and have posted) that not detected != not on wheel. Hands should always be on the wheel (detected or not). The timers are due to the issues in detecting hands, not to allow hands-free driving. I was referring specifically to:
Tesla has declined to say how long drivers can now use Autopilot between visual or audible warnings to have a hand on the wheel. It’s also refused to comment on how many alerts can be ignored before the system disengages,

Hands not on the wheel will cause hands to not be detected. If you say publicly that the car will let you go 60 seconds while hands are not detected, people will turn that into, "I can leave my hands off the wheel for 59 seconds".
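The mechanism being debated here (torque-based hand detection, escalating warnings, and an undisclosed disengagement timeout) can be sketched as a simple state check. This is purely illustrative: every name and threshold below is an assumption, since, as the quoted article notes, Tesla has not published the actual values.

```python
# Hypothetical sketch of a hands-on-wheel "nag" escalation policy.
# All thresholds are assumptions for illustration only -- Tesla has not
# disclosed its real timings or torque threshold.

TORQUE_THRESHOLD_NM = 0.5   # assumed: steering torque below this reads as "hands off"
VISUAL_WARNING_S = 30.0     # assumed delay before the visual warning
AUDIBLE_WARNING_S = 45.0    # assumed delay before the audible warning
DISENGAGE_S = 60.0          # assumed delay before Autopilot disengages

def nag_state(seconds_since_torque: float, torque_nm: float) -> str:
    """Return the warning state given time since torque was last detected."""
    if torque_nm >= TORQUE_THRESHOLD_NM:
        return "hands_detected"      # timer resets; a light grip may never reach this
    if seconds_since_torque >= DISENGAGE_S:
        return "disengage"
    if seconds_since_torque >= AUDIBLE_WARNING_S:
        return "audible_warning"
    if seconds_since_torque >= VISUAL_WARNING_S:
        return "visual_warning"
    return "monitoring"
```

Under this model, a light grip that never crosses the torque threshold keeps the timer running even though hands are physically on the wheel, which is exactly the "spurious warning" complaint raised above.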
 
The family have just hired lawyers and are preparing to file a lawsuit against Tesla for wrongful death. I find it interesting that they are not suing Caltrans, or whoever was supposed to be responsible for fixing the barrier, which would probably be an easier case to win.
 
Seems like it varies for the first two questions (based on my reading here), and for the last one (also based on my reading here) he probably had 2018.10.x, as that was when they added wide-lane centering (also based on the videos posted of that area here).
There are no such things in nature as wide lanes - kind of like a vacuum... Unmarked, multipurpose: yes. Wide: no.
 
...lawsuit against Tesla...

There may be many reasons, but I think it's also due to a lack of understanding of how buying a beta product works.

When I bought Autopilot, it only allowed a maximum speed of 45 MPH on the freeway.

That's very dangerous when my neighborhood freeway, CA-99, has a speed limit of 70 MPH and people drive much faster than that!

Suppose I got rear-ended and run over by a speeding 18-wheel tractor-trailer because I was driving too slowly with an Autopilot designed for a maximum speed of 45 MPH. Should I blame Tesla?

I could ask why Tesla would allow a system to drive so slowly, at 45 MPH, in a 70 MPH zone.

And now, Walter's family is asking the same kind of question: why would Tesla allow a system to drive into the gore point?

They might not understand that it takes many human-hours to write code, to debug, to add features... and the system is not yet complete!