
Model X Crash on US-101 (Mountain View, CA)

Implied being the keyword.
[Attachment: Screen Shot 2018-04-01 at 3.17.47 PM - overhead view of the gore area and attenuator]
As this picture demonstrates, it is consistent both to have an unobstructed view of the attenuator and not be in line with it. Nothing Tesla said is inconsistent with AP swerving over late into the attenuator.
 
After countless swerves, where AP2 at the last second decides to swerve us onto an available alternate route or onto an off-ramp, has anyone considered the possibility that something like that happened to this driver?

There should not be any swerves.

Roads are designed for comfort using this table,
[Attachment: Screen Shot 2018-04-01 at 2.34.48 PM - AASHTO geometric design table]

from this reference:
https://nacto.org/docs/usdg/geometric_design_highways_and_streets_aashto.pdf

There are other tables for banked corners, but as far as swerving goes, autopilot actions should be tested against this table.
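To make the "no swerves" point concrete: on an unbanked curve or lane change, lateral acceleration is roughly v^2/R, and the AASHTO tables linked above are built around keeping that (together with side friction demand) at levels passengers find comfortable. Here is a minimal sketch of the kind of check an Autopilot trajectory could be tested against; the 1.8 m/s^2 comfort ceiling and the function names are assumptions for illustration, not values taken from the table itself.

def lateral_accel_mps2(speed_mph, radius_m):
    # Lateral acceleration on a flat (unbanked) curve: a = v^2 / R
    v_mps = speed_mph * 0.44704          # mph -> m/s
    return v_mps * v_mps / radius_m

def is_comfortable(speed_mph, radius_m, comfort_limit_mps2=1.8):
    # 1.8 m/s^2 is an assumed passenger-comfort ceiling (illustrative only)
    return lateral_accel_mps2(speed_mph, radius_m) <= comfort_limit_mps2

# Example: a 70 mph path with an effective radius of 300 m
print(lateral_accel_mps2(70, 300))       # ~3.3 m/s^2
print(is_comfortable(70, 300))           # False - well past "comfortable"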
 
Shifting subjects slightly, several posts upthread have criticized setting AP’s following distance to “1”, which is where Tesla’s blog post said the driver in the accident had his set. I also tend to set AP to 1 on the interstate/freeway in moderate to heavy traffic at speeds up to 70 MPH, although I go up to 4 or 5 in light traffic. Here is my argument why 1 is a good idea.

On I-66, the main road west out of DC, a setting of 1 closely imitates the following behavior of most cars in heavier traffic. I suspect this is true on many roads in large metropolitan areas. I have AP copy other drivers' behavior because I don't want to present a large gap that would invite drivers, or at least the more aggressive drivers, of which there are plenty, to cut in ahead of me, potentially (depending on the cut-in's speed) causing my car to brake-check the car immediately behind me. Brake-checking is a dangerous thing, so I figure the reduction in safety from this possibility is worse than the reduction from the shorter stopping distance to the car ahead. There are three reasons why.

First, since my AP1 car is blind to the rear, with no rear-facing radar or camera, it cannot assess how dangerous it is to brake hard when another car is tailgating me, as often happens. And when AP is set to a longer following distance than 1 in moderate to heavy traffic I am definitely tailgated more, increasing the hazard. Also, more cars will cut in front, increasing the likelihood that AP will brake, which also increases the hazard.

Second, because AP’s forward view is narrow it will react suddenly once it sees another car cut in front of me. AP also cannot read or interpret the turn signal of a car in the adjoining lane, so it cannot anticipate the cut-in as a human driver would, which means it will leave less space to brake as the cut-in occurs. Once again, the car behind me can be surprised by the suddenness of the reaction. Sometimes I will brake out of AP to allow a smoother cut-in.

Third, when AP is locked on to the car in front (which turns white on the display) it has more accurate distance information than I have, and can react to it more quickly than I can. Reacting to the speed of the car ahead is one thing that AP can do better than a human. AP can also sense and react to the second car ahead by detecting the radar reflection that bounces under the first car. Thus I have confidence in its ability to avoid running into the car immediately ahead when the following distance is set to 1, and believe there is little safety benefit to lengthening the following distance, in contrast to the higher likelihood of being rear-ended at longer following distances.
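To put the reaction-time point in rough numbers (the 1.5 s human and 0.3 s system figures below are assumptions for illustration, not measured Tesla latencies):

def reaction_distance_m(speed_mph, reaction_time_s):
    # Distance covered before braking even begins
    return speed_mph * 0.44704 * reaction_time_s

print(reaction_distance_m(70, 1.5))   # ~47 m at an assumed human perception-reaction time
print(reaction_distance_m(70, 0.3))   # ~9 m at an assumed automated-system latency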

Again, I use setting 1 because it closely approximates the following distance of most cars in moderate to heavy highway traffic around here, even at high speeds up to the low 70s (and obviously AP will increase its following distance at higher speeds, even at setting 1). When traffic is light I set AP to follow at a distance similar to what the other cars are doing in the lighter traffic.

I think this is the safest policy.
 
I have to disagree to some extent. Remember that radar detects distance as well. When the radar sensor first receives a return from an object that is still far away and takes no action for fear of a false positive, that's fine. HOWEVER, when the radar keeps receiving returns from an object that it KNOWS is getting closer and closer (over the span of many seconds), to the point where the object is literally a few feet in front of the sensor in the last second before the collision, and the braking system still doesn't do anything to slow the car, that's just insanely dumb design.

If the radar ignores all or most of those returns for fear of false positives, emergency automatic braking will basically never kick in in ANY situation involving stationary objects. Tesla should have put warnings and pop-up messages on the screen everywhere saying that automatic braking only works on moving objects and will not work on ANY stationary object, even a huge fire truck or a massive brick wall right in front of you.
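For what it's worth, the logic being asked for here - brake when a persistently closing track gets too close - can be sketched in a few lines. This is purely illustrative: the sample rate, the 1.5 s time-to-collision threshold, and the function name are made up, and it is not how Tesla's AEB is actually implemented.

def should_emergency_brake(range_history_m, dt_s=0.1, ttc_threshold_s=1.5):
    # range_history_m: recent range samples to one tracked object, oldest first
    if len(range_history_m) < 2:
        return False
    elapsed = dt_s * (len(range_history_m) - 1)
    closing_speed = (range_history_m[0] - range_history_m[-1]) / elapsed
    if closing_speed <= 0:
        return False                      # object is not actually closing
    ttc = range_history_m[-1] / closing_speed
    return ttc < ttc_threshold_s          # brake when time-to-collision is short

# An object closing steadily from 20 m to 6.5 m over ~1 s (about 15 m/s):
print(should_emergency_brake([20.0, 18.5, 17.0, 15.5, 14.0, 12.5, 11.0, 9.5, 8.0, 6.5]))  # True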

There absolutely should be a training video that shows you how a Tesla will react when approaching different types of objects at different speeds.

The problem for Tesla is

1. The video would have to be updated often for it to be accurate (as software updates change the results)
2. The video would have to be updated for various hardware (do the S, 3, and X all react the same every time?)
3. It would be used against them by competitors, trolls using the legal system or the media, people with buyer's remorse, et cetera.

Still, given the number of people having the reaction you're having, there needs to be better education on what it can and can't do.
 
*Won't always.

It's been in the AP manual from the first release. You would think that before you put your life in the hands of new technology that has really never been done before in the history of earth, you'd at least read the f****** manual.
Yes, I read it and that's how I figured it out, then I confirmed it here at TMC. My statement still stands as to how this technology cannot figure out such a situation, which should be a no-brainer when implementing Autopilot.
 
After countless swerves, where AP2 at the last second decides to swerve us onto an available alternate route or onto an off-ramp, has anyone considered the possibility that something like that happened to this driver?
Yes, but most Tesla diehards disagree. Yet that was my first thought when I read the news, as it has done that to me before. Either it swerved, or it did not recognize the wall after somehow entering the gore area.
 
Probably a dumb question, but is the neural net tied to the vision system in this way: can the system be "trained" (I don't know the right word for this) to identify warning symbols? For example, on at least the non-collapsed attenuator, there is an easily detectable yellow and black pattern. Isn't this the sort of thing that can be identified by a vision system?
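The yellow-and-black chevron pattern is certainly machine-detectable; a trained classifier is the real answer to the "trained" question, but even a crude color heuristic picks it up. The sketch below is just that - a toy OpenCV check, not anything Tesla's vision stack actually does; the hue thresholds and the 25% figure are assumptions.

import cv2
import numpy as np

def looks_like_hazard_striping(bgr_image, min_yellow_fraction=0.25):
    # Crude stand-in for a trained detector: measure how much of the crop
    # is "warning yellow" (OpenCV hue is 0-179; ~20-35 covers typical yellows).
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    yellow = cv2.inRange(hsv, (20, 100, 100), (35, 255, 255))
    fraction = np.count_nonzero(yellow) / yellow.size
    return fraction >= min_yellow_fraction

# Usage: pass a cropped region around a suspected attenuator or sign
# crop = cv2.imread("attenuator_crop.jpg")
# print(looks_like_hazard_striping(crop))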
 
My recollection from other threads is that Tesla's choice of radar makes it impossible for it to distinguish a large object directly in front of the vehicle and one that is 15 feet up in the air. So... to keep your vehicle from slamming on the brakes every time it passes under a tree or overpass, Tesla radar has been programmed to ignore many stationary objects.
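That filtering is easy to picture: a radar return whose closing speed matches your own road speed is stationary in the world frame, and with no elevation information it could just as easily be an overpass or a sign gantry as a wall. A minimal sketch of that kind of filter follows - illustrative only, with made-up names and tolerance, not Tesla's or the radar vendor's actual code.

def is_stationary_target(ego_speed_mps, range_rate_mps, tol_mps=1.0):
    # Convention: range_rate is negative when the target is closing.
    # A target closing at roughly the ego speed is not moving in the world frame.
    return abs(-range_rate_mps - ego_speed_mps) < tol_mps

def keep_target(ego_speed_mps, range_rate_mps):
    # With no elevation data, stationary returns are ambiguous (wall vs. overpass),
    # so a conservative tracker simply drops them -- the behavior described above.
    return not is_stationary_target(ego_speed_mps, range_rate_mps)

print(keep_target(30.0, -30.2))   # False: stationary object (or overpass) gets dropped
print(keep_target(30.0, -12.0))   # True: a slower-moving car ahead is kept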
 
As this picture demonstrates, it is consistent both to have an unobstructed view of the attenuator and not be in line with it. Nothing Tesla said is inconsistent with AP swerving over late into the attenuator.

If you modify that to say "As this picture demonstrates, it is consistent both to have an unobstructed view of the attenuator and not be in line with it. Nothing Tesla said is inconsistent with AP following the pink path into the attenuator." Then I'd agree with you.

I just wouldn't call that a swerve, since you can draw a straight line from one point to the collision even if it crosses lane lines.


https://teslamotorsclub.com/tmc/attachments/screen-shot-2018-04-01-at-3-17-47-pm-png.291027/
 
Nothing Tesla said is inconsistent with AP swerving over late into the attenuator.

"The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken."

This implies that the sight picture for 5 seconds would prompt action.
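For scale, Tesla's own numbers pin down the approach speed (simple arithmetic, nothing assumed beyond the quoted figures):

speed_mps = 150 / 5               # 150 m in ~5 s -> 30 m/s
speed_mph = speed_mps / 0.44704   # about 67 mph of closing on the barrier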
 
The problem with frequent OTA updates is that AP's behavior constantly changes. What is more alarming is that those changes are not documented in the release notes.

Ah, well, now you may be highlighting an issue with using a neural net in AP. You can judge the actions of the neural net, but it's difficult to discern why the system chose to do what it did, i.e. the latest version might crash less or lane-center better, but even the designers may not know why - so how do you document that?
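One partial answer to "how do you document that?" is behavioral regression testing: run the old and new versions of the policy over the same fixed set of logged scenarios and report where the outputs diverge, even if you can't explain why. A toy harness sketch (hypothetical names and thresholds, not anything Tesla has described):

def behavior_diff(old_policy, new_policy, scenarios, steer_tol_deg=2.0):
    # scenarios: {name: input_frame}; each policy maps a frame to a steering angle.
    changed = []
    for name, frame in scenarios.items():
        delta = abs(new_policy(frame) - old_policy(frame))
        if delta > steer_tol_deg:
            changed.append((name, round(delta, 2)))
    return changed   # candidate entries for the release notes

# Usage with stand-in policies:
old = lambda frame: frame["curvature"] * 100
new = lambda frame: frame["curvature"] * 110
print(behavior_diff(old, new, {"gore_point_101": {"curvature": 0.3}}))  # [('gore_point_101', 3.0)]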
 
Yes, I read it and that's how I figured it out, then I confirmed it here at TMC. My statement still stands as to how this technology cannot figure out such a situation, which should be a no-brainer when implementing Autopilot.

No brainer? uh-huh. You should join their development team.

What a sad state of affairs when I've basically had nothing positive to say about Tesla in 2 years, and yet here I am defending them. It shouldn't crash - no brainer :rolleyes::rolleyes::rolleyes: You understand there's a barrier there in the first place because humans keep crashing into it? 4 billion years of evolution and 100 billion neurons, and it still doesn't work right.
 
Texas seems to do a better job with highway marking. I have seen some worn lines, but at least at one time there were diagonal lines in the gore area. Also, a tall sign right above the barrier is unambiguous from a distance.

[Attachments: IMG_5173.jpg, IMG_5177.jpg - photos of Texas gore-area diagonal markings and the overhead sign above the barrier]


Also, wide unmarked lanes don't exist, because they mark the intended use even for infrequent uses (once a week in the summer at these spots).
[Attachment: IMG_5206.jpg - photo of lane markings]
 
No. The safest car on the road is a Tesla whose driver understands that AP is a car-following and well-marked-lane-keeping tool, understands the design and the design limitations of that tool, and uses it to handle the minor adjustments in steering and acceleration while remaining vigilant (and more so, being relieved of the burden of those minor adjustments) for the need to make other, major driving decisions and take major driving actions.

Hire this man for VP of Common Sense anywhere.
 
If you modify that to say "As this picture demonstrates, it is consistent both to have an unobstructed view of the attenuator and not be in line with it. Nothing Tesla said is inconsistent with AP following the pink path into the attenuator." Then I'd agree with you.
Huh? You don't agree with me because something else is also consistent?

Me: It's possible to win a baseball game 3-1.

You: I disagree, because it is also possible to win 4-1.

Me: Oooooo k.
 
No brainer? uh-huh. You should join their development team.

What a sad state of affairs when I've basically had nothing positive to say about Tesla in 2 years, and yet here I am defending them. It shouldn't crash - no brainer :rolleyes::rolleyes::rolleyes: You understand there's a barrier there in the first place because humans keep crashing into it? 4 billion years of evolution and 100 billion neurons, and it still doesn't work right.

No need to be snappy, and that is correct: an autopilot that drives you into a concrete wall/divider/fire truck, or any stationary object for that matter, is unthinkable. I cannot side with Tesla as you do in these instances.