Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Model X Crash on US-101 (Mountain View, CA)

Texas seems to do a better job with highway marking. I have seen some worn lines, but at least at one time there were diagonal lines in the gore area. Also, a tall sign right above the barrier is unambiguous from a distance.


Also, ambiguously wide lanes don't exist, because they mark the usage model even for infrequent uses (once a week in the summer at these spots).
Hopefully we all can agree that Caltrans did an unforgivably poor job marking the area in which the crash took place.
 
There absolutely should be a training video that shows you how a Tesla will react when approaching different types of objects at different speeds.

The problem for Tesla is

1. The video would have to be updated often for it to be accurate (as software updates change the results)
2. The video would have to be updated for various hardware (do the S, 3, and X all react the same every time?)
3. It would be used against them by competitors, by trolls using the legal system or the media, by people with buyer's remorse, etc.

Still, given the number of people having the reaction you had, there needs to be better education on what it can and can't do.

That may be useful for testing purposes, or as a warning of what it does not do, but a driver should already know where the car should be and how fast it should be going. If you disagree with AP, override it immediately. Don't wait and expect it to do the right thing at the last minute. Imagine it's your kid/SO/carpool buddy driving.

Dude, you see that barrier? STOP!!!
Or in these parts:
DEER!
 
Probably a dumb question, but is the neural net tied to the vision system in this way: can the system be "trained" (I don't know the right word for this) to identify warning symbols? For example, on at least the non-collapsed attenuator, there is an easily detectable yellow and black pattern. Isn't this the sort of thing that can be identified by a vision system?

Training is exactly the right word. Basically the system is capable of this at present. AK (Andrej Karpathy) has rewritten the garbage produced by Chris Lattner and has, by all accounts, a robust and workable NN in place.

This NN is capable of identifying the kind of signage necessary to avoid this, but it requires being trained with tons of data. Training involves adjusting the weights that connect the NN's inputs to its outputs. Adjusting weights basically consists of punishing or rewarding the NN based on how well it performs (if it is shown a yield sign but says it's a stop sign, then it needs to be reweighted until it consistently gives the right result).
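As a toy illustration of that punish/reward loop, here is a single-layer perceptron learning to separate two made-up "sign" classes. The features and data are entirely invented for the example, and this is a vast simplification of a real vision NN, but the reweighting step (nudge weights whenever the output is wrong) is the same idea:

```python
# Toy "reweighting" demo: a perceptron that learns to separate two
# hypothetical sign classes from invented 2-D features. This is a
# simplification of NN training, not Tesla's actual pipeline.

def train_perceptron(samples, epochs=50, lr=0.1):
    """samples: list of ((x1, x2), label) with label in {0, 1}."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = label - pred          # nonzero only when the NN is wrong
            w[0] += lr * err * x1       # "punish" a wrong answer by
            w[1] += lr * err * x2       # nudging weights toward the
            b += lr * err               # correct output
    return w, b

# Made-up features (redness, octagon-ness); 1 = stop sign, 0 = yield sign
data = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.2, 0.1), 0), ((0.1, 0.3), 0)]
w, b = train_perceptron(data)
classify = lambda x1, x2: 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
print(classify(0.85, 0.9))  # 1 (stop-sign-like input)
print(classify(0.15, 0.2))  # 0 (yield-sign-like input)
```

After a few passes the weights settle on a line separating the two clusters, which is the "consistently gives the right result" endpoint described above.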

Now, this is a very simplistic analysis, since I have a very simplistic understanding of NNs, but there is no doubt the vision system could be (or is being) trained for this kind of identification. Driving policy, though, is what Tesla lacks (what to do after visual identification). I can see that my car knows it is deviating on turns: it will show my car on the IC touching the lane lines, but it fails to turn the wheel sufficiently. I know you are aware of this issue :D

So Tesla needs to really fix that part of their equation. HD maps will also really help ensure the car only stays in drivable paths.
 
No need to be snappy, and that is correct: an autopilot that drives you into a concrete wall/divider/fire truck, or any stationary object for that matter, is unthinkable. I cannot side with Tesla as you do on these instances.
EAP is not FSD, it is smart speed control and lane keeping. It will also drive you through red lights and the wrong way on one-way streets.
 
EAP is not FSD, it is smart speed control and lane keeping. It will also drive you through red lights and the wrong way on one-way streets.

Big difference between running a red light and driving straight into a stationary object without stopping. I own 2 Teslas and I know the differences between EAP and FSD; AP should have the ability to stop before hitting a divider or fire truck.
 
Big difference between running a red light and driving straight into a stationary object without stopping. I own 2 Teslas and I know the differences between EAP and FSD; AP should have the ability to stop before hitting a divider or fire truck.

I agree it's a nice feature to have, but what about the fire truck that t-boned your car (or that you t-boned) because you let it run the red light? Or the time you were rear-ended entering the construction zone because the temporary lane barrier was too close and triggered AEB?

Once the vision side of things is improved, I expect EAP will be better about obstacles. When that happens, drivers will still be responsible for situational awareness.
 
Huh? You don't agree with me because something else is also consistent?


There is nothing in the picture that requires swerving; it's not a curvy, tight path between shoulders or trees or anything else, it's wide open pavement.

Somehow with no eyewitness account of swerving and no statement from Tesla that it swerved you seem to have decided to suggest that it swerved.

"change or cause to change direction abruptly."

Sorry, that is what I disagree with. Until we hear otherwise I'm going to say it followed a straight-line path into the attenuator, whether it followed paint lines, road seams, or just made up its own straight-line path.
 
I agree it's a nice feature to have, but what about the fire truck that t-boned your car (or that you t-boned) because you let it run the red light? Or the time you were rear-ended entering the construction zone because the temporary lane barrier was too close and triggered AEB?

Once the vision side of things is improved, I expect EAP will be better about obstacles. When that happens, drivers will still be responsible for situational awareness.

I agree the driver is responsible; this is just Level 2, thus the driver's fault. Yet I keep coming back to the same question: how come such a system and technology cannot recognize a stationary object and stop the car as needed? Mind-blowing.
 
Somehow with no eyewitness account of swerving and no statement from Tesla that it swerved you seem to have decided to suggest that it swerved.

"change or cause to change direction abruptly."

Sorry, that is what I disagree with. Until we hear otherwise I'm going to say it followed a straight-line path into the attenuator, whether it followed paint lines, road seams, or just made up its own straight-line path.
You are fundamentally misunderstanding the entire point of my posts.

Modal Logic (Stanford Encyclopedia of Philosophy)

Oh, and also, if you have a link to the collected witness statements, I think everyone here would appreciate it. I don't know how or where to get access to the police report. I looked at CHP's site and it doesn't appear that a party without relevant interest can get a copy without appearing in person, or maybe going via FOIA.
 
Well, since Tesla doesn't always mention in the release notes that they've made significant changes to AP, I could see an owner who doesn't follow here not knowing that. My husband didn't know they did that until I told him to be cautious with AP after ANY update. He assumed any AP changes would be documented.

Whether I was on here or not, I have to say that given how often we hear about tech products having issues after an upgrade, I wouldn't be in any rush to "trust" a car's OTA AP update, mentioned or not, until some time and careful use of it were had. It's not totally unheard of for one change to affect something else. And even with Tesla we've seen quick follow-up updates. That's kind of the general nature of the tech business. You'd like to think you've successfully tested in house, but some edge case crops up.
 
I understand how AP works, and I know it's just Level 2, but I was speechless when I figured out, and confirmed here at TMC last year, that AP won't stop for a stationary object.
You would think that with today's technology the system should be able to recognize a stationary object in the vehicle's path/trajectory, closing at a speed equal to the vehicle's own, determine that this will result in a collision, and act accordingly.
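The closing-speed arithmetic here is simple: for a stationary object, closing speed equals the car's own speed, so time-to-collision is just distance over speed. A quick sketch (the 150 m detection range is an invented figure for illustration, not a Tesla or radar spec):

```python
# Back-of-the-envelope time-to-collision (TTC) for a stationary object:
# the object's closing speed equals your own speed, so TTC = d / v.
# The 150 m first-detection range below is assumed, not a real spec.

def time_to_collision(distance_m, speed_mph):
    speed_ms = speed_mph * 0.44704      # mph -> m/s
    return distance_m / speed_ms

# First resolving a barrier at ~150 m while doing 70 mph leaves under
# five seconds to classify the object and brake:
ttc = time_to_collision(150, 70)
print(round(ttc, 1))  # 4.8 (seconds)
```

That short window is part of why reliably reacting to stationary objects is harder than it sounds: the system has to both detect and correctly classify the object with seconds to spare.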

I think we overestimate what stage technology is at with regard to many things. I mean, you would think that with all the years of research we would have a cure for the common cold by now. We put a man on the moon and a Starman in a Roadster orbiting in space. :) The average person today hears so much about self-driving cars but doesn't understand the hurdles still ahead, be it with AP/driver-assist systems or FSD mode. Time expectations are unrealistic for the most part, and it's unclear whether they can ever get there for Level 5. Personally, I don't want a ton of electronic equipment on my car's roof. How much all this will cost the average car owner is probably another issue.
 
Shifting subjects slightly, several posts upthread have criticized setting AP’s following distance to “1”, which is where Tesla’s blog post said the driver in the accident had his set. I also tend to set AP to 1 on the interstate/freeway in moderate to heavy traffic at speeds up to 70 MPH, although I go up to 4 or 5 in light traffic. Here is my argument why 1 is a good idea.

On I-66, the main road west out of DC, a setting of 1 closely imitates the following behavior of most cars in heavier traffic. I suspect this is true in many roads in large metropolitan areas. I have AP copy other drivers’ behavior because I don’t want to present a large gap that would invite drivers, or at least the more aggressive drivers of which there are plenty, to cut in ahead of me, potentially (depending on the cut-in’s speed) causing my car to brake-check the car immediately behind me. Brake-checking is a dangerous thing, so I figure the reduction in safety from this possibility is worse than the reduction from the shorter stopping distance to the car ahead. There are three reasons why.

First, since my AP1 car is blind to the rear, with no rear-facing radar or camera, it cannot assess how dangerous it is to brake hard when another car is tailgating me, as often happens. And when AP is set to a longer following distance than 1 in moderate to heavy traffic I am definitely tailgated more, increasing the hazard. Also, more cars will cut in front, increasing the likelihood that AP will brake, which also increases the hazard.

Second, because AP’s forward view is narrow it will react suddenly once it sees another car cut in front of me. AP also cannot read or interpret the turn signal of a car in the adjoining lane, so it cannot anticipate the cut-in as a human driver would, which means it will leave less space to brake as the cut-in occurs. Once again, the car behind me can be surprised by the suddenness of the reaction. Sometimes I will brake out of AP to allow a smoother cut-in.

Third, when AP is locked on to the car in front (which turns white on the display) it has more accurate distance information than I have, and can react to it more quickly than I can. Reacting to the speed of the car ahead is one thing that AP can do better than a human. AP can also sense and react to the second car ahead by detecting the radar reflection that bounces under the first car. Thus I have confidence in its ability to avoid running into the car immediately ahead when the following distance is set to 1, and believe there is little safety benefit to lengthening the following distance, in contrast to the higher likelihood of being rear-ended at longer following distances.

Again, I use setting 1 because it closely approximates the following distance of most cars in moderate to heavy highway traffic around here, even at high speeds up to the low 70s (and obviously AP will increase its following distance at higher speeds, even at setting 1). When traffic is light I set AP to follow at a distance similar to what the other cars are doing in the lighter traffic.

I think this is the safest policy.

I imagine DC traffic could be as bad as Silicon Valley's, however I think setting the car follow distance to 1 is just a bad accident waiting to happen. I get that people don't want to let others cut in; with too much space you see that happen all the time out here. At highway speeds there is little respect for maintaining a safe distance. All it takes is for one driver to make an unexpected move (debris, another car suddenly merging into their lane, realizing they're in the wrong lane) and people jam on their brakes. If you are the one following at 1, you will likely rear-end the guy in front and be responsible for their damage or injury. It can of course be a worse situation. I don't think many AP Tesla drivers will agree with your decision, and I believe it's contrary to what Tesla suggests.
 
I imagine DC traffic could be as bad as Silicon Valley's, however I think setting the car follow distance to 1 is just a bad accident waiting to happen. I get that people don't want to let others cut in; with too much space you see that happen all the time out here. At highway speeds there is little respect for maintaining a safe distance. All it takes is for one driver to make an unexpected move (debris, another car suddenly merging into their lane, realizing they're in the wrong lane) and people jam on their brakes. If you are the one following at 1, you will likely rear-end the guy in front and be responsible for their damage or injury. It can of course be a worse situation. I don't think many AP Tesla drivers will agree with your decision, and I believe it's contrary to what Tesla suggests.

I respect your point, but "1" isn't one car length: it's much more, depending on speed. Again, it's roughly the following distance that at least the more aggressive half of the drivers around me are using. And to be honest, if I weren't using AP, it would be the following distance I would naturally use in heavier traffic while driving the car manually. And perhaps I'm lucky, but after decades driving in the Boston (bad, aggressive traffic there!) and Washington areas, I've never rear-ended another car, although another car once rear-ended me.
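To put rough numbers on the "it's much more, depending on speed" point: if setting 1 corresponds to a fixed time gap (the 1-second figure below is my assumption for illustration, not a published Tesla number), the physical gap grows with speed:

```python
# If follow setting "1" behaves like a fixed time gap, the distance in
# metres scales with speed. The 1.0 s gap is an assumed value.

def gap_distance_m(speed_mph, time_gap_s=1.0):
    return speed_mph * 0.44704 * time_gap_s   # mph -> m/s, times gap

for mph in (30, 50, 70):
    print(mph, "mph ->", round(gap_distance_m(mph), 1), "m")
# Even a 1-second gap at 70 mph is roughly 31 m -- several car lengths,
# consistent with "1 isn't one car length".
```

Whether that gap leaves enough margin behind a hard-braking car is a separate question, which is really what the disagreement above is about.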
 
Big difference between running a red light and driving straight into a stationary object without stopping. I own 2 Teslas and I know the differences between EAP and FSD; AP should have the ability to stop before hitting a divider or fire truck.

But when Tesla tells you (via the Operating Manual) that AP isn't presently capable of doing certain things, and that's why driver responsibility comes in, why do people insist it should do so now? Yes, that's the goal ultimately, but the technology isn't there yet. You can only hold Tesla to what it says the system is capable of, and right now stationary objects aren't part of its ability. There is still no substitute for driver awareness.
 
I agree the driver is responsible; this is just Level 2, thus the driver's fault. Yet I keep coming back to the same question: how come such a system and technology cannot recognize a stationary object and stop the car as needed? Mind-blowing.

I don't have the tech specs for the radar system, but the major issue is likely that it has only one beam direction/pattern, and so cannot tell where within the beam an object is. Therefore all objects 20 ft away that are in the beam get summed into one return strength, be that the road surface, a reflector, a side barrier, or an obstacle. Because of this, reliable object discrimination is not easily feasible.

If they used a tight beam, and aimed it straight ahead, it would give decent detection until the car was turning (thus pointed at the side barrier) or traveling into a dip (pointed at the ground) or up a hill (pointed at sky).
If they used a mechanically steered antenna, or a phased array, that would allow more lidar-like resolution, at more cost and complexity.
Integration of vision should allow detection in both distance and position.
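A minimal sketch of that summing problem, with invented return strengths and geometry: everything at the same range inside a single wide beam collapses into one number, so the radar alone cannot say which direction the energy came from.

```python
# Why a single wide-beam radar can't localize: all returns at a given
# range inside the beam merge into one strength. Values are invented.

def summed_return(objects, range_m, beam_half_width=1.0):
    """objects: list of (range_m, bearing_rad, strength)."""
    return sum(
        s for (r, bearing, s) in objects
        if abs(r - range_m) < 1.0 and abs(bearing) < beam_half_width
    )

scene = [
    (20.0, -0.4, 0.25),  # roadside reflector, left of centre
    (20.0, 0.0, 0.50),   # attenuator dead ahead
    (20.0, 0.6, 0.25),   # overhead sign gantry, right of centre
]
# The three distinct objects at 20 m collapse into a single return;
# the bearing information is gone.
print(summed_return(scene, 20.0))  # 1.0
```

A steered or phased-array antenna effectively shrinks `beam_half_width` and sweeps it, which is what restores the bearing information, at the cost and complexity mentioned above.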
 
Attenuators are perfectly good, and preferable to "launch-type" rail ends, as they contain an accident to the immediate area, reducing the probability of secondary incidents; but only if they are in their operating condition.

But since Caltrans can't be relied upon to reset the smart cushions in a timely manner, lives would probably be saved by using the launch-type rails, which are more Caltrans-idiot-proof.

Both the earlier Prius driver and the Model X driver would likely be alive if Caltrans-proof launch-type rails had been installed here, avoiding the immediate deceleration caused by hitting a collapsed smart cushion that Caltrans had ignored.
 
I've driven both AP1 and AP2 for a couple of years under many different conditions.

I've never had the experience of the car having fine lines and just swerving rapidly for no reason.

But I have had the experience of lines going away, causing me to take over.

It is much more plausible, in my experience, that somewhere well before the barrier the car lost the poorly maintained left line, and the right edge of the pavement became a line running right to left that pushed the car over into the "lane" leading to the barrier.

Normally the driver, paying attention, would simply correct.

This scenario seems more likely than the car, having fine lines in the area where the barrier is, deciding at the last minute to swing left into it.
 
Yes, I also take every bit of marketing to the bank as if it were gospel, and drive as if marketing hype were true, even though my every experience in the car says AP cannot drive for me! NOT!

I just don’t understand this whining “they said it would...” that somehow absolves people from using their 5 senses and making rational judgments.
I want to elaborate on this post. I’ve got no quarrel with people who feel Tesla oversells AP capabilities. What I have an issue with is people who would keep forcing/trusting the car to do what they think/expect it should do vs. using their experience and senses to temper their use and adapt their use to what it actually can do.