
AP/FSD related crashes

He said 61-63 MPH in a previous response:

It is hard to imagine someone with the authority to set speed limits deciding that 55 MPH is appropriate for that road.
Agreed. Clear day in the middle of the day would be fine (except you don't want to go airborne from crossing a railroad track), but you have to worry about deer and farm equipment in addition to fog in the mornings and evenings around sunset. 60 would feel too fast for me.
 
I drive that road every day. It is a highly trafficked road most of the time, not meaning bumper to bumper, but there are always cars and trucks on it, and there are semis that regularly travel at or above 60 MPH. That is not a sharp curve at all, it's a slight bend in the road. For miles the road parallels the track, turns to cross it, then turns again to parallel the tracks until you cross them once more entering Hamilton, OH. Map Seven Mile, OH: this incident was just north of there, and the first incident was just south of Seven Mile.
 
@cdotyii thanks for answering all these pesky questions! Much appreciated!

I'm just wondering if you're willing to address this question by @S4WRXTTCS that you might have missed?

I'm confused by why you didn't intervene earlier?

Was it intentional inattention?
Was it unintentional inattention? Essentially unconscious trust in the system.
Was it due to wanting to know whether it would handle the situation?
 
My experience with FSD 11.* and 12.* is that it doesn't fully understand trains yet. It is particularly bad with RR crossings that don't have gates, which is the case with many rural and small-town crossings around here. I've been reporting improperly handled RR crossing / train incidents. I think part of the issue is that it doesn't really understand how they are unique and different from a generic cross street. Rules meant for stoplights and flashing red lights are being applied to them, and that only partially works. A flashing red light means stop, then proceed when it is your turn. With a train's flashing red lights it means wait until the train has passed, but FSD 11 would try to nose its way in to get its turn before the train had fully passed. I haven't had a chance to see if FSD 12.3.* still does that.
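Roughly, the distinction I'm describing looks like this (a made-up Python sketch of the two rules, with hypothetical function and state names; obviously not anything from Tesla's actual code):

# Hypothetical sketch of the two rules being conflated; names are made up.

def flashing_red_intersection(cross_traffic_clear, my_turn):
    # Ordinary flashing red: treat it like a stop sign.
    # Stop, then proceed when it is your turn and the way is clear.
    return "proceed" if (cross_traffic_clear and my_turn) else "hold"

def railroad_crossing(lights_flashing, train_fully_passed, gates_up=True):
    # Railroad flashing reds are not a stop sign: there is no "turn" to take.
    # Wait until the train has fully passed and the signal has cleared.
    # (gates_up defaults to True because many rural crossings have no gates.)
    if lights_flashing or not train_fully_passed or not gates_up:
        return "hold"
    return "proceed"

FSD 11's nosing-in behavior looks like the first rule being applied where the second one belongs.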

I suspect Tesla will have to have some of their test drivers in the rural Midwest stake out trains and time their arrivals at RR crossings to get enough video data to build training sets from. The railroad lines that roughly follow US-30 and US-34 in Iowa are some of the most heavily used in the country. When more coal was being shipped, they would run trains every 20 minutes in both directions, 24 hours a day, 7 days a week, except during track maintenance periods. They are also heavily used for shipping grain to the Mississippi grain terminals. I refer to the empty coal-car trains heading back to the mines as coal can returns.

Railroad Crossings: Rural RR crossings are often built so one can maintain the normal speed for that road type over them, so yes, 55 MPH can often be driven safely over them on paved rural roads. In fact, a sign will warn you if a crossing is bumpy, but not if it is smooth.

On roads that parallel railroad tracks, FSD will sometimes react to the railroad signal lights. Railroad signal lights are very bright as they need to be easily seen over a mile away.

Speed limit signs that are placed well out into the ditch are often ignored by FSD.

Rural route number signs in Minnesota and Wisconsin, which are square, are sometimes read as speed limit signs.

Don't get me started on the bad handling of default rural road and highway speed limits. US-69 in northern Iowa somehow shows a 60 MPH speed limit, even though 55 MPH is the statewide maximum for rural two-lane highways; no Iowa two-lane road has a limit faster than 55 MPH by state law. It turns out the nearest speed limit sign is over in Minnesota, where the road is marked 60 MPH. I also often run into sections of open road that are shown as 25, 30, 35, 40, 45, or 50, which is the speed the road is marked at when it next enters a town. Paved rural roads default to the statewide rural road speed limit.
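The fallback I would expect looks something like this (a hypothetical Python sketch with made-up names; the only value grounded in anything above is Iowa's 55 MPH two-lane maximum):

# Hypothetical fallback logic for rural speed limits (illustration only,
# not Tesla's implementation).

STATE_RURAL_DEFAULT_MPH = {"IA": 55}  # 55 MPH two-lane maximum set by Iowa state law

def effective_limit(last_sign_mph, last_sign_state, current_state, fallback=55):
    # A posted sign should only carry over while you are still in the state
    # that posted it; otherwise fall back to that state's rural default.
    if last_sign_mph is not None and last_sign_state == current_state:
        return last_sign_mph
    return STATE_RURAL_DEFAULT_MPH.get(current_state, fallback)

# The US-69 case: the last sign seen was a 60 MPH sign up in Minnesota, but once
# across the line into Iowa the displayed limit should drop back to 55.
print(effective_limit(60, "MN", "IA"))  # -> 55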

FSD seems to be doing very well in city areas, but not in rural areas and rural towns.

Fog and speed: Rural Midwest drivers are not quite as bad as SF Bay Area commuters about going too fast in fog, but it is close.
 
It's in national news:

Finally! It is such a good video and story (this was the second time this happened to the owner, and for some reason they let it happen???) that I am surprised it took so long.

Seems fairly well verified too. I don’t know about the legitimacy of the Tesla Data but I assume that is legit (easy to verify anyway).
 
Personally if I were being an idiot (which is too frequent anyway), I would not want it on social media even if it got millions of views.
I posted it here for verification, not for publicity: to verify the issue with other people who were discussing Tesla FSD accidents. I never imagined when I did so that someone would take it and post it on X or other sites. I was looking for answers as to how the hell this could happen, and whether there were other reported instances where FSD did not recognize a train.
 
So, did you get the answer to "how the hell this could happen"?
 
This is not an attempt at "insurance fraud." It is very concerning to me that the FSD system has failed to recognize a train crossing twice.

It would appear that you also failed to recognize a train crossing. Twice no less.

Maybe spend some time contemplating how you are utilizing FSD on your Tesla for the future because this could have gone very badly (again)? 🤔

For reference, my Model Y came upon trains at crossings twice during the April free FSD trial (v12.3), and both times it handled the stop at the crossing perfectly. One instance was on a clear sunny day and the other was in light drizzle with good visibility. Both were gated crossings with the gates down and no other cars ahead of me. I think the fog was the problem for you. Was it also a foggy day the first time you almost hit a train? (Serious question, not being snarky.)
 
The article seems to imply that the actor Brad Pitt now avoids Tesla's collision avoidance technology after he rear-ended the car in front of him, which in turn rear-ended another car:

Tesla Glitch Forces Brad Pitt to Ditch Electric Vehicle

That article is actually a really interesting example of the confluence of media bias against Tesla and AI generated content.

Here's how the article you linked describes the accident. Note how it says "this May" implying it happened in May 2024:

"Renowned actor Brad Pitt was involved in a multi-car accident this May in the Los Feliz neighborhood of Los Angeles. The incident, which could have had serious consequences, occurred when the actor, behind the wheel of his Tesla Model S, rear-ended a silver Nissan, which in turn hit a Kia in front of it."


And now read this article about a crash in February 2018: Brad Pitt’s surprising reaction when he allegedly causes a 3-car pile-up

"His Tesla allegedly rear-ended a Nissan Altima, which then crashed into a black Kia truck"

So either Brad Pitt caused two 3-car pile-ups in the exact order of Tesla > Nissan > Kia, or MSN has written a fictional article. And note that the actual crash occurred in 2018, over two years before the first version of FSD went out to Beta testers.
 
really interesting example of the confluence of media bias against Tesla
It's not really an interesting example of anything. It's just garbage. Never ascribe to malice that which can be adequately explained by incompetence.

The vast majority of "hit pieces" out there on Tesla are based on straight-up ignorance of Tesla and their software. And then the misinformation propagates and feeds on itself. Eventually it will feed into the AI training data, which will produce more garbage and get worse and worse and worse. It's the future!

This is why companies have PR teams which work to shape image, which is sometimes nonsense and burnishing the turd (bad), but sometimes it is actually simply correcting falsehoods (good).
 
It's not really an interesting example of anything. It's just garbage. Never ascribe to malice that which can be adequately explained by incompetence.

I don't think it's malicious or incompetent, but rather a strategic decision to run news that will attract the most clicks. In the case of MSN, it's most likely someone prompted an LLM with something like "Based on this dataset of clicks per headline, write an article that will maximize clicks." It's got everything: an embarrassed celebrity, fear of new technology, Tesla in the headline.

If I asked you what percentage of fatal collisions involving Autopilot or FSD made it to national news, what would you guess? I worked it out based on the NHTSA data earlier in the V12 thread, and it turns out the number is 100%. Every single one made headline news. That's too consistent to be incompetence; it's strategy. Just because it's not malicious doesn't mean there isn't a measurable bias.
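(For anyone who wants to redo that check, the calculation is just an intersection of two lists; here is a rough Python sketch with placeholder case IDs rather than the actual NHTSA records or headline matches:)

# Rough sketch of the coverage check (placeholder IDs, not the real data).

fatal_ap_fsd_crashes = {"case_001", "case_002", "case_003"}      # from the NHTSA crash reports
crashes_in_national_news = {"case_001", "case_002", "case_003"}  # matched to headlines by hand

coverage = len(fatal_ap_fsd_crashes & crashes_in_national_news) / len(fatal_ap_fsd_crashes)
print(f"{coverage:.0%} of fatal AP/FSD crashes made national news")  # -> 100%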
 
There are definitely clickbait strategies as well. That could be just as much of an explanation. But in the end, the strategy to counter it has to be the same.

The misperception of risks is not limited to Tesla. The existence of coverage does not by itself prove a strategy or a bias.
 
So either Brad Pitt caused two 3-car pile-ups in the exact order of Tesla > Nissan > Kia, or MSN has written a fictional article. And note that the actual crash occurred in 2018, over two years before the first version of FSD went out to Beta testers.
Surprising no one, it's the same event. Here's a 2018 article with the same pictures, but cropped differently.

 