
Model X Crash on US-101 (Mountain View, CA)

Yes. I don't think FSD software is ready yet. If AP2 still gets confused by lane markings, FSD would be no better. FSD requires AP2 to work very well.

My opinion:
AP2 is a simplified code set that is not representative of FSD. This allows easier updating and feature additions while letting the FSD codebase undergo full tear-ups, if needed, in order to progress.
Once FSD is finalized, EAP will become FSD with some non-safety features turned off (forced driver participation).
 
Good point. However, Tesla has a PR problem with drivers getting killed when AP2 gets confused. I am certain that if they had a fix for that, they would put it in the AP2 software. Crashes and deaths are bad for business when you claim using this is safer.
AP2 is for highway use; FSD is for highway use plus everything else.
In other words, if they had something in the FSD software that would make AP2 safer, they would use it!
 

If they had a fix that was robust and implementable, they probably would. However, this is not functional programming with clear delineations of code functions. This is a big ball of integrated wax: you either get the whole ball of wax, or nothing. I'm guessing the best parts of the FSD development formed the basis for the recent AP2 software improvement (pull out the lane-following sections of the NN, possibly retrain, test, and release), but the object-detection false-positive rate is not where it needs to be yet.
 
Some of you are obviously from states where smoking weed is permissible. Other than Tesla's two vaporware videos, what other evidence is there that FSD code development has even begun? It's only been in the last two months that most AP2 cars could even drive reliably in a straight line without darting into another lane or a barrier. Oh wait! Scratch the barrier part. And then there are those pesky overhanging trees.
 
Hmm... mixed emotions on this one!
Would love for the environs to be a bit cooler...
But I've found lane tracking works MUCH better with white lines on blacktop. White dots on cement are the worst. On some cement highways I've seen alternating white and black lane stripes, which seem to be OK with the camera, so maybe black stripes on a white road would be OK? Not sure.
Driving to Vegas for CES, we were hands-free almost the whole way until about 20 miles out of Vegas, when the road turned from blacktop with white stripes to cement with strange white stripe markings. The system was having such a hard time tracking the lane that I had to give up and actually (gasp) hold the steering wheel :/
 
Looks like ABC7 in San Francisco has been on this case quite frequently:

EXCLUSIVE: Wife of man who died in Tesla crash gives emotional interview to I-Team

EXCLUSIVE: Tesla issues strongest statement yet blaming driver for deadly crash

The victim's wife said: "I just want this tragedy not to happen again to another family."

Her attorney blamed Autopilot: "We believe this would've never happened had this Autopilot never been turned on."

In response, Tesla issued this statement:

"We are very sorry for the family's loss.

According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.

The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang's drive that day.

We empathize with Mr. Huang's family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive."

 

Tesla pulled punches. They didn't mention that he said AP tried to steer him into that barrier 7 to 10 times. I don't play Russian roulette with a revolver even once, let alone squeeze the trigger 100 times until it finally goes off. It's not even the most accurate analogy, because you can look at the road in front of you and know what's "loaded".

His brother said Walter was a careful driver: "He has not gotten a ticket his entire life." BS. I speed every day, but I don't get a ticket every day. Or even every year. So what if he didn't have a ticket? What does that matter? Though if they find one, it's time to roll out the "if it doesn't fit, you must acquit" defense, since it's such a ridiculous, superlative, non-quantifiable, and useless declaration.

Complete BS interview from ABC News. What kind of s*it journalism is it when you don't ask, uh, why would he keep doing this?

I read books to my kids every night as well and I manage to not commit suicide via autopilot in the daytime.

Sorry for his kids, but the wife needs to tell them the truth: "Daddy was NOT paying attention to the road, and daddy made zero attempts to save his own life by hitting the brakes. He'd rather f around with beta software always in development than treat driving as a sacrosanct responsibility, which left his wife a widow and his children orphans. Attempts to shake Tesla down for money, hurting the reputation of all other responsible drivers, aren't going to bring Walter back."

Hundreds of Teslas cross that area every day, which has resulted in over 100,000 trips. No one else managed to kill themselves on that barrier.

Tesla can't state it the way I did, but it's the truth, the whole truth, and nothing but the truth.

I am sorry for the loss, but truth and responsible party must come out.
 
I can't believe that the wife and the personal-injury lawyer don't mention Caltrans at all and their negligent barrier and lane-line maintenance.

Cultural face-saving issues at play too.

I have even more respect for Joshua Brown's family and lawyers for not exploiting a tragic accident that ultimately was due to the negligent acts of the driver and other third parties. Accidents happen.

Thanks to Tesla for giving us the tools to make accidents even less likely to happen to us -- at least those of us who do as instructed every time AP is engaged:

Keep hands on wheel and ready to take over at any time.
 
Tesla certainly is singing a different tune than they do in their sales pitches and demo videos. As my grandma used to say, "What goes around comes around."

I'm not suggesting that the vehicle darted into the barrier, because I wasn't there. But I wish I had a dollar for every time our Tesla has done just that while on Autopilot. "Beta" is being kind, and touting safety based upon Mobileye technology that's no longer even in the car is really disingenuous.
 

Camera-based lane following would presumably be only one input out of several used by a self-driving car.

I would expect the car to know where the lane is supposed to be from high-precision GPS maps, and for Level 3, any discrepancy between the two paths is grounds to alert the driver and let them sort it out.

(For true FSD, the car would have to be programmed to resolve the discrepancy somehow - probably by some combination of watching what other cars do and carefully sorting out the full set of lines and the spaces between them, then submitting a map update if needed. They'd also probably quickly build an automated list of spots where the lines don't read well and include that in the local data tiles along with the radar mapping and such.)

Depending on the precision the radar is capable of, the radar whitelist/map that Tesla has been building since 8.0 came out might be a third source of location information to validate against the other two.
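The cross-checking idea described above can be sketched roughly as follows. This is purely a hypothetical illustration of the poster's "multiple sources, alert on discrepancy" scheme: the source names, the lateral-offset representation, and the 0.5 m threshold are all my assumptions, not anything from Tesla.

```python
# Hypothetical sketch: cross-validate lane-center estimates from several
# sources (camera, HD map, radar whitelist) and alert the driver when
# they disagree -- the Level 3 hand-back behavior described above.
# All names and thresholds are illustrative assumptions.

from statistics import median

DISCREPANCY_THRESHOLD_M = 0.5  # max disagreement before alerting the driver


def lane_center_estimates(camera_m, map_m, radar_m=None):
    """Collect lateral lane-center estimates (meters from vehicle centerline)."""
    estimates = {"camera": camera_m, "map": map_m}
    if radar_m is not None:  # radar localization may not always be available
        estimates["radar"] = radar_m
    return estimates


def check_lane_agreement(estimates, threshold=DISCREPANCY_THRESHOLD_M):
    """Return (fused_estimate, alert).

    The fused estimate is the median of all sources; alert is True when
    any single source deviates from that median by more than the
    threshold, which is when a Level 3 system would hand control back.
    """
    fused = median(estimates.values())
    disagreeing = {name: v for name, v in estimates.items()
                   if abs(v - fused) > threshold}
    return fused, bool(disagreeing)


# Example: the camera reads the lane 1.2 m off while map and radar agree,
# so the discrepancy check fires and the driver would be alerted.
fused, alert = check_lane_agreement(lane_center_estimates(1.2, 0.1, 0.0))
```

With only two sources (camera and map, as in the Level 3 case above), any disagreement beyond the threshold is enough to alert, since there is no third source to break the tie.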
 

Experiences like this prompt the question for the umpteenth time: is Autopilot really many times safer? Or is it many times safer because the driver is the fail-safe backup, and that driver falls into a very safe driving demographic?
 
...Tesla certainly is singing a different tune than they do in their sales pitches and demo videos...

1) Disclosures:

There's a difference in marketing talk and legal one.

Sales talk tends to emphasize the positives, and once a customer has bought, the negatives are disclosed in a document somewhere.

In a simple lawsuit, as long as the disclosures were given to a customer, it's a straightforward case: a signature or consent carries very serious weight in the legal system, regardless of any claim that the signer skipped reading what was signed.

For example, some people lost their homes in the last financial crisis. They claimed they were tricked into signing what they didn't read, but they still lost their homes!

Tesla has plenty of disclosures: in the owner's manual, when Autopilot is enabled from the display screen (once, until it's disabled and enabled again), and when Autopilot is engaged with the physical stalk (each time).

The jurors will decide whether the disclosure was sufficient, or whether it should be like an advertisement for peanut butter that spends 95% of the clip on graphic, deadly peanut-allergy reactions and the remaining 5% saying it tastes good.

2) Liabilities:

I think Tesla is confident that it is legally cleared, as NHTSA found that Tesla was not at fault for the Florida Autopilot fatal accident.

However, remember that NTSB also found that "...the system gave far too much leeway to the driver to divert his attention to something other than driving, the result was a collision that, frankly, should have never happened."

Thus, in the juror selection process, Tesla should pick jurors who are more sympathetic to the legal/bureaucratic view:

"The software and hardware performed within specs and as designed."

and not someone who, like the NTSB, is sympathetic to a more rigid, restrictive system that makes sure no driver can ever be inattentive while driving, which Tesla's design currently lacks.
 

If he wasn't paying attention/fell asleep and plowed into the barrier in the Model X when AP was off, then in that case it's 100% his fault. But *AP* is what eventually steered him right into the barrier. Tesla repeatedly trying to throw a dead man under the bus, saying he was inattentive, received many warnings, blah blah blah, does not change the fact that AP steered his car directly into a concrete barrier at highway speeds. That remains a stone-cold fact.

Tesla is doing everything but digging up his body and spitting on it. They are desperate to maintain the narrative that using AP is much safer than a car without AP, to the extent that they will stomp on a dead man's grave to maintain that narrative. It is frankly disgusting.

Let's compare this to the GM ignition scandal. Do you think the people who died in cars that had their ignitions shut off while the drivers were NOT wearing a seatbelt deserved to be included in the settlement? They ignored a safety feature after all (like you are saying Walter ignored the alerts from AP).