Tesla promoting self-driving video was staged

Which part of the article is untrue? The author isn't editorializing about the testimony; he's just reporting what was said. The closest thing to opinion in the article was the line "their propensity to crash into things," but he linked that to an article discussing NHTSA data that suggests it's true.
 
...Should other AV companies be prevented from posting demo videos of software not available to the public?
I don't think that is what this lawsuit is about.

If they show me a model house so I can see what the price tag would include: 2,000 ft², 4 bedrooms, 4 baths, a 4-car garage...

But if, after I paid in full, they gave me only 500 ft², 1 bedroom, a half bath, and no garage, then it's problematic.

It's about truth in advertising: for a price, your car drives itself without interventions, but they left out of the video the part where the demo car collided with a fence, and please watch out for those concrete medians in Mountain View, CA!
 
..."The route had been mapped in advance" meaning the car was just following a navigation route or.. what else could it mean and why would it be negative? This is essentially what autonomous driving means. Like, should you expect to get in a Waymo without giving it a destination and it should... do what...
That's against Tesla’s philosophy. It aims for a generalized solution: Tesla Vision will save the day!

The argument against Waymo's method is that it requires advance homework, staging, preparation, and pre-mapping: it's just too costly, too labor-intensive, and very slow to scale up globally.

Waymo doesn't collide with a fence in Phoenix or San Francisco because it has done its homework there. All bets are off for unprepared locations like Detroit or Washington, DC...
 
I don't think that is what this lawsuit is about.

If they show me a model house so I can see what the price tag would include: 2,000 ft², 4 bedrooms, 4 baths, a 4-car garage...

But if, after I paid in full, they gave me only 500 ft², 1 bedroom, a half bath, and no garage, then it's problematic.

It's about truth in advertising: for a price, your car drives itself without interventions, but they left out of the video the part where the demo car collided with a fence, and please watch out for those concrete medians in Mountain View, CA!
It wasn't a demo of what you could buy. They never portrayed it that way. 2016 was way before FSDb was available to anyone outside of Tesla. The demo was of what turned into FSDb, and it was not a demo of AP.
 
It wasn't a demo of what you could buy. They never portrayed it that way. 2016 was way before FSDb was available to anyone outside of Tesla. The demo was of what turned into FSDb, and it was not a demo of AP.

The video displayed this text on the screen:

"The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself."

That is false, because the Tesla director admitted under oath that the car collided with a fence: the car could not drive itself without colliding with the fence.

They should have disclosed a different message on the screen: "Although the car did crash, we expect to work out all the bugs so it can drive itself when the technology allows."
 
That's against Tesla’s philosophy. It aims for a generalized solution: Tesla Vision will save the day!

The argument against Waymo's method is that it requires advance homework, staging, preparation, and pre-mapping: too costly, too labor-intensive, and very slow to scale up globally.

Waymo doesn't collide with a fence in Phoenix or San Francisco because it has done its homework there. All bets are off for unprepared locations like Detroit or Washington, DC...
You misunderstand. No one is saying anything about HD maps. The argument seems illogical: you can't generalize autonomous driving without a destination. That destination is calculated as a navigation route (which turns to take where, like way back when with MapQuest). "Tesla used 3D mapping on a predetermined route" seems to just mean "navigation steps calculated," which is what every AV company is trying to do and what every customer is wanting/hoping for in an AV. What else would it mean in a fraud sense? E.g., Tesla told the car how many feet to drive forward, what angle to turn at, then drive forward on a 2° right curve for 426 ft, etc.? It doesn't even make sense how a "predetermined route" would ever not be the case. Unless you don't have a destination, which means your car shouldn't make any turns, which means it's not autonomous.
 
The video displayed this text on the screen:

"The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself."

That is false, because the Tesla director admitted under oath that the car collided with a fence: the car could not drive itself without colliding with the fence.

They should have disclosed a different message on the screen: "Although the car did crash, we expect to work out all the bugs so it can drive itself when the technology allows."
It crashed when it was trying to park "with no driver," which I'm thinking means there wasn't even a safety driver in the car at that moment.

Yes, this was a video demo of FSD, which Tesla was still developing. Don't people remember what we had in 2016? We didn't have NoA, FSDb, etc.; we had basic AP, and most Teslas at the time only had forward-facing cameras.
 
Which part of the article is untrue? The author isn't editorializing about the testimony; he's just reporting what was said. The closest thing to opinion in the article was the line "their propensity to crash into things," but he linked that to an article discussing NHTSA data that suggests it's true.
These parts (which are misrepresentations at worst, misunderstandings at best): "Tesla promoting self-driving video was staged"
 
You misunderstand. No one is saying anything about HD maps. The argument seems illogical: you can't generalize autonomous driving without a destination. That destination is calculated as a navigation route (which turns to take where, like way back when with MapQuest). "Tesla used 3D mapping on a predetermined route" seems to just mean "navigation steps calculated," which is what every AV company is trying to do and what every customer is wanting/hoping for in an AV. What else would it mean in a fraud sense? E.g., Tesla told the car how many feet to drive forward, what angle to turn at, then drive forward on a 2° right curve for 426 ft, etc.? It doesn't even make sense how a "predetermined route" would ever not be the case. Unless you don't have a destination, which means your car shouldn't make any turns, which means it's not autonomous.

There's no crime in staging, except when it's set up to deceive.

If the demo house included a golden toilet, and I was guaranteed that it was included in the price, but when I paid in full I got a porcelain toilet, that's deception.

Same with Tesla: it prepared thoroughly for the demo route. But when I paid $15,000, I didn't get that kind of preparation for my commute. The car keeps choosing the wrong lane because my route isn't prepared in advance like the demo route was!
 
Here's an Ars Technica article from 2020 about the crash:


The hearing determined that the driver was playing a game on his phone 30 seconds before the crash and had previously experienced glitches in the same area where the crash occurred. He was an Apple engineer, not someone unaccustomed to technology or a proverbial "babe in the woods".

And Caltrans, California's highway agency, had failed to replace a damaged crash attenuator in front of the concrete gore, which would have saved his life.
 
There's no crime in staging, except when it's set up to deceive.

If the demo house included a golden toilet, and I was guaranteed that it was included in the price, but when I paid in full I got a porcelain toilet, that's deception.

Same with Tesla: it prepared thoroughly for the demo route. But when I paid $15,000, I didn't get that kind of preparation for my commute. The car keeps choosing the wrong lane because my route isn't prepared in advance like the demo route was!
But that's what's not clear here: did they intend to deceive?

Looking at the website back then (closest archive is from a month after the original video and a week after the second FSD demo video), they seem to be promising an upcoming feature set and not a currently available feature set. Their timeline was way too optimistic (like most in the AV industry back then), but they could also hide behind the “dependent upon… regulatory approval” etc. disclaimers.

People should take (another) look at what the world looked like back then and remember how different the AV space was in Oct and Nov 2016:

  1. We didn't have Smart Summon (no, it's still not that great, even if it's amazing compared to all other cars)
  2. We didn’t have NoAP
  3. The vast majority of Teslas on the road at that point (2012 through early 2016 Model S) only had HW1 (only forward-facing and backup cams, no side pillar or fender cams)
  4. We didn't have FSDb. YouTubers didn't even have FSDb yet. Heck, even the name/term/label "FSD beta" wasn't known/used yet.
  5. Oct 20, 2016 video in question (on Tesla's website because maybe there's nothing to hide?): Full Self-Driving Hardware on All Teslas | Tesla. Notice that at the end of this video the car turns into the parking lot, the safety driver gets out, and then the car cringingly slowly makes its way around the small parking lot and out before trying to park (all without anyone inside the Tesla). I'm guessing from the testimony language that this is when the car crashed into a fence during a previous take, not during this take and not while driving for miles on public roads. This video has some cuts, but it still looks like one take.
  6. Nov 18, 2016 video with a much more complicated route (filmed less than a month later): Tesla Self-Driving Demonstration | Tesla Canada. This video was available on the product pages and the Autopilot info page linked below. Notice this demo did NOT have the car try to navigate around the Tesla office parking lot. Also notice how scarily close it gets to pedestrians, dog walkers, etc. compared to what we have today. There's also no "FSDb" visualization on the displays (no cones or trash cans to get excited about yet! Haha). The only FSDb-like visualization we get is the 3 cams synced on the side. It's not as nice as the very first FSDb videos YouTubers posted early on. The video is sped up, but it's still one take. Even more clear in this next video…
  7. Nov 19, 2016 video someone posted of the same video released by Tesla a day before, but slowed down to more closely resemble real time (aka slow and monotonous to watch through). You really get to see how this was one take (who knows how many tries it took back then to get a good take).
  8. Nov 29, 2016 Archive.org capture of Tesla's website info on Autopilot: https://web.archive.org/web/20161129012459/https://www.tesla.com/autopilot. Notice the bolded disclaimer to hedge their bets. In other words, they were way off, but it doesn't seem like they were intentionally tricking anyone who read their website.
 
Here's an Ars Technica article from 2020 about the crash:


The hearing determined that the driver was playing a game on his phone 30 seconds before the crash and had previously experienced glitches in the same area where the crash occurred. He was an Apple engineer, not someone unaccustomed to technology or a proverbial "babe in the woods".

And Caltrans, California's highway agency, had failed to replace a damaged crash attenuator in front of the concrete gore, which would have saved his life.
The lane markings were also not clear, and the construction caution signs had fallen down (not readable). I could try to find them again, but someone had posted pics of the area from just before the accident showing how a computer vision system that needs to read lane lines could understandably have been fooled. Playing a game on his phone (not paying attention to the road, no hands on the wheel) for at least 30 seconds before the crash seems nuts even to me, someone who used and tested AP for years and then FSDb for the last year.
 
Looking at the website back then (closest archive is from a month after the original video and a week after the second FSD demo video), they seem to be promising an upcoming feature set and not a currently available feature set. Their timeline was way too optimistic (like most in the AV industry back then), but they could also hide behind the “dependent upon… regulatory approval” etc. disclaimers.
From the archived page you linked to:

Build upon Enhanced Autopilot and order Full Self-Driving Capability on your Tesla. This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances, [...]
Please note that Self-Driving functionality is dependent upon extensive software validation and regulatory approval, which may vary widely by jurisdiction.

Combined with Elon's claim that by 2017 a Tesla would drive from L.A. to New York with no human intervention, and combined with the misleading video (using software that would never be available to the public), it sure seemed like Tesla had already solved FSD and the only hurdles left were validation and regulatory approval. I see nothing here for them to hide behind.

Elon bet the ranch that they would solve FSD soon enough that this misleading information wouldn't matter. He lost that bet. They may have had good intentions in their hearts, but they intentionally deceived and misled people. They knew it wasn't ready yet. They knew there were much bigger hurdles than just validation and regulatory approval. They knew the video didn't depict software that would ever be made available to the public.
 
From the archived page you linked to:

Build upon Enhanced Autopilot and order Full Self-Driving Capability on your Tesla. This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances, [...]
Please note that Self-Driving functionality is dependent upon extensive software validation and regulatory approval, which may vary widely by jurisdiction.

Combined with Elon's claim that by 2017 a Tesla would drive from L.A. to New York with no human intervention, and combined with the misleading video (using software that would never be available to the public), it sure seemed like Tesla had already solved FSD and the only hurdles left were validation and regulatory approval. I see nothing here for them to hide behind.

Elon bet the ranch that they would solve FSD soon enough that this misleading information wouldn't matter. He lost that bet. They may have had good intentions in their hearts, but they intentionally deceived and misled people. They knew it wasn't ready yet. They knew there were much bigger hurdles than just validation and regulatory approval. They knew the video didn't depict software that would ever be made available to the public.
This requires a lot of assumed intentions and speculation. Keep in mind, I'm not assuming the opposite (however one defines what that would be), but rather that the website and the disclaimers available at the time hedged their bets. I knew I wasn't buying an autonomous vehicle. I have yet to find an owner who thinks they did.
 
Wasn't an older Model S involved in the recent San Francisco Bay Bridge incident?
Any details about which Autopilot version was used?
A little more confirmation: the car's ADAS was active according to the government data, which doesn't contradict what the driver claimed: he wasn't driving; he let the system drive for him, change lanes for him, and even slow down to 7 mph in a 50 mph zone.

We still don't know the model year, the software version...

Tesla-induced pileup involved driver-assist tech, government data reveals

 