
CR Engineers Show a Tesla Will Drive With No One in the Driver's Seat

Tesla claims a nearly 10x lower accident rate with Autopilot. That's a decrease that *no other automotive innovation* has accomplished.
A claim does not make it true (and Tesla doesn't even claim it). Look at the other threads dissecting Tesla's claims on safety, and you'll see the comparison is statistically invalid. So it remains the case that no single automotive innovation has increased overall driving safety by 10X. It's unlikely that Tesla has increased safety by 10X per mile even in the narrow situations where AP can be used.

Tesla's own statistics don't even say AP makes it 10X safer:

In the 1st quarter, we registered one accident for every 4.19 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.05 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 978 thousand miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.

Teslas not on AP are at 1 accident per 2.05 million miles and Teslas on AP are at 1 per 4.19 million, so AP only roughly doubles the figure, and even that ignores the fact that AP can't be engaged in difficult driving situations. Their claim is that a Tesla on AP is 10X safer than the average car, but roughly 2X of that comes from the driver population and another >2X comes from the non-AP active safety features that many new cars have versus the average 12-year-old car on the road today (even a 9-year-old Tesla didn't have this stuff).
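
To make the ratios explicit, here is a quick back-of-the-envelope sketch using only the figures quoted above (rounded, and with all of the statistical caveats already mentioned; the labels reflect this reading of the numbers, not Tesla's):

```python
# Miles per reported accident, from the figures quoted above.
nhtsa_avg       = 484_000     # any US vehicle (NHTSA average)
tesla_no_safety = 978_000     # Tesla, no AP, no active safety features
tesla_safety    = 2_050_000   # Tesla, active safety features only
tesla_ap        = 4_190_000   # Tesla, Autopilot engaged

print(f"driver/fleet effect:    {tesla_no_safety / nhtsa_avg:.1f}x")     # ~2.0x
print(f"active safety features: {tesla_safety / tesla_no_safety:.1f}x")  # ~2.1x
print(f"Autopilot itself:       {tesla_ap / tesla_safety:.1f}x")         # ~2.0x
print(f"headline comparison:    {tesla_ap / nhtsa_avg:.1f}x")            # ~8.7x
```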
 
Their claim is that a Tesla on AP is 10X safer than the average car, but roughly 2X of that comes from the driver population and another >2X comes from the non-AP active safety features that many new cars have versus the average 12-year-old car on the road today (even a 9-year-old Tesla didn't have this stuff).
I would argue that Tesla drivers on Autopilot are safer drivers just because we are forced to pay attention more. “What is it trying to do? Why is it phantom braking now?!?”
 
I would argue that Tesla drivers on Autopilot are safer drivers just because we are forced to pay attention more. “What is it trying to do? Why is it phantom braking now?!?”
Safety systems help drivers of all levels, and safer drivers pay more for additional safety features.
This particular safety system has a human backup driver who happens to come from the safest driver demographic there is.
 
...


The only step that might not be easy is for an arthritic driver to move over to another seat. But otherwise, the procedure is not hard at all.

...

The procedures outlined require specific intent to defeat the safety interlocks. They are not simply actions that a complacent or lazy person is apt to do unintentionally and put themselves at risk. The specific intent is where you have to start drawing the line to assign liability to the users and not the technology provider.

I am all for constructive criticism in exposing true weaknesses in safety systems, and in my opinion, there are plenty of systems that Tesla could improve in their cars. But trying to prevent users from intentionally defeating adequate safety interlocks or even just common sense such as remaining in the driver's seat so as to be able to quickly react to an emergency is frankly a waste of resources.

And yes, without the ability to have the two passengers able to explain exactly what happened we cannot 100% know their actions and motivations, but I personally have a very hard time explaining how they could wind up in that situation without some kind of willful attempt to put themselves in a very dangerous situation. Perhaps you can offer a theory as to how this could have happened accidentally in such a way that it was somehow Tesla's fault for not considering a potential scenario in which the driver would need to vacate the driver's seat while driving the vehicle.

Sure, maybe as you say sitting on the seatbelt is a common practice done by people who hate seatbelts... so why aren't there articles by CR showing how every single car out there actually drives just fine if you are not actively wearing the seatbelt?
 
...Sure, maybe as you say sitting on the seatbelt is a common practice done by people who hate seatbelts... so why aren't there articles by CR showing how every single car out there actually drives just fine if you are not actively wearing the seatbelt?...
They can test whether other cars allow sitting on a buckled seatbelt instead of wearing one. Still, the result will be the same for those cars: without a competent human driver to steer, accelerate, and brake, an undesirable result will happen. With a competent driver, regardless of the seatbelt status, the test trip would be fine with no crashes.


...The procedures outlined require specific intent to defeat the safety interlocks...
No doubt. The relatives reported the intent: the two men went out for a test drive/to test out Autopilot.


...They are not simply actions that a complacent or lazy person is apt to do unintentionally and put themselves at risk...

With Tesla, there are people who want to test the system out to see what it can do. Some would brake too late because they thought they could wait a fraction of a second longer to see whether the system would do it for them in time. There are many reports of low-speed Summon damage because people intentionally wanted to test the system out to the max.

I think part of the reason people keep intentionally testing the system is that they keep hearing from Tesla that the system is getting so much better, drastically better...

...But trying to prevent users from intentionally defeating adequate safety interlocks or even just common sense such as remaining in the driver's seat so as to be able to quickly react to an emergency is frankly a waste of resources....
The bar the NTSB is asking for is quite low: a driver camera monitoring system. And now the bar from Consumer Reports is even lower: incorporate the seat occupancy sensor into Autopilot eligibility. Tesla could write a few lines of code and that would satisfy Consumer Reports' demand.
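
To show how small that ask is, here is a purely hypothetical sketch of the kind of eligibility check CR is describing. The signal names are invented for illustration; this is not Tesla's actual firmware:

```python
# Hypothetical illustration of adding seat occupancy to the Autopilot
# eligibility check. All signal names here are invented for this example.
def autopilot_may_stay_engaged(driver_seat_occupied: bool,
                               driver_belt_buckled: bool,
                               steering_torque_recent: bool) -> bool:
    # Refuse to engage (or stay engaged) the moment the driver's seat
    # occupancy sensor reports empty, regardless of the other interlocks.
    if not driver_seat_occupied:
        return False
    # Keep the existing interlocks: belt buckled, periodic wheel torque.
    return driver_belt_buckled and steering_torque_recent
```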

...Perhaps you can offer a theory as to how this could have happened accidentally in such a way that it was somehow Tesla's fault for not considering a potential scenario in which the driver would need to vacate the driver's seat while driving the vehicle...

NTSB and Consumer Reports are not there to trash Tesla.

They are there to advise Tesla on how to save lives, and they both also educate drivers on how not to lose theirs.
 
I'm sorry, but for someone to blatantly defeat security measures like this, it's their own damn fault. Darwinism at work here. Tesla has zero blame for this in my opinion. Sorry (not really) if that sounds callous.

Think about this: I rented a RAV4 recently and it had lane centering and TACC. All it had for security (that I'm aware of) was a wheel nag like Tesla's. If someone defeated the security measures the same way in the RAV4 and everyone died, there is NO WAY it would be blown out of proportion like it is here.
 
...NTSB and Consumer Reports are not there to trash Tesla. They are there to advise Tesla on how to save lives, and they both also educate drivers on how not to lose theirs...
Then the issue becomes Elon's personality. Any critique like this is taken as some sort of personal attack. NTSB and CR certainly aren't shorting the stock. If they were, it would be an idiotic venture. Most other companies would have a legit PR response. Tesla should take a hard look at some of these defeats, make some minor changes, and then head over to CR and demonstrate. That would make for some good publicity.

Stuff like this takes me back to the guy who accidentally summoned his Tesla into the back of a logging truck.
The guy parked, got out, went into a store, came out, and his windshield was impaled on the logs... obviously he is angry at Tesla.
Tesla and followers go on the attack: the guy is a liar, the guy is this and that.
Tesla releases the logs: guy parked, Summon activated, door opened 1 second later, closed 1 second later, 30 seconds later the car hits the truck. 2 minutes later the door opens again (when the guy comes back).

At the time, Summon would activate and the screen would prompt you to deactivate it... otherwise it kept going.
Tesla: system works, driver error, he is lying, etc., etc.

Later on, Tesla quietly updated Summon... instead of a 'deactivate' prompt, the prompt now asks you to 'activate'; otherwise it auto-deactivates.
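
In other words, the prompt went from opt-out to opt-in. A toy sketch of the difference, purely for illustration (not Tesla's actual logic):

```python
# Toy comparison of the two confirmation styles described above.
def summon_runs_opt_out(user_cancelled: bool) -> bool:
    # Old behavior as described: Summon proceeds unless the user cancels in time.
    return not user_cancelled

def summon_runs_opt_in(user_confirmed: bool) -> bool:
    # Updated behavior as described: Summon proceeds only if the user confirms.
    return user_confirmed
```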

Rather than go through all of this BS and fighting, they could have simply said:
"We can see how this feature could accidentally be activated and have updated the process. The driver will be prompted to confirm Summon activation."
Replace the guy's windshield and move on.
Instead, we had a month-long online fist fight.
 
CR, ban Lexus, seriously. (by the way CR gave the highest scores to Openpilot because it could watch the driver. Guess what... all software is hackable)
lexusmustdie.jpg
 
Also, thinking about this: every time I get a new truck, I go through the motions of turning off the incredibly annoying driver seatbelt alarm.
In a parking lot... or the driveway... or pickup line at school... beep beep beep beep beep SHUT UP!!!

In order to turn the alarm off you have to do the following:
Vehicle off, parking brake on, belt unbuckled.
Turn on vehicle, but don't start ignition.
Wait 60 seconds until seatbelt notice turns off.
Buckle and unbuckle 3 times, leaving it unbuckled at end.
Safety belt warning will come on again.
Buckle and unbuckle once.
Wait for safety belt warning to blink (confirmation)
Done.

It can't happen by accident; the owner has to purposefully go through these motions. And it sucks that safety features have to be dumbed down for dumb people, but that's our lawsuit-happy world.

So far we've seen highly intelligent, successful people make very dumb decisions with their Tesla. What happens when teenagers can start buying these on used car lots? For every 1 guy who will try to pull off a no-driver showoff stunt, you will now have 100 doing it.
 
Hate to be insensitive, but a couple of dudes f'd up and killed themselves. I feel sad for their families and loved ones. Thank God they didn't kill or injure anyone else.

What CR is doing is just absolute FUD, and society/the public loves this sh.......tuff. So many armchair quarterbacks who have no clue are now coming up to me, since they know I have a Tesla, and sharing their uneducated opinions with me. I'm sure most (if not all) of us here are getting the same unsolicited opinions.

What's worse, they're all quoting FUD articles and putting their own twist on it. God help us all!

LOL
 
Liability-wise, the dos and don'ts are in the manual. Not reading the manual is no excuse; it hasn't been for a while. And yeah, I read mine. Not entirely, but probably more than most people.

Coming from a different country, I find it funny how in the US the perception is that no one wants to be responsible for their own *sugar*. You have too many laws against idiotic actions.

There's a sign that says "Danger: Don't jump off the cliff." Then your sibling does and, predictably, dies. Then you are hurt and try to sue the cliff owner, because there was a sign (which in my opinion should be unnecessary) but no fence.

And then there's a fence
And then someone jumps the fence
And then they add a sign saying "Don't jump the fence"
And then someone digs under the fence
And then they add concrete around the fence
And then someone finds a hole in the fence
And then they hire someone to maintain the fence

I mean... Back where I came from, you jump off a cliff, there will be no PR statement.
Everyone knows you messed up and that was a possible outcome. Even you.
 
CR is a joke. They could get in any car and put a brick on the gas pedal and it will go. The difference is you have a chance of your Tesla not crashing doing what they did.



Do other automakers claim to sell products like "Autopilot" and "Full Self Driving?"

The point was not to show that a Tesla could, sometimes, navigate its own way. The point was to show how easy (or not) it is to get the Autopilot system to run without a driver in the driver's seat. Not comparable to a hypothetical brick scenario.
 
...Do other automakers claim to sell products like "Autopilot" and "Full Self Driving?" ... The point was to show how easy (or not) it is to get the Autopilot system to run without a driver in the driver's seat. Not comparable to a hypothetical brick scenario...
That's the difference. CR's point is to warn about "steering automation." Self-driving, automated steering, automated navigation, whatever it's called, these are extraordinary new products for the general public and they require extraordinary safety systems.

This is not a brick-on-the-accelerator problem, and CR knows it. These cars steer (and more; see FSD Beta). This is what they say:

CR and other safety advocates—including the Insurance Institute for Highway Safety—recommend that all vehicles that incorporate steering automation and adaptive cruise control also include systems to make sure drivers are present and looking at the road

People are already doing foolish things like pretending to sleep in the back seat by bypassing the existing systems. I'll take the next step: once these cars are out there, what's to stop little Timmy from FSD'ing to the mall? Hopefully some system that ensures he's the owner, not a child. You can call it FUD if you want, and say "but kids take their parents' cars on joy rides all the time." Yes, but they don't usually get far, usually crashing, because they don't know how to drive. A car that CAN drive is just going to encourage people to abuse it, sleep, watch a movie, etc. As long as it's Level 2, we should at least try to make the obstacles to abuse as strict as possible.
 
People keep saying the difference here is that Tesla has named the features Autopilot and Full Self-Driving. Where is the data that shows the names are the problem? You can't just claim it as fact. Has there been a survey asking customers whether they think the car will drive with them asleep or in the back seat? Of course not. Everyone just wants to claim people are too stupid to understand how the car works, to keep them from having to take responsibility for their own actions.
 