Autonomous Car Progress

Driving 100 hours without an error isn't even close to robotaxi performance.
Yes - not technically. But that is error-free operation for 15k miles in the city.

More like "superhuman"? ;)



ps: But we'll know what the actual error rate is, since there will be millions of Teslas driving nearly a billion miles a month on FSD by then.
 
You shouldn't average 150 mph in the city; that would be unsafe. :p
As I said, you can't extrapolate from the crash type distribution for humans. For example, Waymo's crash type distribution over 6 million miles was nothing like that. They have a much higher ratio of more severe crashes than a human driver. It looks like S1 crashes would count as L1 or L2 crashes on the scale you're referencing (Automated Vehicle Crash Rate Comparison Using Naturalistic Data | Virginia Tech Transportation Institute).
 
You shouldn't average 150 mph in the city; that would be unsafe.
I guess 30 minutes (each way) x 200 days = 3k miles @ 15 mph. Yes, too little to say much.

But with 1 million drivers, that would be 3 billion miles. That tells us a LOT.
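
A quick back-of-the-envelope sketch of that math (the commute pattern, speed, and fleet size are this post's assumptions, not real fleet data):

```python
# Rough fleet-mileage estimate. Every input is an assumption from the
# post above, not actual Tesla fleet data.
hours_per_day = 0.5 * 2      # 30-minute commute, each way
driving_days = 200           # commuting days per year
avg_speed_mph = 15           # assumed city average speed

miles_per_driver = hours_per_day * driving_days * avg_speed_mph
print(miles_per_driver)      # 3000.0 miles per driver per year

drivers = 1_000_000
fleet_miles = miles_per_driver * drivers
print(f"{fleet_miles:.1e}")  # 3.0e+09 -> 3 billion miles per year
```

Nothing fancy, but it confirms the 3k-miles-per-driver and 3-billion-mile figures are internally consistent.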

As I said, you can't extrapolate from the crash type distribution for humans. For example, Waymo's crash type distribution over 6 million miles was nothing like that. They have a much higher ratio of more severe crashes than a human driver. It looks like S1 crashes would count as L1 or L2 crashes on the scale you're referencing (Automated Vehicle Crash Rate Comparison Using Naturalistic Data | Virginia Tech Transportation Institute).
With 18 actual accidents the sample is too small to tell - but even for Waymo, S0s are a lot more common than S1s. In all 4 S1 cases, Waymo was hit.
 
Since we're talking robotaxis, you've got to count the collisions that would have happened without a safety driver.

That gives 30 S0 vs. 17 S1.

Against the VTTI human rates, over 6 million miles we would expect:
650 L1
290 L2
60 L3
21 L4

I suspect most of Waymo's S1 collisions would be classified as L4 (certainly all the airbag deployments; L3's $1,500 damage threshold is nothing, and 1.3g is Plaid acceleration force, haha).
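
For anyone who wants to reproduce the comparison, here's a minimal sketch. The expected human counts are the ones quoted above (derived from the VTTI naturalistic rates over the same 6 million miles), and the Waymo counts include the counterfactual collisions. Note the severity scales don't line up one-to-one, which is exactly the point of this sub-thread, so the total ratio is only rough:

```python
# Waymo's 6M-mile record vs. expected human crashes over the same miles.
# Counts are from the posts above; the severity scales (S0/S1 vs. L1-L4)
# are NOT directly comparable, so treat the ratio as rough.
MILES = 6_000_000

human_expected = {"L1": 650, "L2": 290, "L3": 60, "L4": 21}
waymo = {"S0": 30, "S1": 17}  # includes counterfactual collisions

for sev, count in human_expected.items():
    print(f"human {sev}: {count / (MILES / 1e6):5.1f} per million miles")

total_human = sum(human_expected.values())  # 1021
total_waymo = sum(waymo.values())           # 47
print(f"rough crash ratio (human/Waymo): {total_human / total_waymo:.1f}x")
```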
 
Cruise not performing well and CEO gets the axe?
General Motors Co (GM.N) said on Thursday that Dan Ammann, the chief executive of its majority-owned Cruise self-driving car subsidiary, is leaving the company, effective immediately.

The U.S. automaker did not give a reason for the departure of Ammann, a former GM president and chief financial officer.

Dec 1, Cruise was denied permission to charge fares for its robotaxi service because the city didn't like how its cars always double-parked.
WSJ confirms Mary Barra gave Dan the axe.
 
I think the camera and hands-on-the-wheel nag will make it difficult to goof off. But if that doesn't work, I vote for electric shock therapy. Every time you doze off, the car sends some juice your way. :p
Great idea! Nag or Electro-shock therapy...

April 1, 2022: Elon Musk announces the industry's most advanced and energy-efficient driver monitoring and attention-enforcement system
The new system update is called "Nag or Regenerative electro-shock therapy", aka "NoRest". Utilizing advanced and proprietary Tesla Coils cleverly integrated with the seat heater elements, and regeneratively charged from Autopilot's ubiquitous phantom-braking activity, the feature gives inattentive drivers a single warning notification in a 6-point light-grey message at the bottom of the main screen. Should the driver not correct the situation within 200 milliseconds, or be caught glancing down at the screen in an unsafe attempt to read said notification, the Tesla Coils will be activated via the Internal Sentry network built into the remaining Driver Lumbar Support module, a new hardware element that AP team insiders have affectionately dubbed the Tesla SuperDischarger Network.

In making this announcement, Tesla's CEO TechnoKing has explained that he now considers Driver Inattention to be a "solved problem". Tesla's firmware development team is also expected to incorporate the new Tesla Coil system in advanced in-car role-playing games such as the eagerly-awaited Tase Me Bro, and the military-themed Snap to Attention, both to be released probably in two weeks.
 

Elon Musk Says Tesla Vehicles Will Drive Themselves in Two Years


December 21, 2015 2:00 PM EST


“I think we have all the pieces, and it’s just about refining those pieces, putting them in place, and making sure they work across a huge number of environments—and then we’re done,” Musk told Fortune with assuredness during his commute to SpaceX headquarters in Hawthorne, Calif., where he is also CEO. “It’s a much easier problem than people think it is. But it’s not like George Hotz, a one-guy-and-three-months problem. You know, it’s more like, thousands of people for two years.”

Six years ago today. Was Elon lying or did he just fundamentally misunderstand how difficult solving autonomous driving would be? Those seem to be the only options.
 


Given he has since repeatedly said the problem was much harder than he thought, and this has been pointed out several times before, it's weird you keep being confused about it. Almost like you're acting unsure about it on purpose, one might suggest.
 


Six years ago today. Was Elon lying or did he just fundamentally misunderstand how difficult solving autonomous driving would be? Those seem to be the only options.

Just repeating my post from another thread ... here was a Google exec - was he "lying or did he just fundamentally misunderstand how difficult solving autonomous driving would be? Those seem to be the only options."


This is from 2015 (before someone says "oh, but they have a robotaxi pilot," see what the implication of the highlighted text is):



Google says it is well on its way to launching self-driving cars within five years. Because if they don't, Google director Chris Urmson's son will have to navigate the roads himself—and we can't have that.
During this week's TED conference in Vancouver, Urmson told attendees that his eldest son, currently 11, is set to take his driver's test in a scant four-and-a-half years.
"My team are committed to making sure that doesn't happen," he joked.
 

There is no doubt that everybody underestimated how difficult deploying safe autonomous driving would be. But Google did partially fulfill that promise: Google/Waymo launched true self-driving cars within five years, just not everywhere. So Urmson was not completely wrong; he just missed how long it would take to deploy beyond a limited area.
 
It seems to me that FSD works perfectly well on 99% of paved road segments and that the problem spots could be known and mapped. Why can Tesla not let it drive like a level 3 car and fall back to level 2 well in time before driving into a problem spot?

I think that is a... gross overestimation of how far along FSD is... speaking as someone who has been using it for a couple of months now.

Aside from that, there are many spots where it works "well enough to not cause an accident if there's little to no traffic" but not "perfectly well," and if there IS traffic it's not nearly as great there (the stuff where it'll randomly dive into the wrong lane, for example, and then correct back after - if there are no other cars around, not a big deal; if there are, it's potentially dangerous if you're not actively paying attention).


Ignoring all the rest of that - even if 99% were true, that's way, way too low to be safe.

The math on this has been discussed many times, but you need a bunch of 9s after the 99 before you can think it'd be as safe as a human, let alone safer.
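
A minimal sketch of that math, with illustrative numbers (the human rate of one serious error per 100,000 miles is an assumption for the example, not a measured value):

```python
# Why "99%" is far too low: per-mile failure rates compound over miles,
# so compare them directly to an assumed human rate of one serious
# error per 100,000 miles (illustrative, not a measured figure).
human_error_rate = 1 / 100_000  # assumed serious errors per mile

for nines in range(2, 8):
    p_success = 1 - 10 ** (-nines)  # 0.99, 0.999, 0.9999, ...
    errors_per_mile = 1 - p_success
    ratio = errors_per_mile / human_error_rate
    print(f"{p_success:.7f} success/mile -> {ratio:>10,.3f}x human error rate")
```

On those assumptions, parity with the human baseline only arrives around five nines per mile; a flat 99% is three orders of magnitude worse.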



All THAT said - if going to single stack would actually fix the "hits stationary things partly in your lane" issue on highways, I DO think they could do L3 highways pretty much immediately, since there's so much less complexity there and it handles the stuff that exists very well apart from that.
 
It seems to me that FSD works perfectly well on 99% of paved road segments and that the problem spots could be known and mapped. Why can Tesla not let it drive like a level 3 car and fall back to level 2 well in time before driving into a problem spot?

But not for avoiding stationary obstacles. Crashes just keep happening as discussed in:


Even the new FSD beta still has many issues, from undesirable slowdowns to suddenly swerving toward opposing traffic...
 
errr... that's not FSD.

That's a HW2.5 car on regular AP/NoA.

(and is exactly the one issue I said they needed the FSD stack to fix to make highway driving potentially L3 safe)


Even the new FSD beta still has many issues, from undesirable slowdowns to suddenly swerving toward opposing traffic...

Yup. Though the slowdown thing is vastly better than it was on earlier versions... now it'll sometimes slow down 2-3 mph instead of the ~10 it'd do before when it was confused by opposing traffic or sharp turns. And on highways there is no opposing traffic and no sharp turns...
 
All THAT said - if going to single stack would actually fix the "hits stationary things partly in your lane" issue on highways, I DO think they could do L3 highways pretty much immediately, since there's so much less complexity there and it handles the stuff that exists very well apart from that.
Have to fix phantom braking, too. Of course, they've fixed that a half dozen times already...