Welcome to Tesla Motors Club

Autonomous Car Progress

And not a single real-world stat to show for it! That's how well used it is! :rolleyes:
Supercruise: 0 incidents/accidents in over 6 million miles.
The reason automakers don't go around parading these stats and comparing them to the NHTSA accident stat is that they know the numbers are meaningless and not comparable. But leave it to Tesla to spread misinformation with them.
 
Not sure this is the appropriate thread (feel free to delete or move if needed), but tomorrow is the two-year anniversary of Tesla's Autonomy Day, and I'm struck by how few of their claims/predictions have come true in that time - both as it relates to FSD and to their competitors who would "for sure" have to dump LIDAR.
 
... tomorrow is the two-year anniversary of Tesla's Autonomy Day, and I'm struck by how few of their claims/predictions have come true in that time - both as it relates to FSD and to their competitors who would "for sure" have to dump LIDAR.
Interested in more details about predictions. I'm not seeing any LIDAR company being successful. I would classify Tesla as being successful because they are charging $10K.
 
Interested in more details about predictions. I'm not seeing any LIDAR company being successful. I would classify Tesla as being successful because they are charging $10K.

So, how many FSD orders has Tesla gotten? When Tesla's numbers are big, it often brags about them publicly, but for FSD orders? Never!

It is speculative to assume that FSD is a commercial success for Tesla, but when Tesla was talking about LIDAR, it was talking about the science, the function, the "crutch" that supposedly impairs the progress of autonomy safety.

The fact is: when LIDAR has been used in the field, it has been a scientific success, preventing collisions time after time, year after year.

The Uber fatal accident was an example of NOT USING LIDAR in the field. The LIDAR was there on the vehicle as decoration; it was actually disconnected from the system, which relied on the human driver instead of LIDAR for that campaign. It was practically an anti-LIDAR campaign, and the poor safety driver didn't even know until it was too late!

Back to Tesla: whatever system Tesla has, there's no proof that it can reliably stop for stationary obstacles at freeway speeds.

Some argue that the current Tesla is not L5, so cut it some slack; it will get there soon.

Duh! Others never started with L5 either, and by using LIDAR they have had zero deaths in their history.

In summary: success in money with deaths, or success in science with zero deaths?
 
Interested in more details about predictions. I'm not seeing any LIDAR company being successful. I would classify Tesla as being successful because they are charging $10K.

And sorry, I meant to give you more info on the details about the predictions. These are just a few, from 4/22/2019 - Tesla Autonomy Day Video & Dozens Of Quotes

Elon Musk emphasizes, “Lidar is a fool’s errand. And anyone relying on Lidar is doomed.”
Further, “I think if somebody started today and they were really good, they might have something like what we have today in 3 years, but in 2 years, ours will be 3 times better. … A year from now, we’ll have over a million cars with Full Self Driving computer, hardware, everything.”
Regarding other self-driving vehicle firms using lidar, Elon notes strongly, “they’re all going to dump lidar, mark my words. … In cars, it’s freakin’ stupid. It’s expensive and unnecessary, and as Andrej was saying, once you solve vision, it’s worthless. So you have expensive hardware that’s worthless on the car.”
Elon: “There’s 3 steps to self-driving: there’s feature complete, then there’s feature complete to the degree that … where we think that the person in the car does not need to pay attention, then there’s at a reliability level where we also convince regulators that that is true.

“We expect to be feature complete in self driving this year, and we expect to be confident enough from our standpoint to say that we think people do not need to touch the wheel and can look out the window sometime probably around … in the second quarter of next year. And we expect to get regulatory approval, at least in some jurisdictions, for that towards the end of next year. That’s roughly the timeline that I expect things to go on. …
Elon: “Next year, we’ll expand the product line with Model Y and Semi and we expect to have the first operating robotaxis next year, with no one in them … next year.”
“At some point, you won’t need steering wheels or pedals and we’ll just delete those. As these things become less and less important, we’ll just delete parts. Probably 2 years from now, we’ll make a car that has no steering wheels or pedals. If we need to accelerate that timing, we can always just delete parts, easy. Probably, I’d say long term, 3 years, robotaxis with deleted parts, maybe it ends up being $25,000 or less.”

Of course, Elon has been promising Level 5 for a while - Elon Musk clarifies Tesla's plan for level 5 fully autonomous driving: 2 years away from sleeping in the car - Electrek
 
That is far from the truth, because in Autopilot mode the driver can always instantly brake, steer, and accelerate. The human driver is always in full control.

Whom have we sacrificed? Has anybody been killed by the autopilot? Or have people killed themselves by misusing the autopilot? These are two very different things.
I don't think you have read up on the "Swiss cheese model" of safety. The idea is that you have different layers of defense (sheets of cheese), each with smaller or bigger risks (holes in the cheese): human, technical, weather, etc. Humans make mistakes, and that is hard to reduce. A catastrophe is not the result of one single action; it is the combination of many different risks lining up at the same time. Therefore, by identifying risks in the other layers, you can design them to mitigate the risk, so that human mistakes (or misuse) do not result in a catastrophe.
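The Swiss cheese idea can be sketched numerically: if the layers fail roughly independently, the chance of everything failing at once is the product of each layer's failure chance, so every extra layer shrinks the risk multiplicatively. All numbers below are invented placeholders purely to illustrate the arithmetic, and `catastrophe_probability` is a hypothetical helper, not anything from a real safety analysis.

```python
# Back-of-envelope illustration of the Swiss cheese model: a catastrophe
# happens only when the "holes" in every defensive layer line up, so with
# roughly independent layers the failure probabilities multiply.
# All probabilities here are made-up placeholders, not real safety data.
from math import prod

def catastrophe_probability(layer_failure_probs):
    """P(catastrophe) when every independent layer must fail at once."""
    return prod(layer_failure_probs)

# Hypothetical per-trip failure chances: human, driver monitoring, system.
human_only = catastrophe_probability([0.01])
with_monitoring = catastrophe_probability([0.01, 0.1])
with_monitoring_and_system = catastrophe_probability([0.01, 0.1, 0.05])

print(human_only)
print(with_monitoring)
print(with_monitoring_and_system)
```

The point of the model is visible in the numbers: a distracted human alone is one hole, but adding an attentive-driver check and a capable system means all three holes have to line up before anything bad happens.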

We know people have slept in a Tesla on AP and survived to tell the story. Add some additional incident at that moment and the result could have been the opposite.

In detail, to reduce the risk of disaster, Tesla could change the following in AP.

- Tesla AP can be tricked by an apple or a water bottle. (it could use a capacitive sensor that responds only to hand touch)
- the old nags were way too infrequent (they fixed this, but they could be even more frequent)
- the "pling-plong" says "car is in control" (it could have no sound and a slow onset of steering assist)
- the "plong-pling" says "driver has control" (it could be totally silent)
- if you try to steer mildly, the car presses the steering wheel in the other direction (a blended driver/system approach could mitigate this)
- if you try to steer harder, the car firmly resists for a split second, resulting in the vehicle jerking a bit (it could never resist steering wheel input)
- the car tries to steer in turns it is not yet designed for (other cars disengage much earlier; it could disengage without warning)
- it will allow engagement on roads not suited to the capabilities of the system (it could be strictly limited to highways only)
- it will go really fast even when the radar has too short a range to handle an obstacle in front (a speed limit could be enforced)
- it will continue to go straight if you put on the blinker (it could disengage while blinking, letting the driver change lanes manually (AP only))
- EAP will try to auto lane change, which is cool but slow, and implies "car is in total control" (it could let the driver do a manual lane change)
- it will not warn when there is a conflict, or when it chooses the wrong lane marker or crack to follow (a "system confidence" threshold could trigger frequent warnings)

Then you have the narrative on FSD, the old 2016 video, robotaxi, the fan crowd, Tesla's autonomy leadership, etc., all of which make owners even more complacent, adding to the risk. Making a bigger difference in the UI between AP/EAP and FSD could be a smart move.
 
Supercruise: 0 incidents/accidents in over 6 million miles.
The reason automakers don't go around parading these stats and comparing them to the NHTSA accident stat is that they know the numbers are meaningless and not comparable. But leave it to Tesla to spread misinformation with them.
The Supercruise stat is not true, according to this:
I know this isn't true because my friend got hit by one and sued GM, and they settled out of court.
 
The Supercruise stat is not true, according to this:

Yeah, it isn't true because a random poster who signed up on March 29th, 2021, with 10 posts and no details said so.
Do you use the same logic to refute anything Tesla says?

Actually, you don't. Instead you take anything Tesla says as truth, and you fight against anyone who dares utter anything against it, even when they come with a mountain of evidence.
 
Not sure this is the appropriate thread (feel free to delete or move if needed), but tomorrow is the two-year anniversary of Tesla's Autonomy Day, and I'm struck by how few of their claims/predictions have come true in that time - both as it relates to FSD and to their competitors who would "for sure" have to dump LIDAR.

True, but obviously the LIDAR claim is linked to their own vision-dependent progress.

Companies would only drop LIDAR if it were proven that a vision-only system can compete on performance.
 
Yeah, it isn't true because a random poster who signed up on March 29th, 2021, with 10 posts and no details said so.
Do you use the same logic to refute anything Tesla says?

Actually, you don't. Instead you take anything Tesla says as truth, and you fight against anyone who dares utter anything against it, even when they come with a mountain of evidence.
Sure, we do need more details, but Tesla isn't the one claiming zero incidents here. It only takes one incident to dispute the GM account, since they claim zero incidents.

As for Tesla, there has been no compilation of data (even counting individual reports online) that suggests Tesla's stat is incorrect. People who naysay Tesla magnify anecdotal incidents but present no numbers (even ballpark ones) suggesting Tesla's stats are wrong.
 
Sure, we do need more details, but Tesla isn't the one claiming zero incidents here. It only takes one incident to dispute the GM account, since they claim zero incidents.

As for Tesla, there has been no compilation of data (even counting individual reports online) that suggests Tesla's stat is incorrect. People who naysay Tesla magnify anecdotal incidents but present no numbers (even ballpark ones) suggesting Tesla's stats are wrong.
People who naysay Tesla point out that the company uses that data to brag about something that isn't reality.
It's like Apple comparing their flagship phone to some $150 junk phone made in 2015 and patting themselves on the back.
They don't. They compare to Samsung. Similar product, similar performance, similar price, similar customer.
 
As for Tesla, there has been no compilation of data (even counting individual reports online) that suggests Tesla's stat is incorrect.

Where would someone who isn't Tesla get such data from?

Nobody else would have info like the state of AP, or active safety features, during all crashes for example.



I think Tesla's data is accurate for the narrowest possible definition of accurate, but there have been a lot of valid reasons given for why it's not nearly as informative as Tesla implies it is, and why you can't validly draw broad conclusions like "DRIVING ON AP IS TEN TIMES SAFER THAN A DUMB HUMAN IN A HONDA" and such.
 
Where would someone who isn't Tesla get such data from?

Nobody else would have info like the state of AP, or active safety features, during all crashes for example.
Yet that doesn't stop people from claiming Tesla essentially made up those numbers, with no evidence to suggest that.

Even a ballpark estimate, like gathering all the publicly reported accidents so far where AP was said to have been active (regardless of whether that was true) and dividing the miles travelled on AP (which Tesla has reported in the past) by that count, would at least begin to make the argument.
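For the curious, a back-of-envelope version of that calculation would look like the sketch below. Every number in it is a made-up placeholder (not real Tesla, news, or NHTSA data); the point is only to show the shape of the miles-per-accident comparison someone would have to make.

```python
# Ballpark miles-per-accident comparison. All inputs are hypothetical
# placeholders, NOT real Tesla or NHTSA figures.

def miles_per_accident(total_miles: float, accidents: int) -> float:
    """Average miles driven per reported accident."""
    return total_miles / accidents

ap_reported_accidents = 50          # hypothetical count from news reports
ap_miles = 3_000_000_000            # hypothetical cumulative AP miles
baseline_miles_per_accident = 500_000  # hypothetical all-driving baseline

ap_rate = miles_per_accident(ap_miles, ap_reported_accidents)
print(f"AP:       one accident per {ap_rate:,.0f} miles")
print(f"Baseline: one accident per {baseline_miles_per_accident:,} miles")
print(f"Ratio:    {ap_rate / baseline_miles_per_accident:.1f}x")
```

Of course, the two rates still wouldn't be directly comparable (highway-only AP miles vs. all driving, under-reported incidents, etc.), which is exactly the caveat raised above, but it would at least put a number on the table.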

I think Tesla's data is accurate for the narrowest possible definition of accurate, but there have been a lot of valid reasons given for why it's not nearly as informative as Tesla implies it is, and why you can't validly draw broad conclusions like "DRIVING ON AP IS TEN TIMES SAFER THAN A DUMB HUMAN IN A HONDA" and such.
Yes, it's not nearly that informative, especially with no breakdown of city vs. highway roads (although Tesla might not have that data in the first place), but it's still a useful stat, and far better than the info we have on many other L2 systems (which don't report any similar stats).

It might not be any safer than regular driving, but it does suggest it's not horribly unsafe, as the naysayers claim.
 
People who naysay Tesla point out that the company uses that data to brag about something that isn't reality.
It's like Apple comparing their flagship phone to some $150 junk phone made in 2015 and patting themselves on the back.
They don't. They compare to Samsung. Similar product, similar performance, similar price, similar customer.
Nope, what the naysayers are saying is more like claiming Apple is worse than a $150 junk phone made in 2015. Have you seen the Jalopnik articles? People there claim that having Autopilot is far less safe than driving without it. It's not even the claim that it's roughly the same or no different; it's that it's far worse. Of course, they have no evidence to support that other than pointing to anecdotal incidents. When you point out the Tesla stats, they say they're BS or likely made up by Tesla.