
Open Letter to Elon Musk

That's not how Tesla has defined it every time it's talked about "shadow driving".

See these:
Tesla’s new Autopilot will run in ‘shadow mode’ to prove that it’s safer than human driving
Tesla starts pushing new ‘Enhanced Autopilot’ to entire fleet in shadow mode only
Tesla adds ‘calibration period’ for vehicles with new ‘Autopilot 2.5 hardware’

In that last one, we have a Tesla spokesperson describing shadow mode specifically: "During that process, Automatic Emergency Braking will temporarily be inactive and will instead be in shadow mode, which means it will register how the feature would perform if it were activated, without taking any action"

"Shadow mode", at least as it applies to Tesla, is exactly what you're saying you want in terms of real-world data.

I get the legalese, which is how Tesla believes it gets legal protection while wanting and nudging its drivers to let go of the wheel. Are there no videos where Elon lets go of the wheel? Are there no modes where you can, or are supposed to be able to, let go of the wheel? Please, if Tesla really wanted that L2+/L3 behavior not to happen, AP would never allow it for more than a second.
 
But if that is genuinely true, then you cannot advocate a form of armchair engineering that has already decided on the limits and approaches AV research and development should take.

There is a quote which is usually (mis)attributed to Einstein:

"We cannot solve our problems with the same level of thinking that created them"

I know exactly what it will take. Other than LiDAR, there is little going on here that aerospace has not already done, and done 10 to 20 years ago. Folks in commercial IT think everything they do is new. The fact is most of it is not. And the reason aerospace and DoD didn't focus on LiDAR before is that they know it has weaknesses that 3D radar doesn't have.
 
I think you are not clear on Tesla's definition of 'shadow driving'.

Does your opinion change IF you were to find that Tesla collects data from users driving on roads, whether or not AP is enabled? If Tesla is just collecting data from someone driving without AP engaged, do you have an issue with that?

No. And the reason is that this is not all it is limited to. In spite of the legalese, Tesla wants and allows folks to let go of the wheel. It wants to be protected legally and shift all the risk to its overly trusting customers, so they will sacrifice themselves and their families to be guinea pigs. It's not only despicable, it's also almost totally unnecessary. If Tesla didn't want that, Elon wouldn't make videos with his hands off the wheel and the ability wouldn't exist in the car.
 
I get the legalese, which is how Tesla believes it gets legal protection while wanting and nudging its drivers to let go of the wheel. Are there no videos where Elon lets go of the wheel? Are there no modes where you can, or are supposed to be able to, let go of the wheel? Please, if Tesla really wanted that L2+/L3 behavior not to happen, AP would never allow it for more than a second.

That’s not “legalese”; it’s what that term means. So what you’re arguing against isn’t shadow mode at all; it’s allowing any form of AP until it’s capable of L4+ autonomy. You’re welcome to that opinion, though I’ll go ahead and say I’m glad Tesla has chosen not to go that route.
 
buttershrimp said:
This open letter is what happens when Marques Brownlee and Mark David Chapman simultaneously pee into a fountain as lightning strikes.


Yikes, you've been thinking about that joke for almost a month? :eek: It's a joke!
 
Here is a report I actually just became aware of that combines many other reports on the dangers of handover or public shadow driving.
Handover issues in autonomous driving: A literature review - VENTURER

Combine that with RAND and Toyota stating it would take one trillion miles (not Elon's 6B) to get to L4, and you can see Tesla's current process is untenable for schedule, cost and safety reasons.
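For context, here is a back-of-the-envelope version of the statistical argument behind such mileage figures, in the spirit of RAND's "Driving to Safety" analysis. This is my own illustrative sketch; the rate and confidence values are assumptions, not numbers from the thread:

```python
import math

def miles_to_demonstrate(rate_per_mile: float, confidence: float = 0.95) -> float:
    """Failure-free miles needed to bound the true failure rate below
    rate_per_mile at the given confidence (Poisson zero-event bound)."""
    return -math.log(1.0 - confidence) / rate_per_mile

human_fatality_rate = 1.09 / 100_000_000   # roughly the US fatality rate per mile
print(f"{miles_to_demonstrate(human_fatality_rate):,.0f} miles")
# ~275 million failure-free miles just to match the human fatality rate at 95%
# confidence; demonstrating a meaningful improvement pushes this into the billions.
```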

A final confirmation is Waymo's Safety Report. If you follow Waymo, you will know they have evolved over the past year to determine that handover/public shadow driving is not the way to go. The path they chose is exactly what I have stated: replace most of the public shadow driving with (aerospace-level) simulation.
Waymo Safety Report - Confirms what I have been saying about Public Shadow Driving etc
https://www.linkedin.com/pulse/waymo-safety-report-confirms-what-i-have-been-saying-michael-dekort/



Do yourselves and Tesla a favor. Ignore me and read the data with an open and objective mind. The data here, combined with common sense and the fact that full autonomy was promised this year while Tesla's AP is nowhere close, leads to the overwhelming conclusion that Tesla needs to change its approach.
 
No one wanted to respond to my last post and the study or data on Waymo's new path?

Now we have Steve Wozniak - Tesla’s Autopilot is under attack by Apple co-founder Steve Wozniak, billionaire shorter, and media

From Woz - “Tesla has in people’s mind that they have cars that will just drive themselves totally and it is so far from the truth so they have deceived us. […] Driving my Tesla, over and over and over there are unusual situations on any road anywhere and every single human being alive — dumb or smart — would be able to get through it and the Tesla can’t,”

From Einhorn - “Some of TSLA’s presumed market lead in areas like autonomous driving may more likely reflect TSLA’s willingness to put inadequately tested and dangerous products on the road rather than a true technological advantage,”

Seriously - the Tesla army should get together and ask Elon to look at the things I have been saying. It's in everyone's best interest.
 
Here is a report I actually just became aware of that combines many other reports on the dangers of handover or public shadow driving.
Handover issues in autonomous driving: A literature review - VENTURER

Combine that with RAND and Toyota stating it would take one trillion miles (not Elon's 6B) to get to L4, and you can see Tesla's current process is untenable for schedule, cost and safety reasons.

A final confirmation is Waymo's Safety Report. If you follow Waymo, you will know they have evolved over the past year to determine that handover/public shadow driving is not the way to go. The path they chose is exactly what I have stated: replace most of the public shadow driving with (aerospace-level) simulation.
Waymo Safety Report - Confirms what I have been saying about Public Shadow Driving etc
https://www.linkedin.com/pulse/waymo-safety-report-confirms-what-i-have-been-saying-michael-dekort/



Do yourselves and Tesla a favor. Ignore me and read the data with an open and objective mind. The data here, combined with common sense and the fact that full autonomy was promised this year while Tesla's AP is nowhere close, leads to the overwhelming conclusion that Tesla needs to change its approach.
Handover and public shadow driving are two absolutely different things. Not sure why you are mixing those two together. Do you even know what those two things mean? If you do, your argument is very clearly drawing a false equivalence between the two.

No one wanted to respond to my last post and the study or data on Waymo's new path?
PS: a lot of people have you on ignore, most likely because in previous interactions you have failed to demonstrate you have enough familiarity with Tesla's system or with semi-autonomous vehicle technology in general to make a credible argument.
 
Kill thousands? Your hyperbole is funny.

No, it is not hyperbole.

How do you think the public shadow driving path gets from here to L4? In order to learn complex and dangerous scenarios, of which there are thousands, each has to be driven thousands of times to train and verify the AI. Those scenarios have to be run in very bad weather, which includes bad road conditions. Many of those scenarios are actual crash scenarios.
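To put rough numbers on that (every figure below is my own illustrative assumption, not data from any study), the combinatorics look something like this:

```python
# Purely illustrative assumptions about scenario coverage:
scenarios = 5_000        # assumed number of complex/dangerous scenario types
repetitions = 1_000      # assumed runs per scenario to train and verify the AI
conditions = 10          # assumed weather/road-condition variants per scenario
encounter_rate = 1e-4    # assumed chance a random public-road mile contains a given scenario

runs_needed = scenarios * repetitions * conditions
miles_needed = runs_needed / encounter_rate
print(f"{runs_needed:,} scenario runs -> ~{miles_needed:,.0f} public-road miles")
# With these assumptions: 50,000,000 runs -> ~500,000,000,000 miles, which is
# why simulation, where scenarios can be scheduled rather than waited for, scales better.
```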

If you do not think I am correct about this, please explain why.
 
Handover and public shadow driving are two absolutely different things. Not sure why you are mixing those two together. Do you even know what those two things mean? If you do, your argument is very clearly drawing a false equivalence between the two.


PS: a lot of people have you on ignore, most likely because in previous interactions you have failed to demonstrate you have enough familiarity with Tesla's system or with semi-autonomous vehicle technology in general to make a credible argument.

You are incorrect that public shadow driving and handover are not the same from the standpoint of situational awareness, which is the point. (I assume you did not read any of the references I suggested?) Most studies and real-world data show that regaining control and effecting the right actions after being distracted is extremely difficult. That situation exists in both cases. It is worse in public shadow driving for AI because in those cases, unlike L2+/L3, the system is not finished with design and engineering yet.

Please explain exactly where all those studies, NASA, Waymo and Chris Urmson are wrong in this area: both that handover cannot be made reliably and consistently safe, and that using it to train AI on public roads is extremely counterproductive, both because you cannot drive the one trillion miles and because of the casualties it will cause when you move into dangerous and complex scenarios.
 
Tesla running behind Musk’s driverless promises

How much writing on the wall do you need? We now need another wall.

The technology can be successful. But you need to do it another way.


I have not delved into the sensor area, but given Tesla's stance here I will. NONE of the sensor systems ANY vendor has is remotely close to L4, especially in extremely bad conditions. I think it is possible visual and light-based systems may never get there. 3D radar may have to be the answer. That, however, does not deal with colors or very fine details like letters on a sign or road markings.

The other issue is actual fusion in all conditions, and redundancy. You need a system that knows the best possible answer at all times and can double- if not triple-verify it. The military uses things like Kalman filters, which use probability and priority algorithms to determine the best answer at all times. Remember, many aircraft fly with pilots using instrumentation only.

Setting aside cost and size, which may be show-stoppers on their own, the system should be designed backward from the worst-case scenarios, with double if not triple redundancy. I do not see how Tesla can rule out LiDAR at this point, especially if it is also not going to use detailed mapping. And the use of false-positive radar data from previous drivers seems like a bad idea; it leaves all those drivers who drive through those locations before the errors are found screwed. Tesla's composite sensor design has too many avoidable flaws.
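For readers unfamiliar with the idea, here is a minimal single-variable sketch of that kind of fusion (a toy example of mine, not any vendor's implementation; all numbers are made up):

```python
def kalman_update(x: float, p: float, z: float, r: float) -> tuple[float, float]:
    """One scalar Kalman update: fuse a measurement z (variance r)
    into the current estimate x (variance p)."""
    k = p / (p + r)        # Kalman gain: how much to trust the new sensor
    x = x + k * (z - x)    # pull the estimate toward the measurement
    p = (1.0 - k) * p      # fused variance is smaller than either input's
    return x, p

# Hypothetical numbers: fuse radar and camera range estimates of one obstacle.
x, p = 50.0, 25.0                            # prior: obstacle ~50 m ahead, variance 25
x, p = kalman_update(x, p, z=47.0, r=4.0)    # radar: good range accuracy
x, p = kalman_update(x, p, z=52.0, r=16.0)   # camera: noisier range estimate
print(f"fused range: {x:.1f} m, variance: {p:.2f}")   # estimate trusts radar more
```

The point is that each sensor gets weighted by its claimed uncertainty, which is also why a fusion system needs honest error models for every sensor in every condition.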
 
  • "I'm looking for a lot of men who have an infinite capacity to not know what can't be done."—Henry Ford
  • Those who say it can't be done are usually interrupted by others doing it. — James A. Baldwin
Tesla's capacity to bring out all the naysayers and vigilantes never surprises me.

On another note I got a 2017 Subaru with lane keeping as a loaner car last week. That system is a joke.
 
You are incorrect that public shadow driving and handover are not the same from the standpoint of situational awareness, which is the point. (I assume you did not read any of the references I suggested?) Most studies and real-world data show that regaining control and effecting the right actions after being distracted is extremely difficult. That situation exists in both cases. It is worse in public shadow driving for AI because in those cases, unlike L2+/L3, the system is not finished with design and engineering yet.

Please explain exactly where all those studies, NASA, Waymo and Chris Urmson are wrong in this area: both that handover cannot be made reliably and consistently safe, and that using it to train AI on public roads is extremely counterproductive, both because you cannot drive the one trillion miles and because of the casualties it will cause when you move into dangerous and complex scenarios.
I don't even need to click your source to know that you are clueless about the definitions. However, I will try to be informative.

Handover (from your own source):
"‘Handover’ typically refers to the staged period during which the AV transfers all controls to the driver (e.g., an alert to indicate imminent takeover required, a delay before manual controls are completely handed over), so that the vehicle can be driven manually, whereas ‘takeover’ tends to refer to the specific time when the driver has regained manual control of the vehicle and automated systems have been deactivated."
http://eprints.uwe.ac.uk/29167/1/Venturer_WP5.2Lit ReviewHandover.pdf

Public shadow driving (emphasized part bolded):
"New Tesla Model S and Model X automobiles will run Autopilot in “shadow mode” and collect driving data that pits a human versus computer. Autopilot vehicles running in shadow mode will not take any driving-assist or self-driving actions. Rather they will only log instances when Autopilot would have taken action and compare those results to the real life actions taken by human drivers."
Tesla pits human vs. computer while cars operate in 'Shadow Mode'

How is that anywhere near the same? There is absolutely no handover in shadow mode. The driver is driving the car like any regular car with no semi-autonomous capability, and all shadow mode does is compare what the driver does with what the system would have done.
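To make the distinction concrete, here is a minimal sketch of what shadow-mode logging amounts to (hypothetical field names and thresholds of my own; not Tesla's actual code):

```python
from dataclasses import dataclass

@dataclass
class Frame:
    human_steer: float   # steering input the driver actually applied
    model_steer: float   # steering input the system *would* have applied
    human_brake: float   # braking the driver actually applied
    model_brake: float   # braking the system *would* have applied

def shadow_compare(frame: Frame, disagreements: list,
                   steer_tol: float = 0.1, brake_tol: float = 0.2) -> None:
    """Log frames where the system disagrees with the driver.
    Note: nothing here ever sends a command to the actuators."""
    if (abs(frame.model_steer - frame.human_steer) > steer_tol
            or abs(frame.model_brake - frame.human_brake) > brake_tol):
        disagreements.append(frame)   # stored for offline analysis only

# The driver drives as normal; the log just accumulates comparisons.
log: list = []
shadow_compare(Frame(human_steer=0.0, model_steer=0.3,
                     human_brake=0.0, model_brake=0.0), log)
print(len(log))   # 1 disagreement recorded; the car itself did nothing
```

Since there is no control output, there is also no control to hand back, which is the whole contrast with handover.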
 
@stopcrazypp I always try to look at what people really say, instead of what I think they say...

@imispgh's point is lost in your fight about the terminology. He already asked you to look beyond it, but apparently you couldn't. :)

The articles are actually very clear, and what @imispgh relays at least makes complete sense once you do.

https://www.linkedin.com/pulse/waymo-safety-report-confirms-what-i-have-been-saying-michael-dekort/
5 things we learned from Waymo’s big self-driving car report
https://storage.googleapis.com/sdc-prod/v1/safety-report/waymo-safety-report-2017-10.pdf

The arguments are: public shadow driving is not sufficient to teach AI, and a self-driving car that relies on handover is not safe enough (i.e. Level 3 is bad). Now, those are fair points to be argued IMO, but I am just trying to help see the points.

Waymo makes both of these points for two reasons: They believe handovers are dangerous and only full autonomy is the answer. They also believe in simulation training and closed-course training first, instead of public shadow driving, because the latter can't possibly cover all the scenarios they need to cover (when @imispgh mentioned crashes, that's what he meant).

Now, none of this is black and white, of course. We know Tesla is hiring simulation people, and Waymo drives its cars in public too to validate them. But the companies have different emphases. Interesting to see how it plays out.
 
Tesla's capacity to bring out all the naysayers and vigilantes never surprises me.

On another note I got a 2017 Subaru with lane keeping as a loaner car last week. That system is a joke.

We here are all irrelevant when it comes to self-driving cars (well, excluding possibly the one or two professionals in the area).

However, it should be noted that Waymo has a very different approach to self-driving than Tesla. There seems to be a legitimate disagreement within the industry, in its innovative Silicon Valley hotbed no less (not just in the more traditional global industry), about what can be done, what should be done and how.

To simply chalk concerns about Tesla's approach up to naysayers being naysayers is not giving Waymo enough credit IMO.

The reality is, Tesla and Comma.AI seem to be on a different page than much of the rest of the industry, believing that fleet-learned visual neural nets are the answer. They both have high-profile spokespersons outlining a very bold, almost easy-sounding vision of autonomous driving based on not much more than a bunch of cameras on tons of cars on the road and a feedback loop within a powerful computer...

The rest of the industry is taking a very different approach, from triple/quadruple-redundant hardware fusion to the way they approach teaching and implementing the system. But even that rest of the industry has different approaches, regarding, say, doing handovers or not doing handovers...

We shall see who is right. But history remembers only those who got it right (unless they failed too spectacularly). Those who failed may well also have ridiculed their naysayers, but history just forgot about it... ;)
 
@stopcrazypp I always try to look at what people really say, instead of what I think they say...

@imispgh's point is lost in your fight about the terminology. He already asked you to look beyond it, but apparently you couldn't. :)

The articles are actually very clear, and what @imispgh relays at least makes complete sense once you do.

https://www.linkedin.com/pulse/waymo-safety-report-confirms-what-i-have-been-saying-michael-dekort/
5 things we learned from Waymo’s big self-driving car report
https://storage.googleapis.com/sdc-prod/v1/safety-report/waymo-safety-report-2017-10.pdf

The arguments are: public shadow driving is not sufficient to teach AI, and a self-driving car that relies on handover is not safe enough (i.e. Level 3 is bad). Now, those are fair points to be argued IMO, but I am just trying to help see the points.

Waymo makes both of these points for two reasons: They believe handovers are dangerous and only full autonomy is the answer. They also believe in simulation training and closed-course training first, instead of public shadow driving, because the latter can't possibly cover all the scenarios they need to cover (when @imispgh mentioned crashes, that's what he meant).

Now, none of this is black and white, of course. We know Tesla is hiring simulation people, and Waymo drives its cars in public too to validate them. But the companies have different emphases. Interesting to see how it plays out.
Again, those are two completely different points, which he tries to conflate:
"Here is a report I actually just became aware of that combines many other reports on the dangers of handover or public shadow driving."
"You are incorrect about public shadow driving and handover not being the same from the standpoint of situational awareness"

He is drawing a false equivalence between the two. The report he linked from Venturer had absolutely nothing to do with shadow driving, even though he claimed it did.

You can argue handover is dangerous (although every Level 2+ system does it), but public shadow driving has nothing to do with it, because it does not involve handover at all. Also, Tesla doing public shadow driving does not mean they don't also do simulation. We discussed Tesla's simulation-related hirings elsewhere already:
Autopilot simulation!

And the reason I am not forgiving about wrong usage of terminology is that he had previously appealed to authority and claimed he was an expert. It's forgivable for a general member to be wrong on terminology, but for an expert, I'm going to take issue with it.
 