Tesla On Autopilot Slams Into Stalled Car

Lance Eliot (the author of that Forbes article) has an agenda. Ever since the Autonomy Investor Day where Musk doubled down on pooh-poohing LIDAR, this guy's become extra vocal against Tesla. I wonder why. Perhaps he has a vested interest in seeing LIDAR succeed? He likes to flaunt his credentials to give himself more authority. I didn't dig too far, but he's definitely involved with self-driving tech.

Every writer critical of Tesla 100% always has an agenda, don't they? If it wasn't for all these agenda-driven writers, it would be nothing but 100% I love Tesla articles all day!
 
Man, I really wish these “journalists” would take a step back and look at context. How many cars without AP rear-end others? How many slide off the road because of poor, overconfident drivers? How many end up in the ditch after falling asleep? Just yesterday I drove by a flaming car on the side of the road, and it wasn’t electric.

Yes, AP has room for improvement, but it’s infinitely better than doing nothing, as so many automakers are doing. If you gave me a choice between a road with 100% Tesla tech or 100% human drivers that “prefer to feel safe” in an Expedition, Sequoia, etc., I’ll take the former in a heartbeat. Now if the argument is that it must have Lidar to be safe, that’s along the same lines as saying don’t implement ABS until all discs are ceramic.
 
I took a different read on this than the majority.

The author is a specialist in the field of AI, and I thought his discussion on decision-making for both the human driver and the computer was insightful. This paragraph stuck out to me:

Plus, for human drivers, it is difficult to continually keep a mindset that you are presumably the captain of the ship, retaining ultimate responsibility, which is somewhat mentally undermined when you know that you have your second-in-command running things for you, the Autopilot, and then all of sudden, bam, turns out that you were supposed to be the one handling the controls (a Catch-22, as it were).

I've noticed the exact same thing. When I use Autopilot I find it challenging to remain focused on the road because of the underlying thought that the car is going to act appropriately. I ran into a piece of road debris on AP when the car was two days old because I was not as ready to make an evasive maneuver as I would have been otherwise. This is something I think about every time I get in the car, and a constant reminder to remain vigilant when using AP.

I don't think the author intended this as a hit piece against Tesla at all. It reads to me like informed commentary on the risks of this type of technology, with some healthy criticism of Tesla's system. I don't know why people seem to get offended at any criticism of Tesla and their way of doing things… How is anybody supposed to get better without feedback?
 
And here's a bio on the author; hardly a hack freelance journalist writing a hit piece.

"I am Dr. Lance B. Eliot, a world-renowned expert on Artificial Intelligence (AI) and Machine Learning (ML). As a seasoned executive and high-tech entrepreneur, I combine practical industry experience with deep academic research to provide innovative insights about the present and future of AI and ML technologies and applications. Formerly a professor at USC and UCLA, and head of a pioneering AI lab, I frequently speak at major AI industry events. Author of over 30 books, 300 articles, and 200 podcasts, I have made appearances on media outlets such as CNN and co-hosted the popular radio show Technotrends. My particular specialty in AI is Autonomous Vehicles and advances in self-driving driverless cars. I’ve been an adviser to Congress and other legislative bodies and have received numerous awards/honors. I serve on several boards, have worked in VC/PE, am an angel investor, and a mentor to founder entrepreneurs and startups. "
 
LIDAR would have prevented the accident, or at least allowed the car to know there was a stationary object in the way and to take evasive measures. You won't find reports of accidents like this with Waymo's fleet. But you also won't see LIDAR being used in a Tesla if it bumps up the price tag by $20,000, either.

That said, this accident is not the fault of Autopilot. AP is just a driver assist. The driver needed to be watching and take over control as soon as the stationary vehicle was seen. AP would allow the driver to look around and be more mindful of the vehicles around him than if he had to focus strictly on steering. If you're taking a nap, watching a movie, etc., then there is no one to blame but yourself if this happens.

I totally disagree. LIDAR would NOT have prevented the accident.

Tesla's radar can see stationary objects.

Mine sees stationary objects EVERY DAY.
 
His conclusion:

I have repeatedly forewarned that as we encounter the emergence of Level 2 with ADAS and Level 3 semi-autonomous cars coming into the marketplace, there will be a lot more of these kinds of incidents involving a co-shared human-machine driving effort that inevitably falters or fails to take what might have been suitable action to avoid or reduce a car crash.


The facts:

In the 1st quarter, we registered one accident for every 2.87 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot, we registered one accident for every 1.76 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 436,000 miles.


Tesla releases new Autopilot safety report: more crashes but still fewer than when humans drive - Electrek


Its facts are wrong, that’s what makes it a hit piece.
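
For scale, here are the quoted figures run through a quick back-of-the-envelope script (nothing assumed beyond the numbers quoted above):

```python
# Miles per accident, straight from the figures quoted above.
miles_per_accident = {
    "Autopilot engaged": 2_870_000,
    "Tesla, no Autopilot": 1_760_000,
    "US average (NHTSA)": 436_000,
}

baseline = miles_per_accident["US average (NHTSA)"]
for label, miles in miles_per_accident.items():
    print(f"{label}: one accident per {miles:,} miles "
          f"({miles / baseline:.1f}x the US average)")
# Autopilot engaged: one accident per 2,870,000 miles (6.6x the US average)
# Tesla, no Autopilot: one accident per 1,760,000 miles (4.0x the US average)
# US average (NHTSA): one accident per 436,000 miles (1.0x the US average)
```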
 
Yes, all that is true, Tom, but he should have prefaced the article with the context that autopilot ultimately results in fewer accidents. Because his conclusion “there are going to be more accidents as a result of autopilot” is not supported by the evidence.

I suppose. But from the perspective of somebody with expert knowledge in the field, the presumption that autonomous technologies are safer is almost assumed. Heck, anybody with a basic kindergarten-level understanding of computers should understand that.

The author made sure to mention this, and even broke it out into its own paragraph:

On another notable facet of the incident, the human driver says that there was insufficient time for him to react.

And the linked article to the description of the accident shows the driver praising Tesla for "saving his life", and placing no blame on the car whatsoever.
 
Of course it does. The problem is knowing whether said "stationary object" is part of the scene you want to ignore, or something you're going to run into, which means you should not ignore it.

That has nothing to do with Lidar, if you can reconstruct a 3D scene without it, but with the algorithms behind it, which also have to avoid phantom-braking every time there's a tree at the side of the road that could jump onto the road (don't laugh: recognizing a tree for what it is is not trivial, and I'm sure the MCU2.5's processing power is not up to it) or a cloud shadow across the road.
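
Here's a toy sketch of that trade-off (this is not Tesla's code; every name and threshold below is invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Detection:
    range_m: float      # distance ahead of the car, meters
    lateral_m: float    # offset from the predicted path centerline, meters
    confidence: float   # perception stack's belief this is a real, solid obstacle

# Toy in-path filter: a stationary detection triggers braking only if it
# sits inside the lane corridor AND the confidence clears a threshold.
# Tune the threshold down and you brake for shadows and overhead signs
# (phantom braking); tune it up and you start ignoring real stalled cars.
LANE_HALF_WIDTH_M = 1.8
CONFIDENCE_THRESHOLD = 0.9

def should_brake_for(detection: Detection) -> bool:
    in_corridor = abs(detection.lateral_m) < LANE_HALF_WIDTH_M
    return in_corridor and detection.confidence >= CONFIDENCE_THRESHOLD

# A stalled car dead ahead that the classifier is only 85% sure about gets ignored:
print(should_brake_for(Detection(range_m=60.0, lateral_m=0.2, confidence=0.85)))   # False
# A roadside tree is ignored because it sits outside the corridor:
print(should_brake_for(Detection(range_m=40.0, lateral_m=3.5, confidence=0.99)))   # False
```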

Given his credentials, the author knows this. Why he chooses to write such a poor article is a mystery to me.
 
His conclusion:

I have repeatedly forewarned that as we encounter the emergence of Level 2 with ADAS and Level 3 semi-autonomous cars coming into the marketplace, there will be a lot more of these kinds of incidents involving a co-shared human-machine driving effort that inevitably falters or fails to take what might have been suitable action to avoid or reduce a car crash.

Its facts are wrong, that’s what makes it a hit piece.

Of course there are going to be more of them! Accidents are still going to happen, which means there will be more. Unless Teslas stop getting into accidents altogether, that sentence is factually accurate. And because autonomous technologies are going to become exponentially more common on the roads, it is a very safe assumption that there will be a lot more of them.

That doesn't mean there'll be more accidents than had those same miles been driven manually, but there will certainly be more. A lot more.
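
Toy numbers make the point (the per-mile rates come from the report quoted earlier; the mileage totals are hypothetical):

```python
# A lower accident *rate* still produces more accidents in absolute terms
# as total Autopilot miles grow. Mileage totals below are hypothetical.
human_rate = 1 / 436_000        # accidents per mile (NHTSA figure quoted earlier)
autopilot_rate = 1 / 2_870_000  # accidents per mile (Tesla Q1 figure quoted earlier)

for autopilot_miles in (1e8, 1e9, 1e10):
    print(f"{autopilot_miles:.0e} AP miles -> "
          f"{autopilot_miles * autopilot_rate:,.0f} accidents "
          f"(vs {autopilot_miles * human_rate:,.0f} for the same miles driven manually)")
```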
 
I’m only going to speak for myself here, but I have found that the longer I let the car drive on Autopilot the harder it is for me to keep myself focused on supervising it and being ready to intervene if something goes wrong. I consider myself to be a good driver. I’ve been driving over 35 years and have never hit anyone. So I’m more comfortable driving the car myself than letting Autopilot drive for me.

But there are many bad drivers on the road. So overall I would agree with the general sentiment in this forum that AP is safer than the “average” human driver. And if you are a “below average” driver, by all means keep AP on all the time.

Humans have more accidents than autopilot.

https://electrek.co/2016/04/24/tesla-autopilot-probability-accident/
 
I suppose. But from the perspective of somebody with expert knowledge in the field, the presumption that autonomous technologies are safer is almost assumed. Heck, anybody with a basic kindergarten-level understanding of computers should understand that.

Yes, but his audience isn’t someone with expert knowledge in the field; it’s plebeians whose basic instinct is to fear new things.
 
Of course there are going to be more of them! .... And because autonomous technologies are going to become exponentially more common on the roads, it is a very safe assumption that there will be a lot more of them.

That doesn't mean there'll be more accidents than had those same miles been driven manually, but there will certainly be more. A lot more.

I agree with what you are saying, but that’s not the conclusion of his article. That’s not how the article is framed.
 
I'm not sure what's going on with your car then....

For example: The only way to detect construction is for the car to recognize stationary construction cones and such.

Detecting a pattern of bright orange would be a way to detect construction without needing to recognize stationary objects.
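
Something like this, say (a minimal sketch of the color-threshold idea; the hue band and area cutoff are my guesses, not values from any real production stack):

```python
import cv2
import numpy as np

def looks_like_construction(bgr_frame: np.ndarray, min_fraction: float = 0.01) -> bool:
    """Flag a frame as a construction zone if enough of it is cone/barrel orange."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    # Bright orange sits roughly around hue 5-20 on OpenCV's 0-179 hue scale;
    # requiring high saturation and value keeps dull rust and wood tones out.
    mask = cv2.inRange(hsv, (5, 150, 150), (20, 255, 255))
    return cv2.countNonZero(mask) / mask.size > min_fraction
```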

I’ve noticed on several occasions that as I approach a stop light and pass stopped cars, those cars do not register on the visualization.
 
Musk is pooh-poohing lidar because it’s expensive, not because it doesn’t work... He has promised FSD to a lot of people with cars already on the road, and free retrofits to add lidar are probably not an option....

Until he proves he doesn’t need lidar for FSD (including in situations like these), I think pointing out the lack of lidar is legitimate.

We all remember the tragic accident when a car with lidar (was it Uber?) killed a pedestrian crossing the road.
Systems are not perfect yet, and until then drivers should not be relaxed. I agree with the author that Autopilot gives a false sense of safety.
 
Detecting a pattern of bright orange would be a way to detect construction without needing to recognize stationary objects.

I’ve noticed on several occasions that as I approach a stop light and pass stopped cars, those cars do not register on the visualization.

The question is about recognizing stationary objects.

My car does it just fine.

My car recognizes stopped cars in front of it just fine at intersections.

My car recognizes the back wall in my garage just fine when it parks itself.

My car recognizes me in front of it during summon.

I'm not sure what people are saying about it not recognizing stationary objects.

Does anyone want to amend their statement about Teslas not recognizing stationary objects?