
Tesla On Autopilot Slams Into Stalled Car

...Does Tesla assign greater weight to vision data or radar data? Maybe equal weights? Perhaps the weights are variable depending on the current driving conditions?

I generally agree with you. While I certainly don't know the ins and outs of how Tesla's magic works, multi-layer neural networks are nowhere near as simple as you've suggested. There aren't many public specs on their neural network at all. Training them results in multitudes of weightings, feeding through layers of convolution and deconvolution, softmax normalizations, feedback loops, and probably some voodoo too. Have a look at the diagrams in this post, and I think you'll see what I mean: https://electrek.co/2019/01/04/tesla-leaks-hardware-3-self-driving-computer/
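To make that concrete, here's a toy sketch in Python/NumPy (not anything from Tesla's actual stack; the layer sizes and classes are invented). Even in this miniature fused network, the vision-vs-radar "weighting" is spread across thousands of learned parameters rather than sitting in a single knob:

Code:
# Toy sketch only -- NOT Tesla's architecture. Shows that "weighting"
# vision vs. radar means many learned parameters, not one tunable knob.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature vectors from separate vision and radar encoders
vision_feat = rng.normal(size=128)   # e.g. pooled CNN features
radar_feat = rng.normal(size=32)     # e.g. processed radar returns

# Fusion layer: every one of these 64 x 160 weights is learned from
# data, so "how much does radar count?" has no single answer.
W_fuse = rng.normal(size=(64, 128 + 32)) * 0.05
fused = np.tanh(W_fuse @ np.concatenate([vision_feat, radar_feat]))

# Output head with a softmax over hypothetical object classes
W_out = rng.normal(size=(4, 64)) * 0.05
logits = W_out @ fused
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs)  # class probabilities; the "weights" live everywhere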
 
  • Like
Reactions: StealthP3D
LIDAR would have prevented the accident, or at least allowed the car to know there was a stationary object in the way and to take evasive measures. You won't find reports of accidents like this with Waymo's fleet. But you also won't see LIDAR in a Tesla if it bumps up the price tag by $20,000.

That said, this accident is not the fault of Autopilot. AP is just a driver assist. The driver needed to be watching and to take over control as soon as the stationary vehicle was visible. AP lets the driver look around and be more mindful of the vehicles around him than if he had to focus strictly on steering. If you're taking a nap, watching a movie, etc., then there is no one to blame but yourself if this happens.
I doubt LIDAR would have helped here, given that the blocking vehicle prevents detection until the last second.
Yes, because the traffic in front of me is slowing down and not stopped completely.

I’ve never encountered a situation where there is a stationary object directly in front of me while traveling at highway speeds. I hope I never encounter this.
I have, at a stop light where my vehicle never saw the vehicle ahead moving.
 
His conclusion:

[embedded screenshot]

The facts:

[embedded screenshot]

Tesla releases new Autopilot safety report: more crashes but still fewer than when humans drive - Electrek


Its facts are wrong; that's what makes it a hit piece.
You do realize that those stats are incredibly biased, right?

They're biased for the following reasons:
1. There are significantly fewer Teslas on the road compared to other cars
2. Those buying a Tesla are a very specific demographic that is statistically more likely to be good or safe drivers
3. The sample size is still too small

You should take these stats for what they are: an early predictive measure of what could be, but by no means accurate or definitive
 
The article described the scenario from the Tesla owner's manual, which also happened to me in real life.

The black car in front of me saw stopped cars ahead, so it moved to the right.

I was driving with Autopilot version 8.1 at 61 MPH and didn't immediately understand why the black car in front changed to the right lane.

I knew that the black car had changed lanes, but I didn't realize I was in danger, driving at 61 MPH toward stopped traffic ahead.

I was still thinking!

I only realized the danger when the Collision Warning alarm went off, and I told myself "there's no time to think any more!"

I applied the brakes thanks to the alarm.

I could have applied the brakes on my own without the alarm, but it might have been a tad late because I was still thinking "strange, why did that car change lanes?".

So although Autopilot didn't stop in this situation, I still give it credit for alerting me to take over!

 
  • Like
Reactions: Sherlo
You do realize that those stats are incredibly biased, right?

They're biased for the following reasons:
1. There are significantly fewer Teslas on the road compared to other cars
2. Those buying a Tesla are a very specific demographic that is statistically more likely to be good or safe drivers
3. The sample size is still too small

You should take these stats for what they are: an early predictive measure of what could be, but by no means accurate or definitive


Negatory.

The fact that there are fewer Teslas on the road is irrelevant. The statistics are given per million miles driven. A mile driven in a Tesla is the same as a mile driven in an ICE.

How can you support point two? Where is your evidence for that? Even if we assume it to be true, the statistics also compare Tesla drivers without Autopilot to Tesla drivers with Autopilot, and Tesla drivers with Autopilot go twice as far without getting into an accident as Tesla drivers without Autopilot.

You can't prove point three either. Most studies limit their sample size to somewhere between 100 and 2000 samples (due to cost)... suffice it to say there are more than 2000 Teslas on the road that have driven sufficient miles to develop a sample size that provides a relatively high confidence level on the safety of the vehicle's Autopilot system.
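For what it's worth, here's the back-of-the-envelope version (Python, with invented counts, not Tesla's actual figures) of why fleet-scale mileage produces a tight confidence interval on a crash rate:

Code:
# Invented counts -- NOT Tesla's actual figures. With crash counts in
# the hundreds, a Poisson 95% CI on the per-mile rate gets quite tight.
import math

crashes = 300          # hypothetical crashes observed
miles_m = 1000         # hypothetical fleet miles, in millions

rate = crashes / miles_m                     # crashes per million miles
half = 1.96 * math.sqrt(crashes) / miles_m   # normal approx. to Poisson

print(f"{rate:.3f} per million miles, "
      f"95% CI ~ {rate - half:.3f} to {rate + half:.3f}")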

Again, wrong facts.
 
  • Like
Reactions: StealthP3D
Negatory.

The fact that there are fewer Teslas on the road is irrelevant. The statistics are given per million miles driven. A mile driven in a Tesla is the same as a mile driven in an ICE.

How can you support point two? Where is your evidence for that? Even if we assume it to be true, the statistics also compare Tesla drivers without Autopilot to Tesla drivers with Autopilot, and Tesla drivers with Autopilot go twice as far without getting into an accident as Tesla drivers without Autopilot.

You can't prove point three either. Most studies limit their sample size to somewhere between 100 and 2000 samples (due to cost)... suffice it to say there are more than 2000 Teslas on the road that have driven sufficient miles to develop a sample size that provides a relatively high confidence level on the safety of the vehicle's Autopilot system.

Again, wrong facts.
You're still missing the point. Tesla cars and owners are not a sufficiently random and diverse group to be considered a quality sample for a statistical study. The actual numbers, or how many miles were driven, don't really matter. There are simply too many biased variables to show real Autopilot effectiveness. If you can't understand why that is true, then you don't understand how scientific studies are designed or how statistical significance works.

The stats that your "facts" are based on are equivalent to a study that attempts to measure the average weight of all people in the world by selecting a sample of people who all live in the same town.

If you want to truly study AP safety and effectiveness, it needs to be installed on non-Tesla vehicles and used by a sufficiently diverse group of people spanning the entire socio-economic and geographic spectrum.
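That one-town analogy is easy to simulate (a toy example in Python; all the numbers are invented):

Code:
# Toy simulation of the "weigh only one town" bias -- invented numbers.
# A huge but non-random sample stays biased no matter how big it gets.
import random

random.seed(1)

# Hypothetical world: 90% of people average 70 kg, but one unusual
# town of 10,000 averages 85 kg.
world = ([random.gauss(70, 10) for _ in range(90_000)] +
         [random.gauss(85, 10) for _ in range(10_000)])
one_town = world[90_000:]      # sample only the unusual town

print(f"true mean:   {sum(world) / len(world):.1f} kg")
print(f"town sample: {sum(one_town) / len(one_town):.1f} kg  "
      "(n=10,000 and still far off)")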
 
The heated "discussions" brought on by a simple article never cease to amaze me. Folks on this forum seem great about sticking to the issues at hand rather than turning things personal, though. You guys are a fun group to fight with. :D

I think I'm safer running on Autopilot in most situations, but it certainly does cause me to become less attentive over time. I would like to hope that it would react faster than I ever could to a situation like the one in the referenced article, so it will be interesting to see if Tesla comments on this in any changes they make to the software. I doubt it would ever be made public, for fear of liability, though.

caskater47 said:
2. Those buying a Tesla are a very specific demographic that is statistically more likely to be good or safe drivers

Do you have any statistics to back that up, or is that just an assumption? Assumptions aren't helpful here.
 
  • Like
Reactions: qdeathstar
Do you have any statistics to back that up, or is that just an assumption? Assumptions aren't helpful here.

It's an assumption: a reasonable, educated guess based on the average cost of a Tesla and who can afford one. Even if it's not true, this is still a group that introduces additional variables that can't be ruled out as sources of bias.
 
  • Funny
Reactions: qdeathstar
If this incident was an AP1 car, then it is irrelevant to the safety of modern Teslas. I have both AP1 and AP2.5 Model Xs, and the modern system is definitely much, much better. In the AP1 car I often have to take over to stop it from rear-ending stationary cars. In the AP2.5 the car sees and reacts appropriately every time, even when I was on the German autobahn at 90 MPH and came around a corner to find stationary traffic ahead. Autopilot saw it and applied the brakes before I could react, and we came to a safe stop.
 
How can you support point two? Where is your evidence for that? Even if we assume it to be true, the statistics also compare Tesla drivers without Autopilot to Tesla drivers with Autopilot, and Tesla drivers with Autopilot go twice as far without getting into an accident as Tesla drivers without Autopilot.
The biggest problem with those safety reports is probably that Autopilot is primarily used in traffic situations that have lower accident rates to begin with (limited-access roads). So comparing it to all miles driven without Autopilot is meaningless. If Tesla were serious about this, they'd release the raw data so it could be independently evaluated.
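A quick hypothetical (the rates are invented) shows how the road-type mix alone can manufacture a 2x headline difference even if Autopilot changes nothing:

Code:
# Invented rates -- just illustrates the road-mix confounder.
highway_rate = 0.3   # crashes per million miles on limited-access roads
city_rate = 0.9      # crashes per million miles on other roads

# Assume Autopilot miles are 95% highway; manual miles only 40%.
ap = 0.95 * highway_rate + 0.05 * city_rate
manual = 0.40 * highway_rate + 0.60 * city_rate

print(f"Autopilot: {ap:.2f} per million miles")      # ~0.33
print(f"Manual:    {manual:.2f} per million miles")  # ~0.66
# Autopilot "looks" twice as safe purely because of where it's engaged.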
 
In my experience with Mercedes and BMWs, the forward collision systems work well in these types of situations. The systems will override the human driver, regardless of whether the human is pressing the accelerator, and apply maximum braking. The only downside is that you do get false positives on occasion, which can be a bit surprising.

My concern about the article is whether Tesla has a reliable and effective forward collision system. In the described accident, such a system would have applied the brakes even if the car could not stop in time. As reported, the car did not apply the brakes, suggesting that the forward collision system was not functioning properly.

As far as I can tell, this is not so much an issue with Autopilot as with a more fundamental safety feature. Note that other cars with effective forward collision systems do not use Lidar, so this is not a Lidar vs. Tesla issue.
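For a sense of scale, standard stopping-distance physics (the ~0.8 g deceleration and 1 s reaction time are assumptions, typical for dry pavement) shows why even a working FCW/AEB system may only reduce impact speed when a stopped car is detected late:

Code:
# Standard physics; deceleration and reaction time are assumptions
# (~0.8 g is a typical dry-pavement maximum for passenger cars).
v = 61 * 0.44704     # 61 mph in m/s (~27.3 m/s)
a = 0.8 * 9.81       # assumed max braking deceleration, m/s^2
reaction = 1.0       # assumed reaction time, s

braking = v**2 / (2 * a)        # d = v^2 / (2a)
total = v * reaction + braking

print(f"braking alone: {braking:.0f} m, with reaction: {total:.0f} m")
# ~47 m just to brake, ~75 m including reaction time, from 61 mph.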
 
I have been using Navigate on Autopilot a lot over the last few days: three trips to Buffalo and back since Friday (approx. 160 km/100 miles each way), using the feature for the entire trip. The route involves 4 highways with seamless lane changes and on-ramps/off-ramps. V9.0 (2019-16-2) in a Model 3. The current system does not recognize stationary objects such as cars on the shoulder, so manually touching the turn signal will force a lane change in the event of emergency vehicles at the side of the road. This will come in future versions, I am sure, but that is what the human interface is for.
The car alerts me with a chime and vibration when it is about to change lanes, so I can cancel the maneuver if I choose. It will not change lanes unless you have your hand on the wheel, as it will prompt you even between the nags.
Someone has suggested that driving on Autopilot is like sitting beside a beginner driver. You must remain alert and ready to intervene if necessary. It is not there so you can text or read the news.
Overall, very happy with this version. Now if they could stop the vehicles beside you from dancing at stop lights.
 

Attachments

  • settings.jpg (391.1 KB)
The article described the scenario from the Tesla owner's manual, which also happened to me in real life.

The black car in front of me saw stopped cars ahead, so it moved to the right.

I was driving with Autopilot version 8.1 at 61 MPH and didn't immediately understand why the black car in front changed to the right lane.

I knew that the black car had changed lanes, but I didn't realize I was in danger, driving at 61 MPH toward stopped traffic ahead.

I was still thinking!

I only realized the danger when the Collision Warning alarm went off, and I told myself "there's no time to think any more!"

I applied the brakes thanks to the alarm.

I could have applied the brakes on my own without the alarm, but it might have been a tad late because I was still thinking "strange, why did that car change lanes?".

So although Autopilot didn't stop in this situation, I still give it credit for alerting me to take over!


I looked at your video and I am surprised that your car did not stop, especially given that the car in question was still moving very slowly and not fully stopped. This is precisely the scenario AP2 identifies well in advance, reducing speed even before the car in front of you switches lanes. My Model 3 has stopped many times in the same situation.
 
And here's a bio on the author; hardly a hack freelance journalist writing a hit piece. [...]

I also don't believe this is a "hit piece", but I did a little background check on the author on LinkedIn and some website searches. He appears to be a bit of a resume padder. None of the permanent jobs he has held since 1996 have been in self-driving cars or AI. He ran his own system-integration consulting business for years and has worked as an IT head or CIO for medium-sized companies. All his AI and self-driving affiliations seem to be one-man organizations with nothing to show for it. He has self-published a couple of books on amazon.com on these topics, with hardly any reviews.

While I don't doubt that he is a smart guy and well read in AI and self-driving topics (probably more so than most of us who are ready to slam him), I don't see anything in his body of work showing that he is a real technical expert on camera vs. lidar, or that his conclusion about camera+radar vs. lidar is based on first-hand expertise.

I think the guy has a good IT background and sees a future in AI and self-driving car technology. He has chosen to make himself appear to be an expert in those areas by writing superficial articles, books, and a resume, rather than showing detailed technical research or a work background to back up his claims.

All that said, I still think that we Tesla owners should not be so quick to dismiss this "running into stationary objects" issue and blame it all on the driver. It will be good to understand how Tesla plans to avoid these cases, because many things I hear from Musk make it sound like FSD is imminent and that, in the near future, the only reason for the driver to pay attention would be regulatory.
 
...This is precisely the scenario AP2 identifies well in advance, reducing speed even before the car in front of you switches lanes...

True. I've seen my Autopilot display two cars ahead on the instrument cluster: the visible one immediately in front, and the one hidden from view beyond it.

That's how Autopilot can apply the brakes even before the second car ahead becomes visible.

But not in this case. It displayed one car in front at a time: it labeled the moving-out-of-lane car red, then transferred that red color to the now-visible stopped car in front.

The system did react before I applied the brakes. You could see the two teal icons still lit up, but the car's speed dropped from 61 to 53 MPH; that's when I heard the alarm and overrode the system, and the Autopilot icons turned off at 53 MPH.

Usually, Autopilot just stops my car quietly for stopped traffic in front of me; it has never alarmed before.

So why did it alarm this time? It's possible Autopilot did not slow the car enough and the Collision Warning had to independently alert me.

I think the 61 MPH approach speed (which was too fast) might be a factor here, hence the alarm for the driver's help.
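Rough numbers on that 61-to-53 MPH slowdown (my own arithmetic; the two-second duration is an assumption, not measured data):

Code:
# Rough arithmetic on the 61 -> 53 MPH slowdown described above;
# the 2 s duration is an assumption, not measured data.
v0 = 61 * 0.44704    # ~27.3 m/s
v1 = 53 * 0.44704    # ~23.7 m/s
t = 2.0              # assumed duration of the slowdown, s

decel = (v0 - v1) / t                  # ~1.8 m/s^2 (~0.18 g): gentle
stop = v1**2 / (2 * 0.8 * 9.81)        # distance still needed to stop

print(f"decel: {decel:.1f} m/s^2; full stop from 53 MPH needs ~{stop:.0f} m")
# Autopilot was shedding speed gently, nowhere near emergency braking,
# which fits the Collision Warning stepping in to demand driver action.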