Mobileye rips Tesla Autopilot, Chairman says it dumped Tesla

A stationary vehicle blocking the lane is certainly one of the most common causes of accidents under such conditions, yet AP fails to recognize that the lane is blocked. How can AP be safer than a human driver if the human driver has to step in to prevent AP from causing an accident?

The way I have always understood it is that it's the combination of a human driver PLUS Autopilot that is safer, not Autopilot by itself. The human driver does not step in; the human driver is always the one in control, hands on wheel, paying attention. AP is a beta program designed to help in the long and boring stretches. That people obviously think a human has to "step in" shows that they don't understand what AP is or how it works. A human driver PLUS Autopilot IS safer than the human driver alone.

By the way, I drive with AP, have had it for over a year with 28,000 miles on my 90D, and knew and understood it before I started. That may be why I think it is an amazing and fantastic tool. I think it will get better, just like Cruise Control, which people misunderstood and misused at first.
 
Here's another good article that sheds some light on the bigger game being played here, and why Tesla is going to win and Mobileye is going to lose:

Why The Tesla/Mobileye Fight Defines An Industry-Wide Schism

I feel as if the MAJOR reason there is such a huge debate over this is that it comes down to a trolley-car problem.

Camp A of this trolley-car problem passively accepts the deaths of thousands of people a year because they can't bring themselves to pull the switch and knowingly kill a few. They want a slow rollout so they're not at fault, but they're not saving people's lives very quickly either.

Camp B of this trolley-car problem feels the deaths of a few people are justifiable because it will save the lives of thousands and thousands. They'll say they didn't do it knowingly, because after all, people are supposed to take responsibility. But we know that's bullshit, because we knew from the beginning that AP would change people's behavior.

For those of us who own a Model S with AP, it's rather interesting, because we're part of the trolley-car experiment. Our fellow owners died as a direct result of what we feared would happen. From the very beginning of the 7.0 rollout we knew there would be crashes and fatalities. We knew people would be irresponsible with it. But we also knew we would learn from it. What we learned is that the AEB system in the Tesla sucks. The article that Amped posted points out that it sucks. I personally think Mobileye is full of crap, because Mobileye knew this from the beginning of the AP rollout but didn't say anything. They never told the media that Tesla was doing something their system simply didn't support; they only said so after something happened. Hell, I'm not even sure how I missed such a glaring incompatibility, mixing a poor AEB system with a Level 2 driver-assistance package, but I did miss it.

Going forward, I think we should be even more vigilant in reporting bugs and problem areas to Tesla or here, whether they come from human psychology or from AP doing something stupid. We also have to give thanks to those who perished, because their deaths are why Tesla is rolling out a vastly improved AEB system in the cars we own. This means that we're safer, and it means anyone we hand the keys to is safer. We also know that whoever owns the car next is safer. The car we own will forever be safer for the rest of its life.
 
  • Love
Reactions: EarlyAdopter
Or, to say it another way: the people who use AP are more enthusiastic than those who misuse it. The "meh" people usually don't understand it, don't use it, or outright don't know how to use it. They probably didn't like Cruise Control, either.
That gross generalization doesn't apply to those of us who use it most and have used it longest under real-world conditions on busy, populated highways. "Meh" just means it's fine, but "love" it in its current iteration? Umm, not really. ;)
 
Last edited:
That gross generalization doesn't apply to those of us who use it most and have used it longest under real-world conditions on busy, populated highways. "Meh" just means it's fine, but "love" it in its current iteration? Umm, not really. ;)

"Umm", it's beta. I expect improvements. I expect them over the air. Part of my enthusiasm is seeing this thing grow. Like raising kids. We may not like them in the present iteration, but we love them because of what we can see in their future. Actually, I liked my kids when they were little, too, and now, they're amazing.
 
"Umm", it's beta. I expect improvements. I expect them over the air. Part of my enthusiasm is seeing this thing grow. Like raising kids. We may not like them in the present iteration, but we love them because of what we can see in their future. Actually, I liked my kids when they were little, too, and now, they're amazing.
I love my kids too, but I "like" my Autopilot. See the difference? Rather than continuing this debate, try to accept that it's a different experience on the crowded, always-under-construction freeways here in Southern California than it is in the beautiful wine country. I've driven in both, and AP is squirrelly down here. I agree it's getting better.
 
I feel as if the MAJOR reason there is such a huge debate over this is that it comes down to a trolley-car problem.

Camp A of this trolley-car problem passively accepts the deaths of thousands of people a year because they can't bring themselves to pull the switch and knowingly kill a few. They want a slow rollout so they're not at fault, but they're not saving people's lives very quickly either.

Camp B of this trolley-car problem feels the deaths of a few people are justifiable because it will save the lives of thousands and thousands. They'll say they didn't do it knowingly, because after all, people are supposed to take responsibility. But we know that's bullshit, because we knew from the beginning that AP would change people's behavior.

For those of us who own a Model S with AP, it's rather interesting, because we're part of the trolley-car experiment. Our fellow owners died as a direct result of what we feared would happen. From the very beginning of the 7.0 rollout we knew there would be crashes and fatalities. We knew people would be irresponsible with it. But we also knew we would learn from it. What we learned is that the AEB system in the Tesla sucks. The article that Amped posted points out that it sucks. I personally think Mobileye is full of crap, because Mobileye knew this from the beginning of the AP rollout but didn't say anything. They never told the media that Tesla was doing something their system simply didn't support; they only said so after something happened. Hell, I'm not even sure how I missed such a glaring incompatibility, mixing a poor AEB system with a Level 2 driver-assistance package, but I did miss it.

Going forward, I think we should be even more vigilant in reporting bugs and problem areas to Tesla or here, whether they come from human psychology or from AP doing something stupid. We also have to give thanks to those who perished, because their deaths are why Tesla is rolling out a vastly improved AEB system in the cars we own. This means that we're safer, and it means anyone we hand the keys to is safer. We also know that whoever owns the car next is safer. The car we own will forever be safer for the rest of its life.
I don't think that's correct. I do not feel that "the death of a few people is justifiable because it will save the lives of thousands." But I also don't think that AP has caused ANY deaths, and I know for a fact that AP in its current form CANNOT cause even a single death. Luckily, AP in its current form can, and DOES, save lives.

What angers me is that a few people get themselves killed (emphasis on themselves), and that causes progress to be halted, risking thousands more lives that could have been saved had the technology improved instead of being limited further.
 
  • Like
Reactions: ModelX and rog
I don't think that's correct. I do not feel that "the death of a few people is justifiable because it will save the lives of thousands." But I also don't think that AP has caused ANY deaths, and I know for a fact that AP in its current form CANNOT cause even a single death. Luckily, AP in its current form can, and DOES, save lives.

What angers me is that a few people get themselves killed (emphasis on themselves), and that causes progress to be halted, risking thousands more lives that could have been saved had the technology improved instead of being limited further.

AP in its current form can cause a death. The reason is truck lust: the car loses the lane line near a semi, and in searching for the line again it moves towards the semi. This freaks the driver out, and the driver ends up overcorrecting. To my knowledge it hasn't caused a death, but it could. I didn't experience it much with version 7, but I did a few times with version 7.1.
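
To make the truck-lust mechanism concrete, here's a minimal sketch in Python of how a lane-keeping fallback can bias toward the lost side. This is purely hypothetical (the function, the assumed-width fallback, and every number are my own invention, not anything from Tesla's or Mobileye's actual code), but it shows why losing one line is enough to start a drift:

LANE_WIDTH_M = 3.7  # assumed width of a standard US freeway lane

def lateral_correction(left_m, right_m):
    # Lateral correction in meters (positive = steer right).
    # left_m / right_m: distance to each lane line, or None if the
    # camera lost that line (e.g., washed out beside a semi).
    if left_m is not None and right_m is not None:
        return (right_m - left_m) / 2.0   # center between both lines
    if left_m is not None:
        # Right line lost: hold an assumed half-lane off the left line.
        return LANE_WIDTH_M / 2.0 - left_m
    if right_m is not None:
        # Left line lost: mirror image of the case above.
        return right_m - LANE_WIDTH_M / 2.0
    return 0.0                            # both lost: hold current path

# Centered in a standard 3.7 m lane with the right line lost: no drift.
print(lateral_correction(1.85, None))   # 0.0
# Centered in a narrow 3.3 m construction lane (1.65 m off the left line),
# the same fallback commands ~0.20 m to the right, straight toward the
# semi that hid the line in the first place.
print(lateral_correction(1.65, None))   # ~0.2

Any single-line fallback that assumes a nominal lane width will pull toward the lost side whenever the real lane is narrower than the assumption, which is exactly the drift-then-overcorrect pattern described above.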

AP can also cause a death indirectly if you take human psychology into account: the average driver loses situational awareness when two or more controls are automated.

What angers you angers me too: people getting killed as a result of trying to limit the system, versus keeping it open and working out the problems as they're identified. The only reason the media seemed to care is that the deaths/crashes involved AP. If AP had been a clone of what BMW/MB/etc. use, it wouldn't have received any attention. The media attention actually turned out to be a good thing, in that it's what will save lives. By blaming the machine, they blamed something that can be improved. When the media blames humans, we can't do anything about it.
 
Last edited:
We are getting perilously close to an all-out discussion of personal responsibility here.

The media and the NHTSA don't buy the argument that all the responsibility rests on the driver. They've come out and said the manufacturer can't use that as an excuse to hide behind.

Whether one agrees with that or not is beside the point now. A good portion of the version 8 update is about personal responsibility. Tesla is doing two rather contradictory things to try to prevent these types of crashes: restricting AP further in an attempt to push responsibility onto the driver, and improving AEB in a really exciting but unproven way. The improvement to AEB is a shift toward putting more responsibility on the car: it can use the radar as the primary, and sole, source to trigger AEB.

There is some good and some bad that comes from this shift.

The good news is people will stop crashing into stalled vehicles in front of them.

The bad news is our cars will stop when they're not supposed to. Tesla promises to try to keep this to a minimum, but it will happen.

We'll likely see far more cases of a Tesla being rear-ended than of a Tesla crashing into someone.
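
As a toy illustration of that tradeoff (invented names, thresholds, and numbers in Python; nothing to do with Tesla's actual firmware): with radar as the sole trigger, a stalled car and an overhead sign gantry can produce similar stationary returns, so any single confidence threshold either misses the stall or phantom-brakes for the sign.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float          # distance to the object
    closing_mps: float      # closing speed (own speed if the object is stopped)
    in_path_conf: float     # 0..1 confidence the return is in our lane
    road_level_conf: float  # 0..1 confidence it's at road level, not overhead

def should_brake(r, threshold):
    # Fire AEB on radar alone if time-to-collision is short and we are
    # confident enough the return is a road-level object in our path.
    if r.closing_mps <= 0:
        return False
    ttc = r.range_m / r.closing_mps              # seconds to impact
    confidence = r.in_path_conf * r.road_level_conf
    return ttc < 2.5 and confidence > threshold

stalled_car = RadarReturn(50.0, 27.0, 0.9, 0.60)  # stopped car, ~60 mph closing
sign_gantry = RadarReturn(50.0, 27.0, 0.9, 0.55)  # sign bridge over the lane

# Strict threshold: ignores the sign, but misses the stalled car too.
print(should_brake(stalled_car, 0.60), should_brake(sign_gantry, 0.60))  # False False
# Loose threshold: catches the stalled car, and phantom-brakes for the sign.
print(should_brake(stalled_car, 0.45), should_brake(sign_gantry, 0.45))  # True True

If I understand the 8.0 notes right, Tesla's answer is fleet learning (whitelisting locations where radar routinely sees overhead objects), which is essentially a way to push those two confidence distributions apart so a workable threshold exists.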
 
Last edited:
The media and the NHTSA don't buy the argument that all the responsibility rests on the driver. They've come out and said the manufacturer can't use that as an excuse to hide behind.

Whether one agrees with that or not is beside the point now.

There is some good and some bad that comes from shifting the blame from the human to the machine.

The good news is people will stop crashing into stalled cars in front of them.

The bad news is our cars will stop when they're not supposed to.

We'll likely see far more cases of a Tesla being rear-ended than of a Tesla crashing into someone.
The question becomes: are we going to get another vicious cycle where people become reliant on the car to do the braking for them, and when it inevitably fails to do so in a certain circumstance, we get the same car-blaming all over again?

I would also be careful about characterizing NHTSA's response versus the media's. They are not ruling out a malfunction of Tesla's system as a cause (for example, if it overrode or didn't respond to driver input), but that doesn't mean they will blame Tesla if it turns out the system acted as designed. Some in the media would blame Tesla regardless of whether the system actually malfunctioned.
 
AP in its current form can cause a death. The reason is truck lust: the car loses the lane line near a semi, and in searching for the line again it moves towards the semi. This freaks the driver out, and the driver ends up overcorrecting. To my knowledge it hasn't caused a death, but it could. I didn't experience it much with version 7, but I did a few times with version 7.1.

This issue is addressed in the upcoming v8.0 release.
 
  • Like
Reactions: S4WRXTTCS
I missed the NHTSA saying that: do you have a link?

I don't think they've publicly stated anything at this point, as it's still being investigated.

I posted that because it was my understanding that Tesla worked with the NHTSA on the new restrictions. At least, that's the impression I've gotten from everything I've read. What I was getting at is that Elon and company were in some pretty hot water. I can't even imagine the kinds of conversations Tesla was having with regulatory agencies around the world.
 
Last edited:
  • Disagree
Reactions: green1
The question becomes: are we going to get another vicious cycle where people become reliant on the car to do the braking for them, and when it inevitably fails to do so in a certain circumstance, we get the same car-blaming all over again?

The way I see it, we're going to go through one cycle after another.

These cycles are how we're going to get to autonomous driving.

Autonomous driving won't just happen in one sudden instant. It's going to be a gradual shift of responsibility, where each cycle has some circumstance the system fails at, but I see those circumstances getting smaller and smaller each time through. This one was rather huge, because stalled vehicles covering half the lane aren't a small edge case.

As to the NHTSA, you're entirely correct. Apparently I'm in need of an editor. :p
 
  • Like
Reactions: mblakele and Vitold
... Autonomous driving won't just happen in one sudden instant. It's going to be a gradual shift of responsibility...

Except there is a crucial switchover: the point where the manufacturer takes over responsibility from the driver.

With lawyers sharpening their pencils already, my bet is this isn't going to happen soon, and when it does, it will have to be regulator-led.
 
I don't think they've publicly stated anything at this point, as it's still being investigated.

I posted that because it was my understanding that Tesla worked with the NHTSA on the new restrictions. At least, that's the impression I've gotten from everything I've read. What I was getting at is that Elon and company were in some pretty hot water. I can't even imagine the kinds of conversations Tesla was having with regulatory agencies around the world.
It's confirmed in the 8.0 press conference posted on Electrek last week. Elon said that they worked with the NHTSA on the new system and warnings, and that he "thought" they agreed with it. It's in either part 7 or part 8. I think @MP3Mike posted the link.