
Motor Trend buries dramatic "Tesla wins" results in new self steering comparison

I can't say whether the writers at Motor Trend have an anti-Tesla bias, but their readers sure do. Just go to the Motor Trend or Car & Driver Facebook pages and look at any post having anything to do with Tesla. All the usual false arguments about EV adoption, or Tesla in general, inevitably come up.
 
  • Like
Reactions: Genshi
The problem with the truck, or with stopped objects generally, lies in the software: since for now the system isn't sure what it is seeing, it's better not to act. I strongly suspect a new software update can solve this problem almost completely.
New hardware is needed for prevention and hard braking when there is low visibility or similar, but that's probably not the case with the truck.
This has been discussed in detail in other threads if you want to read those, but basically: ultrasonic sensors are not sufficient (too short a range, plus they miss trailers and the like), the camera cannot tell distance (no stereo), and Doppler radar is not directional enough to tell whether you're heading for a stopped car or for something on the side of a curved road. So unfortunately, this is very unlikely to be fixed by software alone.
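To put rough numbers on the curved-road point, here's a quick Python sketch. All values are invented for illustration, not actual AP sensor specs:

```python
import math

# All numbers invented for illustration; not actual AP sensor specs.
curve_radius_m = 500.0   # a gentle highway curve
look_ahead_m = 100.0     # range to the radar return

# On a curve, the lane center shifts sideways by roughly d^2 / (2R),
# so a stopped car in YOUR lane sits well off the radar boresight:
lane_offset_m = look_ahead_m**2 / (2 * curve_radius_m)           # 10 m
in_lane_deg = math.degrees(math.atan2(lane_offset_m, look_ahead_m))

# A harmless parked car on the shoulder of a straight road:
shoulder_deg = math.degrees(math.atan2(3.5, look_ahead_m))

print(f"stopped car in your lane (curved road): {in_lane_deg:.1f} deg off-axis")
print(f"parked car on shoulder (straight road): {shoulder_deg:.1f} deg off-axis")
# Both returns are just a few degrees off boresight. Unless the radar's
# bearing accuracy is much better than that difference, range and
# Doppler alone can't tell which one is actually in your path.
```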
 
  • Informative
Reactions: Genshi
I agree with the majority of the posts here that it largely WAS Tesla-positive, but I think what's being missed here is that it was really, really strange to just end with some charts and no synopsis or conclusion. That was the really jarring part to me.

When I got done reading the report, I legitimately spent three minutes searching the page trying to find a link to page 2 and the conclusion. No hyperbole: I was certain I'd missed it and that there was more to the article by way of conclusion. Leaving off with nothing after showing just how incredibly superior Tesla Autopilot is in regular use seemed incomplete. Maybe I'm just not used to Motor Trend's style of article, but it actually left me confused and feeling like I'd only read half an article.

That said, the comparison itself was incredibly cool, and much more scientific and content-laden than I was expecting. It's really hard to assess functional usage in a test environment, and I thought they did a pretty darn good job.
 
I agree with the majority of the posts here that it largely WAS Tesla-positive, but I think what's being missed here is that it was really, really strange to just end with some charts and no synopsis or conclusion. That was the really jarring part to me.

When I got done reading the report, I legitimately spent three minutes searching the page trying to find a link to page 2 and the conclusion. No hyperbole: I was certain I'd missed it and that there was more to the article by way of conclusion. Leaving off with nothing after showing just how incredibly superior Tesla Autopilot is in regular use seemed incomplete. Maybe I'm just not used to Motor Trend's style of article, but it actually left me confused and feeling like I'd only read half an article.

That said, the comparison itself was incredibly cool, and much more scientific and content-laden than I was expecting. It's really hard to assess functional usage in a test environment, and I thought they did a pretty darn good job.

You're certainly correct that they didn't really wrap it up and draw conclusions. I wasn't really surprised by that, though, because there was the whole section about not being sure how to test and letting us see what's going on behind the curtain. The article is an unfinished work in progress just like the testing is.

Having said that, they talked about the results of most of the test segments, then just threw up graphs for the last one for whatever reason. Given the Tesla-positive thrust of the article in general, I don't think it was manipulation or an axe to grind, but I'm not sure why they didn't explain what they saw in that part.
 
  • Like
Reactions: Genshi
You're certainly correct that they didn't really wrap it up and draw conclusions. I wasn't really surprised by that, though, because there was the whole section about not being sure how to test and letting us see what's going on behind the curtain. The article is an unfinished work in progress just like the testing is.

That's a very good point. They did make that clear at the outset. I also appreciated that they'd already put a header on the article addressing the fatal crash and how ill-timed this report is.

I agree that an axe to grind is a far less likely explanation than a simple oversight, or that they felt the graphs spoke for themselves. (Which they do.) Occam's razor and all that. Preliminary first-impression articles often seem unfinished, though most end with "we look forward to reporting on this more once we've been able to spend appropriate time with _____."
 
I recently drove a new "Volt" with GM's basic lane keeping assist. If you've ever used it, it was an annoyance at best and somewhat disconcerting at worst. It drives like a bumper car, bouncing from the shoulder to the centerline (if it sees either at all); it would get you a DUI stop if police were watching. Worst of all, if you forget to use your turn signal to change lanes, it tries to nudge you back into the lane. It's easy to overpower and easy to turn off, but the technology isn't even "beta" capable. I know it isn't anything like the Tesla system, but something that is useless is a waste of money and resources and shouldn't be put on the market.

Regulators, lawmakers, lawyers, the press, and concerned citizens are all Luddites when it comes to something new and innovative, but it takes time and common sense to perfect the technology that will lead to the goal of autonomous vehicles. I just hope we don't revert to something like the light aircraft industry, where mainstream powerplant development stopped in the 1920s. Every time something new was tried, it was blamed for whatever happened during an accident. Even modern glass-cockpit composite aircraft are still pulled along by inefficient air-cooled piston engines sparked by magnetos and fueled by leaded gasoline, controlled by (at best) archaic fuel injection systems.
My wife's car is a 2016 Lexus ES350. The lane assist is so annoying I don't use it. I know it's just meant as a warning, but it does steer back into the lane... sometimes. I had been keeping it on because I like to try out new tech, but I eventually gave up on it. Useless!
 
One thing to keep in mind about lane steering is that it's really hard to do a scoring comparison when all the manufacturers other than Tesla purposely fail.

Even the newest MB Drive Pilot system is dumbed down. They purposely make it worse than it really is so that there are more interruptions. That keeps people from trusting it, but what's the point of paying for something that barely functions?

Mercedes-Benz's 2017 E-Class won't let you nap behind the wheel

I do think the possibility that manufacturers might be dumbing down their systems should be mentioned in the Motor Trend article: maybe it's not a fair comparison because MB isn't even trying.

As an aside, I'm getting sick of all the references to the video of the person sleeping behind the wheel in a Tesla. If you saw someone sleeping behind the wheel, would you simply film it and then upload it? Or would you try to call 911 to report it and get their license plate? From everything I've heard, the sleeping video is likely a hoax made to make money off YouTube. But now it gets mentioned every time an article mentions Autopilot.
 
  • Like
Reactions: Breezy
This has been discussed in detail in other threads if you want to read those, but basically: ultrasonic sensors are not sufficient (too short a range, plus they miss trailers and the like), the camera cannot tell distance (no stereo), and Doppler radar is not directional enough to tell whether you're heading for a stopped car or for something on the side of a curved road. So unfortunately, this is very unlikely to be fixed by software alone.
You raised an interesting point. I never thought of that.
 
One thing to keep in mind about lane steering is that it's really hard to do a scoring comparison when all the manufacturers other than Tesla purposely fail.

Even the newest MB Drive Pilot system is dumbed down. They purposely make it worse than it really is so that there are more interruptions. That keeps people from trusting it, but what's the point of paying for something that barely functions?

Mercedes-Benz's 2017 E-Class won't let you nap behind the wheel

I do think the possibility that manufacturers might be dumbing down their systems should be mentioned in the Motor Trend article: maybe it's not a fair comparison because MB isn't even trying.

As an aside, I'm getting sick of all the references to the video of the person sleeping behind the wheel in a Tesla. If you saw someone sleeping behind the wheel, would you simply film it and then upload it? Or would you try to call 911 to report it and get their license plate? From everything I've heard, the sleeping video is likely a hoax made to make money off YouTube. But now it gets mentioned every time an article mentions Autopilot.
What makes you think that they purposely fail? Because a "journalist" who gets advertising and freebies from MB says so? My take is that this is coded language from journalists who don't want to upset automakers.
 
This has been discussed in detail in other threads if you want to read those, but basically: ultrasonic sensors are not sufficient (too short a range, plus they miss trailers and the like), the camera cannot tell distance (no stereo), and Doppler radar is not directional enough to tell whether you're heading for a stopped car or for something on the side of a curved road. So unfortunately, this is very unlikely to be fixed by software alone.

It is a common misconception that mono cameras can't tell distance. They certainly can. Mobileye's AEB depends on it.

The camera needs more dynamic range and resolution to overcome the current shortcomings from glare.

Even the next generation Mobileye with three cameras (rumored to appear on Autopilot 2.0) isn't for stereo vision. It's for three different focal lengths.

One source: Vehicle Detection - Mobileye
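For what it's worth, mono ranging is usually done with the pinhole model plus an assumed target size. A toy sketch with illustrative numbers, not the actual Mobileye pipeline:

```python
# Pinhole-model range from a single camera: distance scales with the
# assumed real-world width of the target. All numbers are illustrative.
focal_length_px = 1000.0   # camera focal length, in pixels
assumed_width_m = 1.8      # assumed width of a passenger car

def mono_range(width_in_pixels: float) -> float:
    """Estimate distance to a car from the width of its bounding box."""
    return focal_length_px * assumed_width_m / width_in_pixels

print(mono_range(90.0))   # a car 90 px wide  -> ~20 m away
print(mono_range(45.0))   # half the pixel width -> ~40 m away
```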
 
  • Like
Reactions: TaoJones
This has been discussed in detail in other threads if you want to read those, but basically: ultrasonic sensors are not sufficient (too short a range, plus they miss trailers and the like), the camera cannot tell distance (no stereo), and Doppler radar is not directional enough to tell whether you're heading for a stopped car or for something on the side of a curved road. So unfortunately, this is very unlikely to be fixed by software alone.

That's not exactly it. The radar almost certainly has the precision to see where everything is. It's the processing part that's the challenge: to make the data set manageable, the standard approach is to throw out all the stationary returns, thus getting rid of road signs, pavement cracks, rocks beside the road, and unfortunately stopped cars.

To capture the stopped cars, you need some way of separating them from the rest of the clutter, most likely some variation of sensor fusion. Tesla's current architecture has a major bandwidth limit that is a challenge for this: the AP computer (EyeQ3) sits with the camera in the windshield and is connected to the rest of the systems only by CAN bus.

That means that even if the EyeQ3 could handle the raw take from the radar, there's no way to get it there.

I'm not convinced that means there's no answer with the current hardware. The camera is trained to recognize cars visually. What if it passes requests to the radar by bearing? - "I see cars at 345 degrees, 356 degrees, and 002 degrees. Tell me the relative velocity of all of these."

Or it might be simpler, actually. We think the radar is currently passing bearings and velocities for all the moving cars it sees, right? If that's the case, then if the camera sees a car that isn't on the list from the radar, it must not be moving...

This might not address the recent incident, though - there's no reason to believe the camera recognized the truck was a truck. For that, you might need some sort of "I can see under the obstacle/I can't see under the obstacle" logic too.
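To make the cross-check idea above concrete, here's a toy sketch of that logic in Python. The data structures and the 3-degree matching tolerance are invented for illustration; nothing here reflects how AP actually works:

```python
# Hypothetical sketch of the cross-check described above: any car the
# camera sees that has no matching entry in the radar's moving-target
# list gets flagged as possibly stopped. Data structures are invented.
BEARING_MATCH_DEG = 3.0  # assumed bearing tolerance for a match

camera_cars_deg = [345.0, 356.0, 2.0]   # bearings from vision
radar_moving_deg = [345.5, 1.2]         # moving targets from radar

def possibly_stopped(camera_bearings, radar_bearings, tol=BEARING_MATCH_DEG):
    """Return camera detections with no moving radar return nearby."""
    stopped = []
    for cam in camera_bearings:
        # compare on the circle so 356 deg and 2 deg are 6 deg apart
        if not any(abs((cam - r + 180) % 360 - 180) <= tol
                   for r in radar_bearings):
            stopped.append(cam)
    return stopped

print(possibly_stopped(camera_cars_deg, radar_moving_deg))
# -> [356.0]: the camera sees a car at 356 deg that the radar's moving
#    list doesn't account for, so treat it as possibly stationary.
```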
 
That's not exactly it. The radar almost certainly has the precision to see where everything is. It's the processing part that's the challenge: to make the data set manageable, the standard approach is to throw out all the stationary returns, thus getting rid of road signs, pavement cracks, rocks beside the road, and unfortunately stopped cars.

To capture the stopped cars, you need some way of separating them from the rest of the clutter, most likely some variation of sensor fusion. Tesla's current architecture has a major bandwidth limit that is a challenge for this: the AP computer (EyeQ3) sits with the camera in the windshield and is connected to the rest of the systems only by CAN bus.

That means that even if the EyeQ3 could handle the raw take from the radar, there's no way to get it there.

I'm not convinced that means there's no answer with the current hardware. The camera is trained to recognize cars visually. What if it passes requests to the radar by bearing? - "I see cars at 345 degrees, 356 degrees, and 002 degrees. Tell me the relative velocity of all of these."

Or it might be simpler, actually. We think the radar is currently passing bearings and velocities for all the moving cars it sees, right? If that's the case, then if the camera sees a car that isn't on the list from the radar, it must not be moving...

This might not address the recent incident, though - there's no reason to believe the camera recognized the truck was a truck. For that, you might need some sort of "I can see under the obstacle/I can't see under the obstacle" logic too.

I believe the system is designed like you suggest. It tags objects, places a box around them, and sends 3D coordinates, possibly with vectors. The EyeQ chips are designed to handle radar sensor fusion as well. That's very little data compared to raw video and radar streams. It's probably how it's mapping the roads with very few kB per mile.
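A quick back-of-the-envelope on the "very little data" point. The field layout below is invented, but any compact track record lands in the same ballpark:

```python
# Back-of-envelope: a compact per-object track record vs. one frame of
# raw video. The record layout here is invented for illustration.
import struct

# id (2B), class (1B), x/y/z position (3 x 2B), vx/vy velocity (2 x 2B)
TRACK_FMT = "<HB3h2h"
record = struct.pack(TRACK_FMT, 17, 1, 4200, -150, 0, -320, 5)
print(len(record), "bytes per tracked object")      # 13 bytes

objects, rate_hz = 10, 20
track_stream = len(record) * objects * rate_hz      # ~2.6 kB/s
raw_frame = 1280 * 720 * 2                          # one raw YUV422 frame
print(track_stream, "B/s of tracks vs", raw_frame, "B per video frame")
# Tens of thousands of track updates fit in the space of one raw frame.
```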
 
  • Informative
Reactions: TaoJones
One thing to keep in mind about lane steering is that it's really hard to do a scoring comparison when all the manufacturers other than Tesla purposely fail.

This is the key. None of the other manufacturers claim to have a system that steers for you, only a system that helps you steer. So they can't meaningfully be compared on the basis of metrics like number of times or length of time with hands on the wheel. The other systems strongly encourage or demand that you keep your hands on the wheel at all times.
 
It is a common misconception that mono cameras can't tell distance. They certainly can. Mobileye's AEB depends on it.

The camera needs more dynamic range and resolution to overcome the current shortcomings from glare.

Even the next generation Mobileye with three cameras (rumored to appear on Autopilot 2.0) isn't for stereo vision. It's for three different focal lengths.

One source: Vehicle Detection - Mobileye
It can make some estimations, but I wouldn't want to bet my life on it. When it helps prevent an accident that would have happened anyway, that is one thing; but when it causes an accident because it was controlling the car in the first place and missed or mis-estimated something, that is a whole different level of reliability requirement. For the purposes of this discussion, we know the AP 1.0 camera does not reliably and accurately estimate distance. How do we know? I've observed it get it wrong, as discussed in AP camera distance sensing assumes vehicle sizes, sometimes incorrectly. I've also observed a few warnings about an impending collision even though the car it thought I was going to hit had turned and was nowhere in sight anymore (the car didn't hit the brakes, which means Tesla knows those estimates are not reliable, but it did show me a ghost car in red while beeping at me).

We also know that AP 1.0 doesn't stop for already-stopped vehicles; if it could reliably tell you were going to hit a vehicle, it would stop. The only time it works is when it's combined with the radar ranging (sensor fusion), and the only way that works is if the car in front of you is moving when it first comes into view.
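Borrowing the pinhole sketch from earlier in the thread, here's how an assumed-size prior skews the estimate. The 1.8 m prior and the focal length are toy numbers, not what Mobileye actually uses:

```python
# How an assumed vehicle width skews mono ranging (illustrative numbers).
focal_px = 1000.0
assumed_width_m = 1.8    # the prior: a typical passenger car
actual_width_m = 2.5     # what's really there: a wide truck
actual_distance_m = 50.0

# Pixel width the camera actually observes for the truck at 50 m:
pixels = focal_px * actual_width_m / actual_distance_m   # 50 px

# Range the system infers using its passenger-car prior:
estimated_m = focal_px * assumed_width_m / pixels        # 36 m
print(f"truck at {actual_distance_m} m estimated at {estimated_m} m")
# The wide truck is reported 28% closer than it really is; a narrow
# vehicle would be reported farther away. The prior drives the error.
```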
 
That's not exactly it. The radar almost certainly has the precision to see where everything is. It's the processing part that's the challenge: to make the data set manageable, the standard approach is to throw out all the stationary returns, thus getting rid of road signs, pavement cracks, rocks beside the road, and unfortunately stopped cars.
I oversimplified in order to keep the post short and digestible, hence my comment to go read the details in other threads. I agree with you that radar as a technology absolutely could do it; however, the issue is that the particular sensor, and how it's implemented in AP 1.0, is not capable, IMHO.

I'm not convinced that means there's no answer with the current hardware. The camera is trained to recognize cars visually. What if it passes requests to the radar by bearing? - "I see cars at 345 degrees, 356 degrees, and 002 degrees. Tell me the relative velocity of all of these." Or it might be simpler, actually. We think the radar is currently passing bearings and velocities for all the moving cars it sees, right? If that's the case, then if the camera sees a car that isn't on the list from the radar, it must not be moving...
Clever idea; however, how did you conclude that the radar used by AP 1.0 can take or return bearings, and if so, with what accuracy? I haven't taken mine apart, but it doesn't look like a high-precision phased-array antenna that can provide directionality (those tend not to have as much depth, and are also rather expensive, since they require very good quality LNAs). Such antennas are used in military applications, and some of that technology is also available in very high-end radar detectors such as the Stinger, where each antenna alone retails at $1,500 (actual detector not included) and good front and back coverage requires two antennas. I doubt Autopilot would sell for $2,500 if it had sensors of that precision, quality, and therefore cost.
 
Clever idea; however, how did you conclude that the radar used by AP 1.0 can take or return bearings, and if so, with what accuracy? I haven't taken mine apart, but it doesn't look like a high-precision phased-array antenna that can provide directionality (those tend not to have as much depth, and are also rather expensive, since they require very good quality LNAs). Such antennas are used in military applications, and some of that technology is also available in very high-end radar detectors such as the Stinger, where each antenna alone retails at $1,500 (actual detector not included) and good front and back coverage requires two antennas. I doubt Autopilot would sell for $2,500 if it had sensors of that precision, quality, and therefore cost.

I'm still looking for detailed information about the AP radar, which I believe is a Bosch unit?

In the meantime, I'm assuming it is roughly comparable in capability to the Delphi unit used by some of the competition. When I was researching AP some months back, I came across this page describing the capabilities of that unit, which include tracking up to 64 targets across a 90-degree field of view out to 100 m, with updates 20 times per second.

I think it's reasonable to expect that the AP radar is up to the tasks I described, though I'm still hoping to find real specs from the vendor.
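Taking those Delphi-style figures at face value, here's a quick feasibility check against classic CAN bandwidth. The payload sizing and per-frame overhead are assumptions, not vendor specs:

```python
# Rough feasibility check: can a Delphi-class track list (64 targets,
# 20 updates/s) fit on classic CAN? Payload sizing is assumed.
targets, rate_hz = 64, 20
bytes_per_target = 8            # one classic CAN frame payload per track
frame_overhead_bits = 64        # approx. arbitration/CRC/etc. per frame

payload_bps = targets * rate_hz * bytes_per_target * 8
overhead_bps = targets * rate_hz * frame_overhead_bits
total_bps = payload_bps + overhead_bps

print(f"~{total_bps/1e3:.0f} kbit/s of a 500 kbit/s CAN bus")
# ~164 kbit/s: a full processed track list fits comfortably, but the
# raw radar take (or raw video) would not, which is consistent with
# the bandwidth concern raised earlier in the thread.
```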
 
  • Informative
Reactions: Genshi and TaoJones
It can make some estimations, but I wouldn't want to bet my life on it. When it helps prevent an accident that would have happened anyway, that is one thing; but when it causes an accident because it was controlling the car in the first place and missed or mis-estimated something, that is a whole different level of reliability requirement. For the purposes of this discussion, we know the AP 1.0 camera does not reliably and accurately estimate distance.

It may be true that the AP 1.0 camera fails to accurately estimate distances from time to time, but it isn't because it is or isn't in stereo. I have some experience with this phenomenon because I too lack stereo vision and as a result do not have true depth perception. Nonetheless, I can reliably determine which objects are closer than others and when my car is closing in on another object. It works the same way you can detect different depths in photographs and on television. Your brain gets millions of cues that allow it to assemble a multidimensional view of the world without using stereo.
 
  • Informative
Reactions: MarkS22
I can reliably determine which objects are closer than others and when my car is closing in on another object. It works the same way you can detect different depths in photographs and on television. Your brain gets millions of cues that allow it to assemble a multidimensional view of the world without using stereo.
Your brain can do that because it has billions of neurons, high-bandwidth visual input, and a lifetime of experience navigating the three-dimensional world we inhabit. That is not comparable to a few very limited sensors, much lower I/O bandwidth, relatively low-power CPUs, and software. It's orders of magnitude of difference.