Welcome to Tesla Motors Club

Tesla Vision NOA aces ADAS test vs Mobileye Supervision (Zeekr 001) and others

I did a search and didn't see this posted elsewhere, but I stumbled on a fairly interesting set of tests of the ADAS systems in the leading EVs in China, done in November 2022. They ran 10 different scenarios (7 in daylight, 3 at night), including several that typical standardized testing doesn't cover (made possible only because they rented out a national test center with a full-scale model of public roads). The video has English subtitles, so it's fairly easy to follow.

Some background: the Tesla is a 2022 Model 3 equipped with the FSD package, but the video mentions that in China it does not respond to traffic lights even though it recognizes them (so it is effectively the same as EAP/NOA). It has no radar (i.e. it is running Tesla Vision), since radar was removed around November 2021 (the video mentions this too, in contrast to the other cars, which do have radar):
Giga Shanghai follows Fremont's lead and ditches radar

For all the flak Tesla gets in the forums for phantom braking, for removing radar with Tesla Vision, and all the FUD about FSD Beta running over child dummies, you would think it has the least capable, most unsafe ADAS system. These tests show how the Tesla system is tuned to be safer in more scenarios (at the cost of false positives).

Test results (out of 10 tests):
Tesla Model 3: 10 good
Zeekr 001: 4 good, 2 average
Changan SL03: 2 good, 1 average
BYD Han EV: 1 good
BYD Seal (Atto 4): 2 average

Lighting | Scenario | BYD Han EV | BYD Seal (Atto 4) | Changan SL03 | Tesla Model 3 | Zeekr 001
Day | Follow car at red light | Poor | Poor | Average | Good | Average
Day | Cross intersection with no car | Poor | Poor | Poor | Good | Poor
Day | Cross intersection with car | Poor | Average | Poor | Good | Poor
Day | Merging car in motion | Good | Average | Good | Good | Good
Day | Pedestrians running red light | Poor | Poor | Poor | Good | Average
Day | Merging car stopped on side | Poor | Poor | Good | Good | Good
Day | Traffic cones | Poor | Poor | Poor | Good | Poor
Night | Left turn into crossing pedestrian | Poor | Poor | Poor | Good | Poor
Night | Crossing bicycle | Poor | Poor | Poor | Good | Good
Night | Disabled vehicle in fog, opposing headlights | Poor | Poor | Poor | Good | Good

Unfortunately they didn't give details on the sensor mix of each car, but they did mention that none of them are equipped with Lidar; rather, they use a mix of cameras, radar, and ultrasonic sensors. From my search, the SL03 was promoted as having Lidar, but it might not be equipped on this particular car (or it didn't end up in production).

The SL03 uses a Qualcomm platform for its ADAS, similar to what GM is looking to move to for Super Cruise.

They didn't test the Nio ET5 (which does have Lidar) because it was a preproduction car and didn't have the software ready yet.

The Tesla passed with flying colors, getting a good rating on 10/10 tests, with the others not even coming close. The Zeekr 001 with Mobileye SuperVision came a distant second, although in 2 tests the reviewers noted it did better than the Tesla: in the night bike-crossing test it was able to perform an "antelope avoidance" maneuver instead of just braking, and in the disabled smoking-car test it was able to turn on the fog lights automatically as it approached (an improvement after the latest software update; in a previous test it did poorly responding to a vehicle in fog).

Tesla could ramp down the sensitivity and reduce phantom braking, but it would likely impact the results in safety tests; heck, there's probably a simple slider for this in the dev tools.
Ah the old slider theory, much maligned here.

Anyway, it is sad that the perception & prediction are so poor/noisy that false positives are even an issue (it seems to be an issue for much more sophisticated systems too!). Attentive humans have very few false positives.

Getting the perception accurate and precise enough to make false positives a non-issue seems like it should be a priority.

What are we even doing, if not trying to get perception as good as a human's? We seem pretty far from it so far.

Oh well.

It’s very interesting how difficult this problem seems to be (for everyone).

About four years ago I thought here that we would be there in another 5-10 years. I think I may have been way too optimistic. It doesn't even seem like the safety features really work yet, let alone the more difficult task of driving.

Six years for excellent performance on these tests? Not so sure. Hopefully!
My own opinion is that phantom braking is a result of heightened sensitivity in the OEDR, so you'd expect the system to perform really well in tests like these. Tesla could ramp down the sensitivity and reduce phantom braking, but it would likely impact the results in safety tests; heck, there's probably a simple slider for this in the dev tools.
That's my feeling too. Tesla could probably easily make TACC/Autosteer work like other manufacturers' systems by tuning down the sensitivity and the response to so many stimuli: basically, focus only on things in the immediate lane (allowing for common cases like merging cars) and ignore pedestrians, debris, and other objects. Maybe even disable speed-limit-based changes and map-based speed reductions (like for curves), leaving the decision to slow down/speed up completely to the driver. But that's about as likely as them adding back a dumb cruise control option, as it would be like admitting defeat in terms of eliminating false positives.
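The "slider" intuition above is really just the classic detection-threshold trade-off. As a purely illustrative toy model (nothing to do with any real OEDR implementation; all numbers and distributions are made up), lowering a single confidence threshold catches more real hazards but also brakes for more clutter, and raising it does the reverse:

```python
# Toy sketch of the sensitivity/phantom-braking trade-off.
# NOT any real ADAS logic: the score distributions are invented
# just to show how one threshold moves both error rates.
import random

random.seed(0)

# Simulated perception confidence scores: real obstacles tend to
# score high; clutter (shadows, overpasses, plastic bags) scores
# lower but is noisy enough to sometimes cross a braking threshold.
real_obstacles = [random.gauss(0.8, 0.1) for _ in range(1000)]
clutter = [random.gauss(0.4, 0.15) for _ in range(1000)]

def brake_stats(threshold):
    """Count missed hazards and phantom brakes at a given threshold."""
    missed = sum(1 for s in real_obstacles if s < threshold)
    phantom = sum(1 for s in clutter if s >= threshold)
    return missed, phantom

for t in (0.5, 0.6, 0.7):
    missed, phantom = brake_stats(t)
    print(f"threshold={t}: missed hazards={missed}, phantom brakes={phantom}")
```

Sweeping the threshold upward trades phantom brakes for missed hazards, which is why detuning a system to stop phantom braking would plausibly cost it points on tests like these.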