Welcome to Tesla Motors Club

Autonomous Car Progress

Training uses inputs, which wouldn't include lidar, and the "correct answer". The training process involves repeatedly tweaking the weighting coefficients until the NN output best matches the correct answers. Correct answers for image recognition NNs typically come from humans. When Captcha asks you to click on all the pictures with buses, or the squares that include traffic lights, you are providing "correct answers" for Waymo to train against. Captcha does not ask you "how far away is the red car in the picture" or "how fast is it moving" because you'd get it wrong. Lidar gets it right, so you can use lidar data to train your distance and velocity estimation NNs. You can also use high-res radar data, or ideally both together.
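The "repeatedly tweaking the weighting coefficients" loop above can be sketched in a few lines. This is a toy illustration, not any company's actual pipeline: a tiny linear model stands in for the NN, fake image features stand in for camera inputs, and the "correct answers" are distances we pretend came from lidar.

```python
import numpy as np

# Toy sketch: train a distance estimator whose labels come from lidar.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))             # stand-in for per-object image features
true_w = np.array([4.0, -2.0, 1.0])       # hidden relationship we hope to recover
y = X @ true_w + rng.normal(scale=0.1, size=200)  # "lidar ground truth" distances

w = np.zeros(3)                           # weighting coefficients, start at zero
lr = 0.1
for _ in range(500):                      # repeatedly tweak the weights...
    pred = X @ w                          # model's distance estimates
    grad = X.T @ (pred - y) / len(y)      # gradient of mean squared error
    w -= lr * grad                        # ...until output best matches the labels

print(np.round(w, 1))                     # recovered weights, close to true_w
```

With human-labeled Captcha-style data you can only train "is there a bus in this square"; a regression target like the `y` above is exactly what lidar (or high-res radar) provides and humans can't.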
What he's saying, and I'm saying, is that they are not getting the correct answers for distance and speed, because they have neither lidar on the cars nor high-quality surround radar. Hence their NN models would underperform compared to if they did.

But it's not just about distance and speed. There's a whole lot more that could be better if they trained with Lidar and high-res radar.

Lidar doesn't just measure distance; it gives you a 3D snapshot of the object. This could seriously level up performance and accuracy when training all sorts of networks: freespace detection networks, VIDAR, occupancy networks, NeRFs, auto-labeling, you name it. So yeah, there's a lot of potential there that's just waiting to be tapped into.
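To make the occupancy-network point concrete, here's a hypothetical sketch (function name and parameters are mine, not any vendor's API) of how a lidar point cloud becomes a dense 3D occupancy label, the kind of target an occupancy network could train against:

```python
import numpy as np

def occupancy_grid(points, extent=10.0, res=1.0):
    """Voxelize lidar returns. points: (N, 3) array in metres,
    world coords in [-extent, extent); returns a boolean voxel grid."""
    n = int(2 * extent / res)
    grid = np.zeros((n, n, n), dtype=bool)
    idx = np.floor((points + extent) / res).astype(int)  # world -> voxel index
    keep = np.all((idx >= 0) & (idx < n), axis=1)        # drop out-of-range returns
    grid[tuple(idx[keep].T)] = True                      # mark occupied voxels
    return grid

# Two lidar returns occupy two distinct voxels.
cloud = np.array([[0.0, 0.0, 0.0], [3.2, -1.5, 0.4]])
grid = occupancy_grid(cloud)
print(grid.sum())  # -> 2
```

A camera-only stack has to infer this grid from pixels; lidar measures it directly, which is why it makes such a clean supervision signal.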
 
That is not their business; they aren't in the business of collecting images from the fleet, they collect data for their REM. They already have one of the largest automotive data sets.

Ok that's great that we've established that Mobileye doesn't collect raw images or video from their current fleet.

Perhaps they do with their SuperVision fleet but not the legacy front-facing fleet.
 
I told you that already in my original reply to you. They have one of the largest automotive data sets in the world. What does having a large fleet for collecting videos and images allow them to do that they aren't already doing? They supply ADAS systems to manufacturers like Ford, GM, Hyundai, and BMW; they are currently supplying Zeekr with SuperVision; and they are developing their Moovit robotaxi service, testing in Germany and Israel.
 
What do you reckon they have in their dataset if they aren't collecting images and video from their immense fleet?
I provided you with a link that describes what they have in their dataset. But for their REM, which they build using fleet data from various manufacturers, they don't collect videos and images. That doesn't mean they don't have data-collecting vehicles for images and videos. A lot of people don't realize this, but GM has had the capability to collect videos and images from any vehicle with a camera and OnStar service for years.


  • Information about your vehicle: such as license plate number, vehicle identification number (VIN), mileage, vehicle status (such as oil/battery status, ignition, window, and door/trunk lock status), fuel or charging/discharging history, electrical system function, gear status, battery diagnostic and health, and diagnostic trouble codes.
  • Information about the use of your vehicle, including operational and safety related information: such as geolocation, route history, driving schedule, speed, air bag deployments, crash avoidance alerts, impact data, safety system status, braking and swerving/cornering events, event data recorder (EDR) data, seat belt settings, vehicle direction (heading), audio or video information such as information collected from camera images and sensor data, voice command information, stability control or anti-lock events, security/theft alerts, and infotainment (including radio and rear-seat infotainment) system and WiFi data usage.
  • Information about your devices and how you interact with our Connected Services: such as IP address, browser type, unique device identifier, cookie data, energy usage (such as your charging and discharging of electric vehicles and stationary storage), associated identifying and usage information from your mobile phone, laptop, or other device.
 
Mobileye perhaps had the original vision to develop FSD using only cameras and a large fleet.

Unfortunately, they never got their fleet until now. Even now, it's unclear if they can engineer their way to Tesla's FSD achievements.

We can commend Mobileye for their original thought and perhaps Elon saw that and ran with it.

These days, Mobileye mostly releases marketing fluff to invigorate their investors.
 
I'd say it's a defensible position. They are essentially evaluating NAP on the highway, and from a driver-assist perspective, not from a someday-autonomous-vehicle view. Take "collaborative steering", for example, since its lack is a ding against Tesla in this article. Great assist idea, but useless in a working FSD model.

Heck, I'd love to be able to nudge the steering wheel when extra lanes appear to show my preference without disengaging or let FSD choose followed immediately by a lane change I request. But I doubt that's on Tesla's radar.

The Mercedes system is the only other one I've used, and I thought it was well behind Tesla, but it wasn't their L3 system. The article may be accurate.
 
I’m not that surprised, but a bit disappointed, that CR did not include a review of Tesla’s latest offering, since they have it. I guess the highway version probably wasn’t available during the article’s data gathering. I doubt it would have placed on top or anything, but I’m curious where it would land per these requirements. Next year, I guess? Maybe they’ll add it separately.

CR is understandably heavily biased towards safety and not terribly concerned about user experience, so the ratings aren’t too surprising. (Also Tesla stopped nearly all development in 2021.)

I’m curious about how Ford’s BlueCruise works. I should go test drive it sometime, but never will. Disengaging after being stationary for 30 seconds (if I understand correctly) would get old pretty quick, though.
 

@willow_hiller @EVNow @powertoold Hmm, I kind of remember telling you all this...

Mobileye is currently where Tesla was 5 years ago, even worse because they don't have their full stack cars in the USA.
Hmm, this company whose tech is 5 years behind Tesla is somehow being picked by VW and other companies instead of FSD Beta....
Why isn't anyone licensing FSD Beta?
 


"This new effort builds on our strategy of advancing autonomy through evolution, starting from today’s eyes-on, hands-on driver assist systems through SuperVision-based systems that enable hands-off operation for identified use cases, leading to eventual eyes-off, hands-off autonomy."

So they're releasing a level 2 ADAS system, and they believe it has the hardware necessary for greater autonomy, but that will come later in an OTA update. Yep, definitely way ahead of Tesla....
 
I assume you are being sarcastic.

But remember the key difference between the Tesla approach and the Mobileye approach is sensor redundancy. Both Tesla and Mobileye are developing a vision system capable of doing all driving tasks in a full ODD. Both companies are starting with vision-only. The question is do you need to add anything to get to "eyes off"? Tesla believes vision-only is good enough for "eyes off" with just more training, whereas Mobileye believes that adding radar and lidar is needed for "eyes off".

What is your metric for one company being ahead of the other? Surely we can't just look at the number of sensors and say that makes one approach more advanced.

Take a look at the feature-set of SuperVision: Mobileye SuperVision™ | The Bridge from ADAS to Consumer AVs

[Attached image: SuperVision feature list]


Most of these were available with Navigate on Autopilot. And keep in mind this is still very much Level 2. "Equally important however, is that this is an ADAS system, so it still requires human oversight – meaning eyes on the road at all times, even if Mobileye’s “hands” are on the wheel."

So by what metric does this exceed Tesla? Sensor redundancy alone?