Welcome to Tesla Motors Club

Tesla is dumping Mobileye???

I've briefly played with image recognition algorithms (pre neural network), and the first thing you generally do is throw away the color information, compress the histogram, and enhance contrast for edge detection. I have no idea if this is what Mobileye is doing, but by having a monochrome camera you reduce the color information to a single channel (essentially giving you a tinted grayscale image), which I think would give you a head start on that process. That's just a rough guess at what's happening.
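For the curious, that classic pre-neural-network pipeline (drop color, stretch the histogram for contrast, then look for edges) can be sketched in a few lines of NumPy. This is purely illustrative and nothing to do with Mobileye's actual code; the synthetic image, luma weights, and Sobel kernels are my own choices.

```python
import numpy as np

# Synthetic 8x8 RGB image with a vertical bright/dark boundary.
img = np.zeros((8, 8, 3), dtype=np.float64)
img[:, 4:, :] = 255.0

# 1. Throw away color: collapse RGB to a single luma channel.
gray = img @ np.array([0.299, 0.587, 0.114])

# 2. Stretch the histogram to the full 0..1 range for contrast.
lo, hi = gray.min(), gray.max()
contrast = (gray - lo) / (hi - lo) if hi > lo else gray

# 3. Sobel gradients for edge detection.
kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
ky = kx.T

def conv2(a, k):
    """Naive 3x3 convolution with zero padding (clarity over speed)."""
    out = np.zeros_like(a)
    p = np.pad(a, 1)
    for i in range(a.shape[0]):
        for j in range(a.shape[1]):
            out[i, j] = np.sum(p[i:i + 3, j:j + 3] * k)
    return out

# Gradient magnitude peaks along the column-3/4 brightness boundary.
edges = np.hypot(conv2(contrast, kx), conv2(contrast, ky))
```

The edge map lights up exactly where the brightness changes, which is the kind of signal those older algorithms fed into their object detectors.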

Probably no color channels on these sensors. You were just adapting data from a Bayer-filtered sensor, as is found on typical color cameras.
 
"Tesla believes". Tesla had no choice. They started late and leveraged Mobileye technology. It will not be Tesla's choice to determine how many people they get to kill testing autonomous driving. One reason to start on the slow-speed end, like the Google car, is fewer customer decapitations.
One customer decapitated in 130 million miles of use, where that customer was apparently trying to get himself killed by not paying attention to what was in front of him on the road, is a pretty positive record for the technology they have chosen to use.
 
I'm not clear what this means. Is everything from South Sudan off limits? Could, say, the headliner material be from a country that has conflict minerals but not itself be a conflict material? What are 3TG minerals?
Not trying to be argumentative - please just educate me a tad on this important issue.
Wow! I was away from TMC for a couple of days and now there are multiple pages of posts since you posted your question. A number of the other posters covered the main points, but there are a couple of things I'll add.

For companies that want to follow the spirit of this regulation, they make sure to communicate that "conflict-free" does not equal "DRC-free". The goal of the legislation is to promote responsible sourcing practices in the DRC region, not to create an embargo of minerals from the region. Companies like Apple, Tesla & my employer communicate this in their Conflict Minerals policy & in their annual Form SD & Conflict Minerals Report (CMR).

The whole point of the process is to determine where the 3TG content is originating from. To do that, data is tracked at the smelter level. The Conflict-Free Sourcing Initiative (CFSI) has created a program to audit smelters to validate that their 3TG content can be traced back to mines that are not funding the armed groups operating in the DRC. Tesla, as a CFSI member, has access to the data that the CFSI has gathered regarding the country of origin of the 3TG content for smelters that have been audited. However, the CFSI does not publish the specific country that is the source of the minerals.

The legislation applies not only to the DRC, but also to the surrounding region. This includes all the countries that border the DRC. There is smuggling that goes on across country borders, which is why those countries are included within the list of the "covered countries" as part of this rule. This is where South Sudan comes into play. South Sudan is one of the "covered countries". However, South Sudan does not have any sources of 3TG content (see pages 11 & 12 of this PDF from Claigan). As is shown on page 12, listing South Sudan as a country of origin is likely an error due to companies listing all of the covered countries as countries of origin. Companies should not only rely on the groupings of countries provided by the CFSI for mineral sourcing, but also do additional investigation to trace back to the actual country of origin. This is an extremely difficult process. Because of how difficult it is, the Executive Team that I report to regarding my employer's conflict minerals process has decided that we will not list smelters or countries of origin in our report, despite pressure from NGOs to disclose that information, because we believe that the data quality is too poor.
 
One customer decapitated in 130 million miles of use, where that customer was apparently trying to get himself killed by not paying attention to what was in front of him on the road, is a pretty positive record for the technology they have chosen to use.


I agree that the safety record so far looks pretty good. What happens with a better system that people trust more, in the hands of less sophisticated drivers?

Well, we are going to find out. It's a big experiment that Google is performing at 25 mph with trained employees and Tesla is doing at 75 mph with customers. But as long as Tesla customers are only killing themselves, the blowback may not be too bad. Few people blame the motorcycle manufacturer for stupid bike deaths.
 
The problem with radar is probably resolution. Is that a short fat person about to step into the street, or a mailbox?
The idea is to combine the camera and radar, with the radar providing accurate position and velocity information plus approximate size. The image sensing would use that information to locate what it was seeing. This is likely much more accurate than relying on optical stereo vision with a small separation between lenses.
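As a toy illustration of that fusion idea: if the radar reports a target's range and bearing, a simple pinhole-camera model tells the vision code roughly which image column to examine. The focal length and principal point below are made-up calibration values for illustration, not anything from a real Tesla sensor suite.

```python
import math

# Assumed intrinsics for a hypothetical 1280px-wide forward camera.
FOCAL_PX = 800.0   # focal length in pixels (illustration value)
CX = 640.0         # principal point, x (image centre)

def radar_to_pixel(range_m, bearing_rad):
    """Map a radar detection (range, bearing) to a horizontal pixel
    coordinate, so the image classifier knows where to look."""
    x = range_m * math.sin(bearing_rad)   # lateral offset, metres
    z = range_m * math.cos(bearing_rad)   # forward distance, metres
    return CX + FOCAL_PX * x / z

# A target 50 m ahead, 2 degrees to the right of boresight:
u = radar_to_pixel(50.0, math.radians(2.0))
```

The radar supplies the hard part (distance and closing speed); the camera only has to classify what sits in that small window of pixels.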

The radar used in the Tesla has problems with detecting stationary objects. It can easily detect objects that are moving relative to the background with good enough resolution, but not static objects.
This seems to be a common misconception on these forums. Automobile radar can detect both stationary and moving objects, providing 3D position and velocity as well as approximate size. The horizontal positioning is quite a bit more accurate than the vertical and the distance and velocity information are quite accurate.
 
I never said it can't detect stationary objects; it's just not very good at it, especially at long range. Detecting a moving object is very easy by measuring the Doppler shift.
Yes, that seems to be what many people think. However, the radar used in car applications detects stationary objects with exactly the same accuracy and sensitivity as moving objects. The technology is generally referred to as continuous-wave radar, which is a modification of Doppler radar.
Continuous-wave radar - Wikipedia, the free encyclopedia
If you still think this is a problem, please do some Internet research.
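For anyone who wants numbers behind this back-and-forth: the two-way Doppler shift is f_d = 2·v·f0/c. A quick sketch with a typical 77 GHz automotive carrier (a common band, assumed here rather than a verified Tesla spec) shows why movers are so easy to pick out, and why a plain Doppler measurement alone gives nothing for static targets; the frequency modulation in FMCW-style continuous-wave radar is what adds range to them.

```python
C = 3.0e8     # speed of light, m/s
F0 = 77.0e9   # typical automotive radar carrier frequency, Hz (assumed)

def doppler_shift_hz(closing_speed_mps):
    """Two-way Doppler shift for a target closing at the given speed."""
    return 2.0 * closing_speed_mps * F0 / C

# A car closing at 30 m/s (~67 mph) shifts the return by ~15.4 kHz;
# a stationary object (zero relative speed) shifts it not at all,
# so it must be separated from ground clutter by other means.
shift = doppler_shift_hz(30.0)
```

That clutter problem is exactly the point raised below: the stationary car, the road surface, signs, and overpasses all come back with the same zero Doppler shift.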
 
Mobileye will thrive with the BMW & Intel partnership, independent of Tesla. Tesla will scale AP rather quickly, too. Business as usual. Now, it'll be interesting to see IF Mobileye will sue or litigate, to say the least, over any patent infringements or the like.
 
I think what Cyberax means to say is NOT that the radar can't detect stationary objects well (it can), but that it has trouble discerning the exact nature of the stationary object -- which is where the system falls back to the monochrome camera for whatever image recognition algorithms Mobileye/Tesla are running.

Even in the case of the European left lane stopped van collision, Autopilot knew enough to warn the driver, but still didn't automatically brake. Tesla/ME isn't confident enough about false positives to fully brake for a stationary object that suddenly appears in the car's path. Maybe AP 8.0 or a future version will have some improvements in the algorithm. Musk has already hinted such improvements are possible.
 
My point was (and is) that the radar is quite capable of giving the system the distance information that a dual camera system would. In fact, unless the camera lenses were widely separated, e.g. on the outside edges of the car, I'm pretty sure the radar distance information would be much more accurate at far longer ranges. I can well believe that there would be problems with overlapping vehicles, but that's going to be a problem for a vision based system as well.
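To put rough numbers on the baseline argument: stereo depth is Z = f·B/d, so a one-pixel disparity error produces a depth error of roughly Z²/(f·B), growing with the square of distance. The focal length and baselines below are illustrative assumptions, not measurements from any actual car.

```python
# Assumed focal length in pixels for a hypothetical stereo camera pair.
F_PX = 1000.0

def depth_error_m(z_m, baseline_m, disparity_err_px=1.0):
    """Approximate depth error from stereo disparity quantization:
    dZ ~= Z^2 * d_err / (f * B)."""
    return (z_m ** 2) * disparity_err_px / (F_PX * baseline_m)

# 12 cm baseline (cameras mounted side by side), target at 100 m:
narrow = depth_error_m(100.0, 0.12)
# 1.5 m baseline (lenses at the outer edges of the car), same target:
wide = depth_error_m(100.0, 1.5)
```

With the narrow baseline the 100 m estimate is off by tens of metres, while the wide baseline brings it under 10 m -- which is the point about needing the lenses widely separated before stereo can compete with radar ranging.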

The development issues I suspect are to integrate the radar and camera information and to have enough compute power to build and maintain a real time model of the area around the car. That all sounds like a real software challenge, but not at all impossible, given a strong enough compute system.

A rear facing camera and radar would be a great addition as well.
 
Yes, that seems to be what many people think. However, the radar used in car applications detects stationary objects with exactly the same accuracy and sensitivity as moving objects. The technology is generally referred to as continuous-wave radar, which is a modification of Doppler radar.
Continuous-wave radar - Wikipedia, the free encyclopedia
If you still think this is a problem, please do some Internet research.

I don't see how this distinguishes from other stationary clutter. Yes, the radar "sees" the stationary car but it also "sees" the road and any signs or overpasses.
 
...That all sounds like a real software challenge, but not at all impossible, given a strong enough compute system...

During the 2016 Q2 call, Elon Musk seemed to agree with you that it is not the hardware, it's the software challenge:

"Well, full autonomy is really a software limitation. I mean the hardware is just to create full autonomy, so it's really about developing advanced, narrow AI for the car to operate on....So increasingly sophisticated neural maps that can operate in reasonably sized computers in the car. That's our focus. I'm very optimistic about this. It's exciting, it blows me away, the progress we're making. So I think if I'm this close to it and it's blowing me away, it's really going to blow other people away when they see it for the first time."
 
The media software is not a priority for them. The AP is critical.
@RDoc was just using MP as an example. If you go through the existing software, so much is left unfinished (Navigation, Trip Planner, Media Player, etc.). Many of these are essential driving tools. While I agree that from a marketing point of view AP will sell more cars, these other things are important too. Hoping this will all be addressed in 8.0.
 
We have to remember that Tesla have a lot of different priorities, whereas Mobileye are focused on one thing.

Best case is that Tesla continues to treat Autopilot as "mission critical", and develops a robust and futureproof system, with continuing R&D to make sure that it remains safe, industry-leading, and desirable to have for the long term.

The cynic in me sees this as a premature and high commercial risk solution to the problem; a good example where a strategic partnership with a specialist 3rd party supplier (with a history of developing robust, real-time, life-critical software) would be preferable.
 
I've said it before and I'll say it again: when JB is involved in internal engineering projects, it is awesome. When he's not involved, it barely works. Contrast how well the drivetrain and inverter work (JB) versus the Model S and X door handles, nav software, browser software, etc.
 