I have struggled with this question on NVDA. I am a programmer, so I know a little bit about the technology...
Nvidia was a sleeper hit... they make graphics cards, which seemed like an unimportant PC subsystem, but the GPU is turning out to be more important than the CPU... It can be used for games, cryptocurrency mining, AR, VR, and machine learning. The opportunity in front of them is huge. They may be bigger than Intel down the road.
Their advantage, imho, is CUDA. It's a proprietary API that runs on their cards and is considered superior (faster) to competitors and open standards like OpenCL/OpenGL. A lot of open source machine learning software uses CUDA at the lowest levels... so it has an ecosystem around it that is going to be hard to overcome, much like the Windows operating system.
However, there may be competition. Currently AMD is their main competitor, and Intel also recently bought Nervana to compete with them.
Google has written TensorFlow, an open source machine learning library, which runs on CUDA or on Google's own proprietary chips, called TPUs. These chips are not available for sale, but you can use them in the Google cloud. Google cloud has not really taken off yet, and I don't know if it will.
Also, Tesla has been hiring people with expertise in chip making... so they may also build their own chips to run their self-driving/machine-learning applications. Tesla is buying chips from Nvidia in two ways. One is in the cars themselves, to do the self-driving (inference); the other is that they presumably have a data center that processes the (training) data from all the cars and generates the program that is downloaded back to the cars. I suspect Tesla will also need the same chip technology in their factory automation.
Elon recently said that it was a close call when deciding between Nvidia and competitors... I assume that was a bullshit statement for negotiation purposes... Here is a thread where machine learning programmers bemoan Nvidia's dominance and link to papers indicating that CUDA can be around 4x faster than OpenCL:
Deep learning is so dependent on nVidia. Are there any alternatives even on the horizon? • r/MachineLearning