Today, cameras are perhaps the lowest-cost, highest-bandwidth sensors available in the world. They cost a few dollars each. There was a talk from Amnon Shashua (Mobileye CTO) who mentioned that the cameras they used in their systems (i.e. the AP1 cameras) were only around 5-6 dollars each, and that talk was a couple of years old (so presumably they're even cheaper now). The cameras they use are monochrome, low-contrast and relatively low resolution compared to the cameras in, say, your phone. Relatively low-res and monochrome is actually useful, because it's more efficient to process and classify per-frame than hi-res/colour. Crucially, successful computer vision doesn't actually need colour, nor does it particularly benefit from higher resolutions.
Tesla's HW2 cameras are mostly monochrome (you can see this in the Tesla Vision promo video), presumably with the exception of the rear camera (because mine's in colour!).
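To put some rough numbers on that efficiency claim, here's a back-of-envelope sketch of the multiply-accumulates in a single convolutional layer for a low-res monochrome frame versus a hi-res colour one. The resolutions and layer sizes are illustrative assumptions of mine, not actual Tesla or Mobileye specs:

```python
# Rough per-frame cost of the first conv layer of a detection network:
# 1-channel low-res input vs 3-channel hi-res input.
# All numbers are illustrative assumptions, not real camera specs.

def conv_macs(height, width, in_ch, out_ch=32, kernel=3):
    """Multiply-accumulates for one 'same'-padded convolution."""
    return height * width * in_ch * out_ch * kernel * kernel

mono_lowres = conv_macs(960, 1280, in_ch=1)    # assumed ~1.2 MP monochrome
rgb_hires   = conv_macs(2160, 3840, in_ch=3)   # assumed 4K colour, phone-style

print(f"mono low-res: {mono_lowres / 1e9:.2f} G-MACs")
print(f"RGB 4K:       {rgb_hires / 1e9:.2f} G-MACs")
print(f"ratio:        {rgb_hires / mono_lowres:.0f}x")
```

Under these assumptions, the colour hi-res frame costs about 20x the compute in just the first layer, before you even get to the rest of the network.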
Anyway, compare this to other sensors commonly used in self-driving tech. The high-end Velodyne LIDAR is still very expensive: around $8k for the cheapest puck, and more like $30k for a unit capable enough to be genuinely useful. The Bosch radar that Teslas use is cheaper, but still reportedly a couple of thousand dollars (Bosch Mid Range Radar (MRR) Sensor - System Plus Consulting). I'm sure these figures are lower in reality, due to volume discounts and corporate deals etc., but still: comparatively, the cameras are extremely cheap.
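Putting those ballpark figures side by side (treating the quoted prices as-is, so this is illustrative rather than anything like a real bill of materials):

```python
# Back-of-envelope sensor costs using the rough figures quoted above.
# These are public ballpark numbers, not actual volume pricing.

camera_cost = 6      # dollars each, per Shashua's AP1 figure
radar_cost = 2_000   # reported ballpark for the Bosch MRR
lidar_cost = 8_000   # cheapest Velodyne puck; a capable unit is ~30k

eight_camera_suite = 8 * camera_cost
print(f"8-camera suite: ${eight_camera_suite}")                          # $48
print(f"one radar:  {radar_cost / eight_camera_suite:.0f}x the suite")   # ~42x
print(f"one LIDAR:  {lidar_cost / eight_camera_suite:.0f}x the suite")   # ~167x
```

Even the cheapest LIDAR puck costs two orders of magnitude more than an entire 8-camera array.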
Of course, doing per-frame detection on 8 camera feeds simultaneously is a strenuous computational task. This is why it makes sense to discard colour and keep the cameras relatively low-res... otherwise the throughput requirements go through the roof. However, I'd have thought low-light and HDR cameras would be useful in a lot of situations, but they'd probably require separate optics, which would get more expensive (both financially and computationally) very quickly.
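For a sense of scale, here's the raw (uncompressed) data rate of 8 simultaneous feeds under the same assumed resolutions as before, again illustrative numbers rather than Tesla's actual camera specs:

```python
# Raw data rate for 8 simultaneous camera feeds, monochrome low-res
# vs colour hi-res. Resolutions and frame rate are assumptions chosen
# for illustration, not actual specs.

def suite_rate_mb_s(height, width, bytes_per_px, fps=30, feeds=8):
    """Uncompressed throughput of the whole camera suite, in MB/s."""
    return height * width * bytes_per_px * fps * feeds / 1e6

mono_lowres = suite_rate_mb_s(960, 1280, bytes_per_px=1)   # 8-bit mono
rgb_hires   = suite_rate_mb_s(2160, 3840, bytes_per_px=3)  # 24-bit 4K colour

print(f"8x mono low-res: {mono_lowres:,.0f} MB/s")   # ~295 MB/s
print(f"8x RGB 4K:       {rgb_hires:,.0f} MB/s")     # ~5,972 MB/s (~6 GB/s)
```

That's roughly a 20x difference in raw bandwidth before any processing at all, which is exactly the "through the roof" problem.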