I just drove around with the HUD on for a while to figure out exactly why they suck.
The problem with HUDs in the real world is that they cannot detect what background they are being projected against. They would need an eye-level video camera synced to your exact POV. What some HUDs do to minimize this defect is crank up the intensity, which is not a good solution since it makes real objects invisible behind the HUD image.
Instrument panels can control their background contrast; HUDs cannot.
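To put rough numbers on the contrast problem (the luminance values below are my own assumptions for illustration, not measurements), here's a sketch of why fixed-brightness symbology that glares at night can wash out against a sunlit scene. A windshield HUD is additive: the symbol's light is layered on top of whatever the scene already provides, so what matters is roughly the Weber contrast of (scene + symbol) versus scene.

```python
# Back-of-the-envelope sketch; all luminance numbers are assumed, not measured.
def hud_contrast(symbol_cd_m2: float, scene_cd_m2: float) -> float:
    """Weber contrast of an additive HUD symbol against the scene behind it."""
    return ((scene_cd_m2 + symbol_cd_m2) - scene_cd_m2) / scene_cd_m2  # = symbol / scene

symbol = 3000.0  # assumed peak HUD symbol luminance, cd/m^2

for name, scene in [("night road", 1.0),
                    ("daylight road", 5000.0),
                    ("driving into the sun", 30000.0)]:
    print(f"{name:22s} contrast ~ {hud_contrast(symbol, scene):,.2f}")

# night road             contrast ~ 3,000.00  (symbols blaze and hide what's behind them)
# daylight road          contrast ~ 0.60      (readable, but marginal)
# driving into the sun   contrast ~ 0.10      (symbols effectively vanish)
```

Cranking the symbol luminance up just pushes the night-time number even higher, which is the "makes objects invisible" side of the same trade-off.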
While the problem is worse at night, it still occurs in daylight. When a HUD works, it works fine, but you do not get to pick when it doesn't. Driving into the sun aggravates it even more.
And what many people do not understand is how the human eye works. They believe your eye sees everything in front of you at once. It doesn't: your eye has a high-resolution field (the fovea) only in the very center of your line of sight.
Here's an experiment you can do at your desk:
Stare at some text you can read that is 2 feet away. Now, without moving your eyes, try to read the text just 1" below it. Notice that you can't? Your eye does a magic trick to overcome this: it is constantly jumping around (saccading) to fill in the low-resolution areas in front of you, and each jump takes only milliseconds. That is why it takes no longer to glance at the instrument cluster than at the HUD text.
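A quick sanity check on that experiment (the roughly 2-degree foveal span is a standard textbook figure; the rest is just trigonometry): a 1" drop at a 2-foot viewing distance puts the text about 2.4 degrees off your line of sight, which is already outside the sharp central zone.

```python
# Sketch: how far off-center is 1" of offset at 2 feet, and is that still in the fovea?
import math

viewing_distance_in = 24.0   # text ~2 feet away
offset_in = 1.0              # the 1" drop from the experiment
fovea_radius_deg = 1.0       # high-acuity fovea covers roughly +/-1 degree of gaze (textbook figure)

offset_deg = math.degrees(math.atan(offset_in / viewing_distance_in))
print(f'1" below fixation at 2 ft is about {offset_deg:.1f} degrees off-center')
print("still inside the fovea" if offset_deg <= fovea_radius_deg
      else "outside the fovea -> too blurry to read without an eye movement")
# 1" below fixation at 2 ft is about 2.4 degrees off-center
# outside the fovea -> too blurry to read without an eye movement
```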
Yeah, a HUD does sound like an improvement over instruments, but reality screws up the theory.