No, it is not just semantics, and no, it is not because of varying voltage from the utility. The nominal, rated, no-load voltage for residential power in the US is 120V per leg, period, end of story. Utilities run voltage regulators on their distribution equipment and are required (per ANSI C84.1) to hold the service voltage within about ±5% of that nominal value. Local variations are temporary and are caused by large reactive load shifts.
The voltage you see at the wall outlet depends on the load on the circuit, and to some degree on the load in your house, or even on your street. However, the proper voltage on an unloaded circuit is 120V, not 110V. You may plug in, read 120V, and then when the circuit is loaded to 15A the voltage will drop along the circuit's wires, leaving 120V at the panel but only 114V at the outlet. Remove the 15A load and the outlet voltage goes back up to 120V. The National Electrical Code (NEC) addresses how much voltage drop is acceptable on a fully loaded circuit: its informational notes recommend no more than a 3% drop on a branch circuit (5% for feeder and branch combined). Drops beyond that indicate undersized wiring, bad connections, and so on. The sketch below shows the arithmetic.
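To put numbers on that, here is a minimal sketch of the drop calculation. The wire gauge, run length, and resistance figure are my assumptions for illustration (14 AWG copper at roughly 2.525 ohms per 1000 ft), not anything given above:

```python
# Rough voltage-drop estimate for a copper branch circuit.
# Assumed for illustration: 14 AWG copper (~2.525 ohms/1000 ft),
# a 50 ft one-way run, and a 15 A load.

def voltage_drop(current_a, one_way_ft, ohms_per_kft=2.525):
    """Drop across the full loop (out and back), in volts."""
    loop_resistance = 2 * one_way_ft * ohms_per_kft / 1000.0
    return current_a * loop_resistance

source_v = 120.0                 # nominal no-load voltage at the panel
drop = voltage_drop(15.0, 50.0)
outlet_v = source_v - drop
percent = 100.0 * drop / source_v

print(f"drop: {drop:.1f} V  outlet: {outlet_v:.1f} V  ({percent:.1f}%)")
# drop: 3.8 V  outlet: 116.2 V  (3.2%)
# Already past the NEC's recommended 3% branch-circuit figure, so a
# longer or thinner run would be flagged as excessive drop.
```

Since the drop is just I×R, it scales linearly with both the load current and the length of the run.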
Just use the proper reference values. 120V per leg (240V leg to leg on the residential split-phase service; calling it "two-phase" is a common misnomer, since both legs come from one center-tapped transformer winding) is correct. Saying "110" or "220" echoes older nominal values (110V originally, later 115V) that were raised to today's 120V decades ago.
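If a quick numeric check of the split-phase relationship helps, here is one; this is just illustrative Python with numpy, not anything from a standard:

```python
import numpy as np

def rms(v):
    return np.sqrt(np.mean(v ** 2))

t = np.linspace(0.0, 1.0 / 60.0, 10_000)       # one 60 Hz cycle
peak = 120.0 * np.sqrt(2.0)                     # ~170 V peak <=> 120 V RMS
leg_a = peak * np.sin(2.0 * np.pi * 60.0 * t)   # one half of the winding
leg_b = -leg_a                                  # other half, 180 degrees out

print(f"leg to neutral: {rms(leg_a):.1f} V RMS")          # ~120
print(f"leg to leg:     {rms(leg_a - leg_b):.1f} V RMS")  # ~240
```

Both legs are derived from the same single-phase winding, which is exactly why "split-phase" is the right term: you get 240V across the legs without there ever being a second phase.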