To spell out mongo's last post: say the meter injects 0.2 mA DC as suggested, and you are measuring the resistance of a 400 ohm resistor. It will see (measure) 80 mV DC across the resistor, do the math (R = V/I), and read 400 ohms on the display.
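If it helps, here's that math as a quick Python sketch (the 0.2 mA figure is just the test current assumed above; real meters use various currents):

I_TEST = 0.2e-3                      # assumed injected test current, amps
R_DUT = 400.0                        # resistor under test, ohms

v_seen = I_TEST * R_DUT              # 0.080 V = 80 mV at the probes
print(f"{v_seen * 1e3:.0f} mV -> {v_seen / I_TEST:.0f} ohms")   # 80 mV -> 400 ohms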
Now suppose you take 10 of those resistors in series and hook them up to a 1.5 V alkaline battery. Each resistor should drop 150 mV (assuming negligible internal battery resistance; not sure if that's accurate here), and the battery current would be 1.5 V / 4000 ohms = 0.375 mA. Now try to measure the resistance of one resistor. Simplifying a bit (and ignoring the fraction of the meter's injected current that returns through the other nine resistors and the battery), the meter will see either 150 + 80 = 230 mV or 150 - 80 = 70 mV of DC voltage, depending on the polarity of the battery relative to the meter, so it should read either 1150 ohms or 350 ohms, respectively. [Actually, for the 70 mV case it could tell the voltage polarity is wrong and conceivably throw an error. Not sure.]
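Same sketch for the series-string case, under the same simplification (all of the meter's current assumed to flow through the one resistor under test):

I_TEST = 0.2e-3                      # assumed test current, amps
V_DROP = 0.150                       # battery-driven drop across each resistor, volts
V_METER = I_TEST * 400.0             # 80 mV contribution from the meter's own current
for v_seen in (V_DROP + V_METER, V_DROP - V_METER):   # battery aiding vs. opposing the meter
    print(f"{v_seen * 1e3:.0f} mV -> {v_seen / I_TEST:.0f} ohms")   # 230 mV -> 1150, 70 mV -> 350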
Now for the case of the OP, it's possible the saving grace here is that the external source is AC, not DC. So if the AC grid voltage has no DC offset of its own, and if the multimeter samples the voltage fast enough and then averages those samples over a long enough period, that averaging will cancel out the AC component. The meter would then see only the DC offset resulting from its injected current, and maybe it would work.
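A rough numeric check of that averaging idea (the 0.5 V AC amplitude and the sample rate here are made up purely for illustration):

import math

I_TEST, R_DUT = 0.2e-3, 400.0
v_dc = I_TEST * R_DUT                # 80 mV DC offset from the meter's own current
f, fs, cycles = 50.0, 10_000.0, 100  # mains frequency, sample rate, whole cycles averaged
n = int(fs / f * cycles)
samples = [v_dc + 0.5 * math.sin(2 * math.pi * f * k / fs) for k in range(n)]
v_avg = sum(samples) / n             # sine averages to ~0 over whole cycles, leaving the DC
print(f"average = {v_avg * 1e3:.2f} mV -> {v_avg / I_TEST:.0f} ohms")   # ~80 mV -> ~400 ohms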
But regardless, as the manual instructs, just turn the power off before making any resistance measurements.
Cheers, Wayne