How much power in watts or amps is sucked up by the cord itself?
It's a sliding scale. The higher the load, the greater the voltage drop per X feet of Y size wire.
As the voltage drops, the amperage goes up, because amps = watts / volts and the load's wattage stays (roughly) fixed.
So say your fridge draws 800w. If your wire size/length is sufficient so the voltage doesn't drop much, if any, then 800w / 120v = 6.66a load on the wire (and generator).
But if the wire size is smaller (or longer), and with an 800w load, the voltage drops to say 100v - then it'll be 800w / 100v = 8a load on the wire (and generator).
So, it's not like the wire itself consumed an extra 1.34a (8a - 6.66a = 1.34a). Instead, it caused the voltage on the circuit to drop, which caused the entire circuit (wire and fridge) to draw at a higher amperage - but the watts being consumed are still the same.
(Technically, it's not exactly the same - those "missing" volts are a few extra watts being burned off as heat in the wire...but for a quickie explanation, we can ignore that.)
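If you want to poke at the numbers yourself, here's a quick Python sketch of the fridge example above (the 800w load and the 120v/100v figures are from the example; the helper function name is just for illustration):

```python
# A constant-power load draws more amps as the voltage sags,
# but the watts stay the same.
def amps_drawn(watts, volts):  # illustrative helper, not a real API
    return watts / volts

FRIDGE_WATTS = 800
for volts in (120, 100):
    print(f"{FRIDGE_WATTS}w / {volts}v = {amps_drawn(FRIDGE_WATTS, volts):.2f}a")

# Prints:
# 800w / 120v = 6.67a
# 800w / 100v = 8.00a
```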
That same size/length wire would have less voltage drop if the load were only 400w, and more if the load were 1600w.
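To see how the drop scales with the load, you can model the cord as a small series resistance and solve for where a constant-power load settles. This is only a back-of-the-envelope sketch - the 0.8 ohm round-trip resistance is my own assumption (ballpark for roughly a 250 ft run of 12 ga. copper), not a figure from anything above:

```python
import math

def operating_point(v_source, r_wire, p_load):
    # A constant-power load behind a cord with total (round-trip)
    # resistance r_wire. From P = (Vs - I*R) * I we get the quadratic
    # R*I^2 - Vs*I + P = 0; the smaller root is the physical answer.
    disc = v_source**2 - 4 * r_wire * p_load
    if disc < 0:
        return None  # the cord can't deliver this much power at all
    i = (v_source - math.sqrt(disc)) / (2 * r_wire)
    return i, i * r_wire  # amps drawn, volts lost in the cord

R_CORD = 0.8  # ohms round trip - an assumed ballpark value
for watts in (400, 800, 1600):
    result = operating_point(120, R_CORD, watts)
    if result is None:
        print(f"{watts}w: cord resistance too high to deliver this load")
    else:
        amps, drop = result
        print(f"{watts:4d}w -> {amps:5.2f}a drawn, {drop:5.2f}v lost in the cord")
```

Running that gives a drop of roughly 2.7v at 400w, 5.6v at 800w, and 11.8v at 1600w - the drop more than quadruples when the load quadruples, because the extra amps knock the voltage down further, which pushes the amps up further still.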
At some point, you reach the limit of the wire. Say an 1800w load and 12 ga. wire. #12 is good for up to 20 amps.
1800w / 120v = 15a. No problem.
Now let's make the wire longer so the voltage drops to 100v:
1800w / 100v = 18a. Getting close, but not there yet.
Let's make the wire longer still and drop the voltage to 90:
1800w / 90v = 20a. Ding!
If you make the wire any longer, the voltage will drop enough that an 1800w load will exceed a 20a draw - and now you've overloaded the wire.
(Not to mention, that quite a few electrical devices will gag and fall over on their faces if fed 90v.)
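Here's that 1800w walk-through in the same vein - one constant load at progressively lower voltages, checked against the 20a rating of #12 wire (the "getting close" cutoff is just my own illustrative threshold):

```python
LOAD_WATTS = 1800
AMPACITY_12GA = 20  # amps - the #12 wire rating cited above

for volts in (120, 100, 90):
    amps = LOAD_WATTS / volts
    if amps > AMPACITY_12GA:
        note = "overloaded!"
    elif amps == AMPACITY_12GA:  # exact here, since 1800/90 is a whole number
        note = "ding! right at the limit"
    elif amps > 0.85 * AMPACITY_12GA:
        note = "getting close"
    else:
        note = "no problem"
    print(f"{LOAD_WATTS}w / {volts}v = {amps:.0f}a  ({note})")
```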