I think there is some confusion going on in this thread.
By running 240v as opposed to 120v you ARE allowed to run more lights off of a given circuit/panel.
Amps = Watts/volts
Watts do not change; voltage and amperage do. Panels and breakers are rated in amps, so amperage is what dictates how much load you can run through a given panel.
Running a single 1k at 120v pulls about 8.3 amps: 1000/120 ≈ 8.3 amps.
Running a single 1k at 240v pulls about 4.2 amps: 1000/240 ≈ 4.2 amps.
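If you want to sanity-check the math yourself, here's a quick sketch in Python (the function name is just mine; note this uses the lamp's rated wattage, and a real ballast will draw a bit more than that):

```python
def amps(watts: float, volts: float) -> float:
    """Current draw for a given load: I = P / V."""
    return watts / volts

print(round(amps(1000, 120), 1))  # ~8.3 A on a 120v circuit
print(round(amps(1000, 240), 1))  # ~4.2 A on a 240v circuit
```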
So take a 200 amp panel, and assume nothing but lights will be on it. It is recommended not to exceed 80% of circuit capacity on a continuous load, so we're working with 160 amps. If we divide 160 (total amps available) by 4.2 (a single 1k at 240v), we get about 38 lights. If we were to run the same lights on 120v, we'd divide 160 by 8.3, since at half the voltage we pull double the amperage... that works out to about 19.2, so 19 lights.
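Same calculation as a sketch, with the 80% derate built in (again, names and the 0.80 factor are just how I'd write it, and real ballast draw will eat into the count a little):

```python
import math

def max_lights(panel_amps: float, volts: float, watts_per_light: float,
               derate: float = 0.80) -> int:
    """How many lights fit on a panel at an 80% continuous-load derate."""
    usable_amps = panel_amps * derate          # 200 A * 0.80 = 160 A
    amps_per_light = watts_per_light / volts   # I = P / V
    return math.floor(usable_amps / amps_per_light)

print(max_lights(200, 240, 1000))  # 38 lights at 240v
print(max_lights(200, 120, 1000))  # 19 lights at 120v
```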
240v is the obvious choice over 120v once you're running more than a 1k or 2k... and even on a small setup, it lets you put more wattage on a given circuit. Lower amperage means a smaller breaker and smaller wire. It lets you wire your setup more efficiently; it doesn't affect the actual efficiency of the lights or their power consumption.
Hope that all makes sense, good luck, and be safe!
Source: I am an HVAC professional/licensed electrician.
Edit: service voltage varies up to 250v depending on your specific area. Always calculate at 240v, because the lower the voltage, the higher the amperage for the same wattage. On larger loads this does make a difference, and you want to stay within 80% of a circuit's rating for a full, constant load.
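A quick illustration of why you size against the low end of the voltage range (240v here) rather than the high end:

```python
# Same 1 kW load across the service voltage range:
for volts in (240, 250):
    print(f"{volts} V -> {1000 / volts:.2f} A")
# 240 V -> 4.17 A  (worst case: size your circuit for this)
# 250 V -> 4.00 A
```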