I don't know that purple is bad; it's just not as good as some others, so it comes down to dollars versus benefit. If the purple cast in your pictures is the only issue, just turn the purple lights off when you take photos and nothing will look purple. The sun is the best, barring the dollars to make that happen; full spectrum is the closest to the sun (or at least they try to make it so), but again the dollar-to-benefit value has to be weighed.
Just check the usage at the wall; then you'll have the proper numbers. To estimate an LED's power draw, multiply the LED's voltage (I'm only guessing at yours, but with a tad of experience; that rocket-science thing) by the LED's current (pretty well known: Samsung diodes run about 65 mA, while a generic 5 mm LED maxes out around 20 mA, but that's a guess too). The result, in watts, is what one LED uses; multiply by the number of LEDs. So if your LED runs at 3.6 V and 20 mA, it uses 72 milliwatts; times 120 LEDs that's about 8.64 watts per bulb, times 8 bulbs is about 70 watts total. The same math at 65 mA comes out to about 225 W for all 8, so even if your diodes are the "good ones" (the Samsungs), it's still only about 225 W with all 8 bulbs. But again, that's a guess (with a modicum of experience); just measure, then you know! As for the labels, LEDs are all labeled that way because the regulations say so. It's like a "feels like" temperature: a "lights like" number. What you showed us doesn't say "true" or "real" watts; it just shows what the industry standard says it should, which is the "lights like" equivalent they're told to use.
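That arithmetic can be sketched out in a few lines. The 3.6 V forward voltage, the two current guesses (20 mA generic vs. 65 mA Samsung), and the 120-diodes-per-bulb, 8-bulb setup are all the assumed numbers from above, not measurements:

```python
# Rough LED power estimate: watts = volts * amps per diode, scaled up.
# All input numbers are guesses/assumptions, not wall-meter readings.

def total_watts(volts, amps, leds_per_bulb, bulbs):
    """Estimated combined draw in watts for all bulbs."""
    return volts * amps * leds_per_bulb * bulbs

generic = total_watts(3.6, 0.020, 120, 8)   # generic 5 mm diodes
samsung = total_watts(3.6, 0.065, 120, 8)   # Samsung-class diodes
print(f"generic 5mm diodes: {generic:.1f} W")   # ~69 W
print(f"Samsung diodes:     {samsung:.1f} W")   # ~225 W
```

A kill-a-watt style meter at the wall will still beat any of these estimates, since drivers and real drive currents vary.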
If you've had them a while, you can check your power bill for usage: look at what it was pre-lights, then note the usage for the month you had them running (if it really is 640 watts, that will definitely be noticeable on the bill!). You'll have to do some calculations, but you can come up with a close estimate. Just remember PIE: P = IE (power = current times voltage).
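A quick sketch of that bill estimate. The 640 W figure is from the post; the 12-hour daily light schedule and 30-day month are assumptions you'd swap for your own:

```python
# Rough monthly bill impact from a known wattage (via P = IE).
# 640 W is the figure discussed above; 12 h/day is an assumed schedule.

def monthly_kwh(watts, hours_per_day, days=30):
    """Estimated extra kWh on the bill for the month."""
    return watts * hours_per_day * days / 1000.0

kwh = monthly_kwh(640, 12)
print(f"extra usage: {kwh:.0f} kWh/month")   # ~230 kWh
```

Multiply that kWh figure by your utility's rate per kWh and you have the dollar cost to weigh against the benefit.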