f1ydave said:
It is not my intention to sound condescending and apologize in advance if I come across that way.
:evilgrin0040:
f1ydave said:
Like I said, please do your own research, but since you refuse too. I'll lay it out for you.
Oh. I'm getting excited.
f1ydave said:
You admitted to understanding that Digital Ballast are more efficient, indirectly implying your understanding whyt Magnetic Ballast are inefficient. I suspect you still don't understand why this is.
Actually, f1ydave, your post reveals that you don't understand why this is. We'll get there.
f1ydave said:
Do you notice when you are drinking cleaner water?
I'm not sure you could pick a less appropriate or more bizarre comparison . . .
f1ydave said:
For example, say the 240v line fluctuates (208v/240v) and drops just 20v to 220v, the ballast has just lost 20% of its effective power output, if it drops to 210v, that is a 30% drop, thus "dimming" the light
Um. Wow - of the several paragraphs that reveal what you don't know about electricity or electric loads, this might be the most revealing -
See - f1ydave - when an electric load that consumes a steady amount of power is placed into a circuit, a voltage drop results in a higher draw of amperes to compensate for that drop - and the wattage remains the same.
f1ydave said:
(Power = Voltage X Current)
^^^ That.
f1ydave said:
Since power constantly fluctuates you have a constant uneven powerband effecting the ballast, power typically fluctuates 10% on local grids throughout the day, with many other factors that can effect that further, i.e. neighborhood power draw, business power draw, peak power, weather, quality; just to name a few.
Of the various forces you refer to that cause "neighborhood" voltage to drop - none of that really matters at some 10% of 120V.
10% is the typical tolerance that (non-integrated) electrical components are designed with - so this is probably the worst number you could've used . . .
But anyway, the load will compensate immediately via amp draw, and the device will not stop working or produce erratic output if the voltage jumps around from 105 - 130V.
Whatever the input voltage, the output at the bulb will still be 1000w - especially at 10%+/- of 120 or 220V.
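To put actual numbers on that - here's a quick Python sketch. The 1000w figure and the 105 - 130V swing are just the numbers already in this thread, and "constant-power load" is the idealization a regulated ballast approximates:

```python
# Constant-power load: the ballast regulates so output power stays fixed,
# and current draw compensates for any change in supply voltage (P = V * I).
POWER_W = 1000.0  # rated output, per the example in this thread

def amp_draw(voltage):
    """Current a constant-power load pulls at a given supply voltage."""
    return POWER_W / voltage

for v in (105, 120, 130):
    i = amp_draw(v)
    print(f"{v}V -> {i:.2f}A -> {v * i:.0f}W")
```

Voltage sags, amps rise, and the wattage at the load stays at 1000 the whole way across the swing.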
f1ydave said:
With a magnetic ballast this will be much higher, most cases it will be 150-300w or could be worse depending on the brand/quality.
Naww. Any of the magnetic ballasts built in the last 10 years or so are very efficient by historical comparison - I think you'll find magnetic ballasts typically run with losses of only about 9 - 12%.
So I agree for large scale growers, across so many lights, there are some significant cost advantages - but you are overstating the numbers a bit.
f1ydave said:
With a Digital Ballast, you do not have these fluctuations. It is a constant 1000w of output at all times. The ballast acts as a voltage regulator.
^^^ Um. Sort of.
f1ydave said:
Digital Ballast are so efficient they even require a lot less power to operate.
"Efficient" means "less power to operate" - so I'm not sure what this circular sentence is supposed to mean . . . but let's get to what you are trying to convey:
And the voltage regulator doesn't have anything to do with the electric efficiency -
A digital ballast has a voltage regulator in it - so the integrated circuitry - which does need a precision input voltage (unlike a giant magnetic core) - gets a constant voltage with which to operate.
Furthermore, voltage regulators use power themselves, so by virtue of the voltage regulator alone, the digital ballast would actually be less efficient than its magnetic counterpart if it were not for the one salient feature of a digital ballast that you, f1ydave, have completely missed:
See - f1ydave - the comparative efficiency of a digital ballast comes from the fact that it has far less mass to energize - less mass, less electrical resistance, more efficient electrical usage.
The total electrical resistance of eight to sixteen ounces of silicon plated with platinum and gold is far less than the combined electrical resistance of ten or fifteen pounds of steel and copper wire core.
It's sort of a basic fact of electronic gear - devices based on integrated circuits are always more efficient than devices based on coils.
Get it? It has almost nothing to do with output frequency, input voltage, voltage regulators, or anything else.
Now - let's get back to operating frequency:
f1ydave said:
You admitted that the bulb light is on longer, but also claim the plants wouldn't notice. Do you really think the plants wouldn't notice a better/longer light source?
Again - you clearly don't understand what it is you're talking about.
The bulb isn't "on longer" - it's just energized more times per second - with less energy per time - to keep the bulb lit.
So - a bulb operating at 60hz would get the top end of the sine-wave 60 times per second.
Where a bulb operating at 60khz would get the top end of the sine-wave 60k times per second - at roughly 1/1,000th of the energy per cycle.
But the cumulative amount of power applied to the bulb over that second - the peaks of the sine-wave summed together - is still the same.
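That arithmetic is easy to sanity-check: power delivered is cycles-per-second times energy-per-cycle, so if the energy per cycle shrinks by the same factor the frequency grows, the total is unchanged. A toy Python check (the per-cycle energy figure is derived from the 1000w example, nothing more):

```python
# Power delivered = cycles per second * energy per cycle.
# If energy-per-cycle scales inversely with frequency, total power is unchanged.
def power_watts(freq_hz, energy_per_cycle_j):
    return freq_hz * energy_per_cycle_j

base_energy = 1000.0 / 60.0  # joules per cycle needed for 1000 W at 60 Hz

low  = power_watts(60, base_energy)             # magnetic ballast: 60 Hz
high = power_watts(60_000, base_energy / 1000)  # digital: 60 kHz, 1/1000th per cycle

print(f"{low:.1f} W vs {high:.1f} W")
```

Same wattage at the bulb either way - only how it's delivered changes.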
So bulbs operating at the higher frequencies tend to have better bulb-life, since 60khz is easier on the internal wiring of the bulb than 60hz - but that's it.
And if you think chlorophyll cares, then you have some serious misunderstandings about photosynthetic function.
So, just like I said, the difference is in the numbers - not the bud. And that's about it.
f1ydave said:
I hope you have taken something away from this and you spread your new found knowledge of ballasts to those who need it!
Ha. Ahaha.
ROFL.
:tongue0011:
Back at ya homie.