"...the voltage and current are always changing sinusoidally."
This clears things up a bit more. Definition of sinusoidally [phonetic: sinus-oi-dally]: A frustratingly slow-running nose.
Who's on first. Watt's on second.
Here is UL's answer to the question of why toy transformers can be listed in either watts or volt-amps. I'm not sure it clears it all up, but it agrees with the reasoning expressed so far. The "ratings as shown below" were in a GIF format I'm still fussing with, but they were pretty mundane.
Jim
Thanks for your question. Toy Transformers are investigated per the UL 697 Standard. For a Listed Toy Transformer, the electrical ratings should be included on the transformer. The requirements for electrical ratings are shown below. The output can be rated in either Watts or VA.

I understand your point with respect to power factor and differences between True Power and Apparent Power. In a circuit having both reactive and resistive components, the Watts will always be less than or equal to the VA rating. However, UL 697 contains a Power Input and Output Test, which is specifically conducted in WATTS. Part of the test checks for verification of the manufacturer's rating (in Watts), and it needs to be within 90%. Further, if a manufacturer rates it in VA, then we assume a best-case power factor of one (1), where Watts would equal VA. Since testing is based on Watts, I don't see the potential for inflation of the nameplate ratings.
I came across this and thought it might help clear things up a bit:
http://www.abbottandcostello.net/clips/watts.wav
Chris
Bob, in the thread titled "4851 Transformer--AC or DC?" I mentioned hoping you could chime in on Lionel's cryptic ratings from the MPC/Fundimensions/Kughn period... 50 watts equals 7.5 VA? I don't think this can be close to right...
Any idea how they made their conversions/interpretations of the power ratings? K-Line did something similar w/ identical transformers - a 110 watt became like 40 or 50 VA w/o changing anything else.
Also, the MW & RS-1 transformers outperform by a long shot any 4150/4850/4851 50-watt transformers. I've used these w/ 2333's, e.g., and they pull very well (better than any 1033/1044). The manuals, especially for the RS-1, seem to interchange the use of "VA" & "Watts", but for our purposes, it seems a 50 VA transformer has A LOT more punch than a 50-watt, or even a 90-watt 1033/1044/4090.
Rob
It's not malicious. It better describes what the transformer can do. If I were to put a 180-watt load with a unity power factor on the 18-volt secondary of a transformer rated for a resistive load of 180 watts, I would draw 10 amperes. But, if my power factor were .5 (60-degree lead or lag), I would draw 20 amperes and destroy the transformer. Specifying the transformer as a 180-volt-ampere transformer instead would tell me that it is designed to put out only 10 amperes into a load of any power factor and that I really need a 360-volt-ampere transformer to get the 180 watts of power that I want for my (very reactive) load.
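In case anyone wants to check the arithmetic, here is a minimal Python sketch of that calculation (the 18-volt secondary and the two power factors are just the numbers from the example above):

```python
def load_current(power_w, volts, power_factor):
    """Current drawn by a load of a given real power and power factor."""
    return power_w / (volts * power_factor)

secondary = 18.0  # volts, from the example above

print(load_current(180, secondary, 1.0))  # 10.0 A at unity power factor
print(load_current(180, secondary, 0.5))  # 20.0 A at power factor .5
print(180 / 0.5)  # 360.0 VA needed to deliver 180 W into that load
```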
Since there is no limit in principle to how low the power factor can be, a transformer would literally have to be designed to handle infinite current for any finite power rating. Toy trains probably have fairly reasonable power factors, not too far below 1; so a transformer rated in watts probably won't be called on for infinite current. But I don't see any harm in specifying it in volt-amperes for those who understand the difference.
You could specify a toy-train transformer as capable of putting out, for example, 180 watts at a power factor of .9. But I think that would be even more of a puzzle than volt-amperes for most customers.
Bob Nelson
Several countries in Europe use the volt-amp rating instead of watts.
As far as using volt-amp ratings on train transformers goes, it sounds to me like the company doing this is trying to deceive people about the true rating of the transformer.
Lee F.
A transformer generates heat from circulating currents induced in the magnetic materials (despite the use of laminations to reduce them). This effect is pretty much independent of the load. It also generates heat from the currents flowing through the wire windings, which have some resistance. This effect depends on the loading, but is significant even with no load because of the magnetizing current in the primary winding. The heat from these effects raises the temperature of the transformer, but slowly, because it has considerable thermal mass. The materials from which it is made put a limit on how hot the transformer can be allowed to get ultimately.
It is perfectly reasonable that a transformer could heat up to that limit after a long period of operation at 180 watts. But it is also reasonable that the same transformer could put out much more power, even the 95 extra watts that Lionel claims for the later ZW, if it has not yet reached its temperature limit. The copper windings would generate about twice as much heat (in proportion to the square of the current) so the temperature would rise faster than with the 180-watt load. But, until the whole thing reached its temperature limit, no harm would be done.
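A one-line check of the "about twice as much heat" figure, assuming copper heating scales with the square of the current (and current with power, at a fixed voltage):

```python
continuous = 180.0  # watts, the long-term rating
short_term = 275.0  # watts, the short-term rating

# I^2 R heating in the windings scales with the square of the current.
print(round((short_term / continuous) ** 2, 2))  # 2.33, roughly double
```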
Notice that the Z and the two ZWs are all rated at 180 watts continuously, but 250, 250, and 275 for short times. That probably reflects someone's judgment that the later transformer's wiring, which does not have much thermal mass, could stand the higher short-term current, rather than any change in the capability of the actual transformer component.
I obviously misunderstood the meaning of peak power. You (two) guys obviously know what you're talking about, and make sense -- even I can grasp the volts, resistance, and current idea.
But my original point was (and is) that the difference between modern and older transformer ratings and actual useful power is a difference in the rating system, not (primarily) the result of warming up (which it seems to me can only account for a very small part of the drop from 275 to 180 watts for a ZW); nor the result of internal losses (input v. output) which, again, could not reasonably amount to 30 or 40 percent. Prewar and postwar transformers from Lionel and others (AF, Marx, Ives, Jefferson, etc.) were rated by peak power, while recent transformers are rated in terms of useable power (or whatever the technical term is).
And just BTW, when I said "the practice continued" into the 1970s, I was referring to toy train manufacturers, not the electrical industry as a whole.
The current flowing is a function of the voltage. If we all agree Peak Voltage is 1.4 times the RMS voltage, then Peak Current MUST be 1.4 times the RMS current for any fixed resistance R. RMS current is nothing more than calculated or measured current based upon the RMS voltage and the load applied (resistance). And Peak Current is calculated or measured based upon the Peak Voltage and the load applied. Apples and Apples or Oranges and Oranges.
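A tiny sketch of that, with an arbitrary 10-ohm resistive load standing in for the train:

```python
import math

R = 10.0      # ohms, an arbitrary fixed resistance
v_rms = 18.0  # volts RMS, an arbitrary example

v_peak = math.sqrt(2) * v_rms  # peak voltage is about 1.414 x RMS
i_rms = v_rms / R              # RMS current from RMS voltage
i_peak = v_peak / R            # peak current from peak voltage

print(round(i_peak / i_rms, 3))  # 1.414, the same factor for current
```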
Man this has to be the geekiest thing I've ever done !!
Roland
I have never heard of such a thing. I started my engineering education in 1959 and would surely have heard about it if it was going on until the seventies. In any case, the ratings would have been off by a factor of 2, not 1.4, since the peak power is at least twice the average power (more if the power factor is less than 1).
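A quick numerical check of that factor of 2, sampling one cycle of an in-phase (resistive) load; the 18-volt and 10-ampere values are arbitrary:

```python
import math

N = 10000
v_rms, i_rms = 18.0, 10.0  # arbitrary example values
powers = []
for k in range(N):
    theta = 2 * math.pi * k / N
    v = math.sqrt(2) * v_rms * math.sin(theta)
    i = math.sqrt(2) * i_rms * math.sin(theta)  # in phase with the voltage
    powers.append(v * i)

avg = sum(powers) / N
print(round(avg))                # ~180 W average power
print(round(max(powers) / avg)) # 2: peak instantaneous power is twice the average
```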
The Lionel service manual says, for example, "Type 'Z' transformer, rated at 250 watts, can supply continuously 180 watts..." It seems to me that the distinction they are making is between the short-term and long-term capabilities of the transformer. The main thing that limits the power a transformer can put out is heating; and it takes a little while for the considerable mass of the transformer to get hot. In the meantime, it can be putting out more power than it can sustain in the long term. This is much like the situation with traction motors on locomotives, which have time limits put on how long they can be overloaded at various levels.
As I understand it, the difference between the nominal wattage rating of older (prewar and postwar) Lionel (and others) transformers and their actual, useful output has nothing to do with input v. output (There are not 30 percent losses in a Lionel transformer), nor to warming up, nor to volt-amps v. watts.
The apparent "discrepancy" is due to the use of power ratings based on peak voltage v. RMS voltage.
AC, as we all know, cycles from a peak voltage, down through zero, and then back up to the peak in the opposite direction. Effective/useful voltage is less than peak voltage; the peaks are roughly 1.4 times the effective (Root Mean Square, or RMS) voltage.
Now, early in the 20th century, in the formative years of the electrical industry and more particularly, the consumer radio business, radio manufacturers -- in a power race which you might liken to the auto horsepower race of the 1950s and 1960s -- managed to get "peak" power established as the industry standard. IOW, power ratings were based on peak voltage, rather than RMS voltage. In the Teens and Twenties, when transformers were being brought into the toy train world, they were considered "power transformers" and rated that way, leading to nominal ratings a third or so greater than the actual power available based on RMS voltage. This practice continued until sometime in the 1970s.
This is why a 270 watt contemporary transformer (MRC, for example) will actually deliver close to 270 watts (allowing for some modest internal losses), while a 275 watt (peak) postwar ZW is hard put to manage 200 watts of useful power (RMS).
And just BTW, one reason for all the confusion is that Lionel dissembled over the years, talking about "warming up," etc., in instruction books and other material, probably because they didn't want to get into the technical issues of peak v. RMS, etc., and in effect, admit that they were exaggerating the power.
The link shows the relationship between watts, volt-amps, and reactive power, or "vars". You can see where for resistive loads (no vars) the VA and watts would be the same, but then for other loads having more vars the VA would be more. I am wondering if UL didn't prefer a conservative rating number in watts because the general public is familiar with watt ratings from the incandescent lamp? We know what a 75-watt bulb is. The 78 VA fluorescent lamp and ballast... now, that I'm not sure about. 8-)
http://upload.wikimedia.org/wikipedia/en/3/3a/Power_Triangle_01.png
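The triangle in that picture is just Pythagoras. A minimal sketch (the 60-watt/50-var split is made up for illustration, not a real ballast spec):

```python
import math

def volt_amps(watts, vars_reactive):
    """Apparent power from the power triangle: VA = sqrt(W^2 + var^2)."""
    return math.hypot(watts, vars_reactive)

print(volt_amps(75, 0))                 # 75.0: resistive load, VA == watts
print(round(volt_amps(60, 50), 1))      # 78.1: reactive load, VA > watts
print(round(60 / volt_amps(60, 50), 2)) # 0.77 power factor
```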
It's true that the Lionel ratings were for input power. However, although a transformer does waste some power (and smaller ones typically waste a bigger share of what they handle), the power factor looking into even a toy-train transformer (the ratio of the volt-ampere product to the actual power consumed) largely reflects the power factor of the transformer's load. So they probably should have rated them in volt-amperes. But Lionel may just not have felt like following the custom of making the "volt-ampere" versus "watt" distinction.
lionelsoni wrote: Some devices, like incandescent lamps, are rated according to the amount of power that they can handle. Their power-handling capabilities are specified using the unit of power, the watt. Some other devices, like transformers, have a capability that is the product of their voltage rating and their current rating. Although a watt is a volt times an ampere, it is customary not to specify those devices in watts, so as not to give the impression that the rating is a power rating. Instead, the rating is given in volt-amperes.
Some devices, like incandescent lamps, are rated according to the amount of power that they can handle. Their power-handling capabilities are specified using the unit of power, the watt.
Some other devices, like transformers, have a capability that is the product of their voltage rating and their current rating. Although a watt is a volt times an ampere, it is customary not to specify those devices in watts, so as not to give the impression that the rating is a power rating. Instead, the rating is given in volt-amperes.
Does that explain why PW Lionel transformers were rated in watts? Their ratings were INPUT ratings--therefore usage--not OUTPUT--a capability rating.
Wattage is the rate at which work is done.
You get the same power from 1 volt at 10 amps as you get from 10 volts at 1 amp.
You need the right voltage to properly run what you want; the appliance draws the current needed to operate correctly.
The only time you have to worry about lead and lag phase is with massive inductive loads (motors and transformers), such as in a factory; then the power supplier requires capacitor banks.
Amperage is what you have to look at: the higher the amperage, the heavier (lower gauge number) the wire size required. The higher the voltage, the better the insulation required.
A Watt is the measure of useful electrical power. A Volt-Amp is the total power including unuseful things like heat, electrical/magnetic energy given off as a byproduct of transmitting the good power. That's why uninterruptible power supplies (UPS) have a Volt-Amp rating which is higher than the Watts consumed by the devices they power.
There is a formula that converts one to the other but it is non-linear. In our normal train world, there is no practical difference.
This is the perspective of a non-EE person.
Efficiency really has nothing to do with it. Let me try an example:
Suppose you shut off everything electrical in your house except for twenty 60-watt incandescent lights. This is 1200 watts of power; and the power company will supply you with 10 amperes of current at 120 volts from the transformer on the pole outside your house. They will use 1200 volt-amperes of the rating of the transformer to do this.
Then suppose you turn off the incandescent lights and turn on thirty 40-watt fluorescent lights. This is also 1200 watts of power. But suppose that the current waveform drawn by your fluorescent lights lags the voltage waveform by 45 degrees. (This is a bit much, but it is the sort of thing that fluorescent-light-fixture ballasts do.) Then the power company will have to supply you with 14 amperes of current at 120 volts. Those 14 amperes are equivalent to 10 amperes in phase with the voltage and another 10 amperes lagging the voltage by 90 degrees. The in-phase 10 amperes multiplied by the 120 volts is your 1200 watts of power. The out-of-phase 10 amperes multiplied by the 120 volts is 1200 volt-amperes of "reactive" power. It is shuttling back and forth between your lamp ballasts and the transformer outside 120 times per second; but on the average it is zero--no power into your house nor registered on your meter. But now the power company needs a 40-percent bigger share of the transformer to supply your house, because the transformer size doesn't depend on the power delivered, just on the product of the voltage and the current.
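Here is that fluorescent-light arithmetic as a short sketch, for anyone who wants to play with the phase angle:

```python
import math

volts = 120.0
watts = 1200.0                  # thirty 40-watt fluorescent lights
phase = math.radians(45)        # current lags voltage by 45 degrees
power_factor = math.cos(phase)  # about .707

total_amps = watts / (volts * power_factor)
in_phase = total_amps * math.cos(phase)    # carries the real power
quadrature = total_amps * math.sin(phase)  # reactive, averages to zero

print(round(total_amps, 1))       # 14.1 A drawn from the pole transformer
print(round(in_phase, 1))         # 10.0 A -> 1200 W on your meter
print(round(quadrature, 1))       # 10.0 A -> 1200 volt-amperes "reactive"
print(round(volts * total_amps))  # 1697 VA of transformer capacity used
```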
What Bob was trying to explain is the difference between watts (power in a purely resistive circuit) and VARS (volt-amps reactive). Volt-amps and watts are only the same if the circuit is pure resistance, like a toaster's heating element. When you have a transformer, it has inductance (due to the winding being a coil). When AC is impressed upon a coil, it reacts to that AC by causing the current waveform to lag behind the voltage waveform, thus creating a phase difference between the two. The amount of difference is quantified as VARS. What this means is that the efficiency of the transformer or of a motor is always less than 100%, because what is being consumed by the device is a combination of watts and vars, and you want to keep the vars to as small a number as possible.
So, Martinden is essentially correct, in that for model railroading purposes watts and VA are almost the same thing, and the minor differences can be ignored. Just remember that watts will never be more than volt-amps.
A Day Without Trains is a Day Wasted
The average power passing through a circuit is the average of the product of the instantaneous voltage and current. With AC, the voltage and current are always changing sinusoidally. If their waveforms are in phase, the average power is the product of the RMS voltage and the RMS current. But, if the waveforms are not in phase, the average power is less, in proportion to the cosine of the phase difference.
A device like a transformer has a limit to how much voltage it can stand and a limit to how much current it can stand. But these limits have nothing to do with the phase of the voltage and current waveforms. The transformer is therefore rated in terms of the product of the voltage and current limits, regardless of the phase difference and the resulting actual power transmitted. We use the units of volt-amperes instead of watts (which are indeed dimensionally identical) as a reminder that the device is rated in this way.
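A sketch that averages the instantaneous product v(t)*i(t) over one cycle, showing how the phase difference scales the power (the voltage and current values are arbitrary):

```python
import math

def average_power(v_rms, i_rms, phase_deg, samples=10000):
    """Average of instantaneous v(t)*i(t) over one full cycle."""
    total = 0.0
    for k in range(samples):
        theta = 2 * math.pi * k / samples
        v = math.sqrt(2) * v_rms * math.sin(theta)
        i = math.sqrt(2) * i_rms * math.sin(theta - math.radians(phase_deg))
        total += v * i
    return total / samples

print(round(average_power(18, 10, 0)))   # 180 W: in phase, P = Vrms x Irms
print(round(average_power(18, 10, 60)))  # 90 W: scaled by cos(60 deg) = .5
print(round(average_power(18, 10, 90)))  # 0 W: purely reactive load
```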
I too am curious to hear what some of our EE types have to share on this subject.
I don't understand why companies put VA ratings on AC transformers instead of using output wattage. I always thought the VA designation was for DC battery systems that will have a charge and/or load placed on them, or drawing off of them.
From what I have read it seems that the VA rating (when used for AC devices) is higher than the actual watt rating of the device. Which, if true, could lead someone to miscalculate (undersize) the power available for their trains. (Had to tie this back to trains somehow).
Can anyone clue me in on the difference between "watts" and "volt-amps" when it comes to toy train transformers? When I was in school, I remember learning that watts = volts X amps. So do the two terms mean the same thing?
And while we're at it, I'm confused about the #4851 starter set transformer. What is its wattage rating? Also, the guidebooks list it as a DC transformer, but mine is putting out AC voltage. Anyone else find this to be the case?
John