Recently, I posted to a thread on this forum suggesting that DC was less susceptible to dirty wheels, track, etc., in part because DC can put out a higher track voltage. In response, I received a PM pointing out - correctly - that most HO power packs are rated at 12 VDC, while DCC typically runs around 15-16v peak-to-peak. This prompted me to check my own layouts.
I have two DC systems. One is an MRC Tech 4 220 controller. It's rated at 23 VDC output, and I measured 21 VDC open circuit at the track using a Simpson VOM. My other system uses an MRC Model 76 controller powered by an old Bachmann 6607 controller strapped to run at maximum output voltage. The 6607 is rated at 17 VDC output, yet I actually measure 23 VDC open circuit across the rails.
When I look at the instruction sheet that comes with my N-scale Atlas DC-equipped locomotives, it plainly states: "Maximum operating voltage: 12vdc."
Does this mean that I am running these locos at as much as 100% overvoltage? I would think that's a bad idea. Does it mean the controllers are rated at far too high an output voltage for the motors they are supposed to run? I'd have thought that MRC, especially, wouldn't do that. Am I risking frying my equipment? Worse, since I often use the MRC 220 for break-in runs on DCC locos, am I likely to lose a decoder at 20+ volts?
What are the proper operating voltages for these locomotives, and are my controllers putting out too much voltage? Should I be using some sort of voltage-limiting circuit to protect my equipment?
First off, just what voltage were you measuring?
Analog DC is, typically, unfiltered - so the nominal 12 volts is the RMS value, 70.7% of the peak. Add a filter capacitor and the no-load voltage rises to the peak value, just under 17 volts.
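If you want to check that arithmetic yourself, here's a quick back-of-envelope sketch in Python (the 12 V figure is just the nominal throttle rating):

```python
import math

# RMS-to-peak relation for full-wave rectified DC.
# The peaks sit above the RMS reading by a factor of sqrt(2).
v_rms = 12.0                   # nominal rating, volts
v_peak = v_rms * math.sqrt(2)  # peak of the rectified waveform

print(f"{v_rms:.0f} V RMS -> {v_peak:.1f} V peak")  # 12 V RMS -> 17.0 V peak
```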
Now, add in the fact that a transformer's output voltage is a fixed ratio of its input voltage (which is almost never exactly 120 VAC), and that the original design used a selenium rectifier while the current production model uses silicon diodes - with the same number of turns in the transformer's primary and secondary coils...
IIRC, the old selenium bridge rectifier swallowed about four volts, roughly 2.5 volts more than a silicon diode bridge. That meant the AC input had to be 16V RMS - about 22.6 V peak. With the same transformer, a silicon bridge rectifier plus a filter capacitor will yield a smooth no-load 21.2 VDC.
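Putting numbers to that, a minimal sketch - the 4 V (selenium) and 1.4 V (silicon bridge, two diodes conducting) drops are the usual rule-of-thumb figures, not measurements:

```python
import math

# Same arithmetic as above, written out step by step.
v_secondary_rms = 16.0                   # transformer secondary, volts RMS
v_peak = v_secondary_rms * math.sqrt(2)  # about 22.6 V peak

drop_selenium = 4.0   # old selenium bridge, per the rule of thumb
drop_silicon = 1.4    # silicon bridge: two ~0.7 V diode drops in series

print(f"peak into the rectifier:             {v_peak:.1f} V")
print(f"selenium bridge, filtered, no load:  {v_peak - drop_selenium:.1f} V")
print(f"silicon bridge, filtered, no load:   {v_peak - drop_silicon:.1f} V")
```

That last line lands right at 21.2 V - which is essentially what the original poster measured on the Tech 4 220.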
Now for the critical difference. DCC holds a constant voltage on the rails. Analog DC is varied by the speed control, which is never at 100% unless you run a TGV at full prototype speed, and the back EMF of a moving motor further reduces the effective voltage. Bottom line - DCC always has more voltage available to punch through dirt, unless the DC loco has already stalled on dirty track.
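To make that concrete, a toy comparison - the 12 V DC pack and 15 V DCC bus are assumed round numbers:

```python
# Voltage available at the railhead to punch through dirt.
dcc_rail = 15.0   # constant on the rails regardless of train speed
dc_max = 12.0     # full throttle on the DC pack

for throttle in (0.25, 0.50, 0.75, 1.00):
    dc_rail = dc_max * throttle
    print(f"DC at {throttle:.0%} throttle: {dc_rail:4.1f} V   DCC: {dcc_rail:.1f} V")
```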
The solution - run a John Allen slider car (or several) and keep the railheads clean.
Chuck (modeling Central Japan in September, 1964 - analog DC, MZL system)
Measuring the voltage with no train running is misleading, because many power packs have very poor regulation: they will always read high with no load, and the voltage drops once a load is applied. To get a more accurate reading, put a power resistor across the tracks - something in the range of 10 ohms at 20 watts - or take the reading with one or more locomotives running.
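As a rough sizing check for that dummy load, using nothing more than Ohm's law and P = V²/R (round numbers assumed; the loaded voltage will sag below the open-circuit reading, so actual dissipation runs a bit lower):

```python
# Why the dummy load needs to be a power resistor, not a little 1/4 W part.
r_load = 10.0  # ohms, as suggested above

for v in (12.0, 14.0):
    i = v / r_load       # Ohm's law: amps through the resistor
    p = v * v / r_load   # watts it must dissipate
    print(f"{v:.0f} V across {r_load:.0f} ohms -> {i:.1f} A, {p:.1f} W")
```

Even at 12-14 V the resistor is dissipating 14-20 watts, which is why a 20 W (or larger) part is the right choice.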
Even though most motors are rated for 12 volts, they can usually tolerate a higher voltage without harm - they will just run too fast.
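As a back-of-envelope illustration, a permanent-magnet DC motor's no-load speed scales roughly linearly with voltage - the 12 V / 12000 rpm rating below is invented for illustration, not taken from any datasheet:

```python
# Overvolting a PM DC motor mostly means overspeed, to a first approximation.
rated_v, rated_rpm = 12.0, 12000.0  # assumed rating, for illustration only

for v in (12.0, 16.0, 21.0):
    print(f"{v:4.1f} V -> roughly {rated_rpm * v / rated_v:5.0f} rpm")
```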
The DCC voltage is a high-frequency square wave, which a typical voltmeter cannot measure accurately because it doesn't respond fast enough to the changes in polarity. Even an RMS calculation won't be exact, but it will be closer than the raw meter reading.
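Here's a rough model of why, assuming an ideal ±15 V square wave (the amplitude is just an illustrative figure): a meter on its DC range averages the waveform to nearly zero, while the true RMS equals the amplitude.

```python
import math

# One cycle of an ideal +/-15 V bipolar square wave, sampled coarsely.
amplitude = 15.0
samples = [amplitude if i < 500 else -amplitude for i in range(1000)]

avg = sum(samples) / len(samples)                            # what a DC meter averages
rms = math.sqrt(sum(s * s for s in samples) / len(samples))  # true RMS

print(f"average (DC-range meter): {avg:5.1f} V")  # ~0 V for a symmetric wave
print(f"true RMS:                 {rms:5.1f} V")  # equals the amplitude
```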