If I could just borrow this thread for a moment for a similar LED question. I have a 3.3 V computer power supply that I am using to power my structure lights. I have one complex that so far has 12 white LEDs, a combination of 3mm and 5mm. I currently have them wired in parallel directly off the power supply. Someone said that I should have a limiting resistor for current protection. Is that a valid statement? If so, how large of a resistor should I use? These LEDs have about 10 hours on them as is to date with no problems. Thanks
Terry from Florida
Hmm, interesting hookup. 3.3 volts is the newer logic level, replacing the traditional 5 volts. The forward voltage drop for most LEDs is not far from 3.3 volts, so there isn't much extra voltage to drop across a current-limiting resistor. It's not a circuit that I would use, but from what you say, the white LEDs are lighting and not going poof. You could put a 27 ohm resistor in series with each LED, but if they are lighting and surviving, adding the resistors won't change things much.
Were I doing this from scratch, I'd want to use a higher voltage supply, so I could drop more volts across the resistors. What's happening here is that the LED forward drop seems to be just enough to limit the current to a safe (non-LED-smoking) value. Forward voltage drop on LEDs can vary quite a bit from batch to batch. It's considered better design to depend upon the resistor (good to 5%) and the supply voltage (usually good to 1%) to determine the circuit current, rather than the LED forward voltage drop, which can vary by 30% or more. On the other hand, if it's working for you, there is no reason not to leave well enough alone. As long as the LEDs don't smoke and the brightness is reasonable, it's working. LEDs blow out in milliseconds, faster than the eye can see. If yours have been lighting for some time, they will continue to light for a long, long time. LEDs don't wear out like incandescent lamps, they will glow happily forever.
David Starr www.newsnorthwoods.blogspot.com
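For anyone who wants to check the arithmetic behind that 27 ohm suggestion, here is a minimal Python sketch. The 3.0 V forward drop is an assumed value, not a datasheet number; real white LEDs vary batch to batch, which is exactly dstarr's point.

```python
# Rough check of the 27 ohm suggestion at a 3.3 V supply,
# assuming a white-LED forward drop of about 3.0 V (parts vary).
supply_v = 3.3
led_drop_v = 3.0
resistor_ohms = 27
current_ma = (supply_v - led_drop_v) / resistor_ohms * 1000
print(f"about {current_ma:.0f} mA per LED")  # ~11 mA, well under 20 mA
```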
I think I agree with dstarr. I would be tempted to add 27 ohms in series with each LED, since I think that at 3.3 V you are probably at the high end of current through the LEDs, and probably reducing their life. It's better to add a resistor for each LED, rather than for a group, since the resistor size would change depending on how many LEDs are in the group.
Jeff
But it's a dry heat!
Wow thanks guys. Lots of great info and links here.
Gonna be reading for a while I see :D
Thanks again :)
LEDs are not lightbulbs. A lightbulb wants a constant voltage drive, and it will limit its current all by itself. LEDs want a constant 20 mA current and do nothing to limit that current. Think of an LED as a rectifier that just happens to emit light when it's conducting. Rectifiers are either OFF or ON, like a switch. If you were to hook a switch from plus to minus on your power pack and turn it ON, the power pack would see a dead short circuit. Current through a short circuit goes to infinity; in the real world, something melts before you get very close to infinity. Hooking up an LED to power without a series resistor is just like closing a switch: you get a short circuit. LEDs are fairly tender, and they blow out in microseconds (faster than the eye can see), thus protecting the power pack at the cost of their own life. Very noble-minded components, LEDs are.
So, LEDs always need a series resistor. Find the needed value with Ohm's law, R = V/I. I is always 20 milliamps (0.02 Amps) and V is the driving voltage from whatever you are powering the LED with. This calculation takes a shortcut: we are assuming that the voltage drop across the LED is zero.
If you want to be a little more accurate, subtract the voltage drop across the LED from the driving voltage. LEDs usually drop about 2.75 volts at full brightness. I don't pay much attention to the 'voltage rating' of individual LEDs from the catalog or data sheet. It's usually a worst-case number ("All my LEDs will be less than 3.5 volts at max current on a cold day"). When you measure the actual voltage drop across a glowing LED, it's pretty close to 2.75 volts no matter who made the LED in question. If you are driving the LED from something like 12 or 14 volts, the LED voltage drop is not very significant. If you are using 5 volts, then it is.
The 20 mA current is the max current that the average LED will tolerate before going "poof". Most LEDs will glow plenty bright enough on less, say 10 mA. So the resistor value isn't very critical, you can go 20-50% high on the resistor and the LED will glow just fine.
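Here is a minimal Python sketch of the calculation dstarr describes, using his rule-of-thumb 2.75 V drop and 20 mA target; swap in your own LED's numbers, and remember his point that going 20-50% high on the resistor is fine.

```python
# Series resistor per Ohm's law: R = (Vsupply - Vled) / I.
# 2.75 V and 20 mA are dstarr's rule-of-thumb values, not datasheet facts.
def led_resistor(supply_v, led_drop_v=2.75, current_a=0.020):
    return (supply_v - led_drop_v) / current_a

for v in (5.0, 12.0, 14.0):
    print(f"{v:4.1f} V supply -> {led_resistor(v):6.1f} ohms")
```

Note how much the LED drop matters at 5 volts (about half the supply) versus 12 or 14 volts, just as the post says.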
Well done! To reinforce LED characteristics: the current flow is the factor to concentrate on, as the voltage drop is set by the LED type and the semiconductor material it is made from. Once you reach the forward drop of an LED, diode, or transistor, the current can vary widely while the voltage drop changes only slightly, if at all. As dstarr indicated, use a resistor that will give you less than the maximum rated current. The difference in brightness is slight, and your LED will not be subjected to spike currents that could damage it.
Carl in Florida - - - - - - - - - - We need an HO Amtrak SDP40F and GE U36B oh wait- We GOT THEM!
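To put a number on Carl's point that current, not voltage, is the thing to watch, here is an illustrative sketch. All values are made up for illustration (a 3.3 V white-LED drop assumed constant): it compares how a 0.3 V supply change moves the current with a sensible resistor versus almost none.

```python
# Why the resistor, not the LED, should set the current: with little
# series resistance, a small voltage change produces a huge current change.
led_drop_v = 3.3  # assumed white-LED forward drop
for r_ohms in (470.0, 10.0):
    for supply_v in (12.0, 12.3):
        ma = (supply_v - led_drop_v) / r_ohms * 1000
        print(f"R={r_ohms:5.0f} ohms, {supply_v:4.1f} V -> {ma:6.1f} mA")
# With 470 ohms, a 0.3 V change moves the current about 3%;
# with 10 ohms, the current is far past safe either way.
```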
reklein wrote: I just happened to read this AM in the SoundTraxx manual that certain SoundTraxx decoders prefer incandescent bulbs in the 12-16V range.
Here is info concerning this issue. I use this fix with LEDs in LC conversions for my HO steam engines.
http://www.tonystrains.com/technews/soundtraxx-lcleds.htm
http://www.members.optusnet.com.au/nswmn1/LEDs_DSDs.htm
http://www.members.optusnet.com.au/nswmn1/Lights_in_DCC.htm
rich
If you ever fall over in public, pick yourself up and say “sorry it’s been a while since I inhabited a body.” And just walk away.
You're best off with 1k 1/4-watt resistors. Many decoders produce a surge current for lightbulbs when powering up (some do have a CV setting just for LEDs). This surge current can cause an LED to fail over time, even if you have the "correct" calculated resistor value. The 1k will protect the LED and extend its life.
All the calculations are correct in theory, but not necessarily right for the real world. The 1k gives you a good fudge factor, works with any LEDs in the 2 to 4 volt range without affecting the brightness much, and you don't need to keep a whole bunch of different resistors around.
Jay
C-415 Build: https://imageshack.com/a/tShC/1
Other builds: https://imageshack.com/my/albums
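As a rough check of Jay's 1k suggestion, here is a sketch sweeping the 2-4 volt LED range he mentions at typical DCC function voltages (illustrative numbers, not from any datasheet):

```python
# Current through a 1 kilohm series resistor across the 2-4 V LED range.
r_ohms = 1000.0
for supply_v in (12.0, 14.0):
    for led_v in (2.0, 3.0, 4.0):
        ma = (supply_v - led_v) / r_ohms * 1000
        print(f"{supply_v:4.1f} V supply, {led_v:3.1f} V LED -> {ma:4.1f} mA")
```

Everything lands in the 8-12 mA range: bright enough for most installs and comfortably below the 20 mA limit, which is exactly the fudge factor Jay describes.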
mmartian22 wrote: So what you are saying is a 14 V supply with 450 ohms would work for most apps? I was told a 12 V supply with 750 ohms would work, but wasn't sure.
The values I previously mentioned were just workbench measurements. I would rather not run at the current limit of 20 mA; I use 750 ohms in my engines just to be cautious.
Rich
I have some 1.6mm OD white LEDs that are rated for 3.2 volts and 20 mA; the calculated resistor is 470 ohms.
I used a 470 ohm resistor and a 12.2 volt battery and measured 18.5 mA with 3.2 volts on the LED.
I have some 5mm inverted cone white LEDs that measure the same.
I use 1/4 watt resistors. Below is a wattage calculator link.
http://www.anderson-bolds.com/calculator.htm
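Here is a sketch that reruns the bench numbers above through Ohm's law and then does the resistor-wattage check the linked calculator performs:

```python
# Predicted current and resistor dissipation for the bench test above.
supply_v, led_v, r_ohms = 12.2, 3.2, 470.0
i_a = (supply_v - led_v) / r_ohms           # Ohm's law
p_w = i_a ** 2 * r_ohms                     # P = I^2 * R
print(f"{i_a * 1000:.1f} mA, {p_w:.2f} W")  # ~19.1 mA, ~0.17 W
```

The predicted 19.1 mA agrees closely with the measured 18.5 mA, and 0.17 W fits within a 1/4 watt rating, though without a lot of margin.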
If you're certain your LEDs are rated for 12 volts, you don't need a resistor other than one of maybe 10 Ohms to protect the decoder against a power surge, because most decoders have an output of 12 Volts on their light function output.
12 V LEDs - are you referring to LEDs with resistors presoldered to them? LEDs are typically 2.5 V to 3.5 V; BTW, you should calculate using 14 volts, not 12 volts, for DCC.
In any case, most LEDs for DCC locos work well with a 1000 ohm 1/4 watt resistor. Light output is not linear as it is with a light bulb; using a lower resistor value in most cases increases the current with not much difference in light, and any spikes through an LED working at maximum current may cause it to fail. I've had to replace a few under these circumstances; in each case the installer used the recommended resistor values from Miniatronics and others.
EDIT: Sorry for the mistake. I meant to type 100 Ohms, not 10. In order to drop 2 Volts at 30 mA, you would need 66 Ohms, but 100 Ohms would be better to protect the LED and decoder.
Thanks to betamax for pointing that out.
LEDs have a spec known as forward voltage; it describes the optimum voltage across the LED and is usually paired with an optimum forward current. This represents a point on the operating curve that is usually safe. For white LEDs, forward voltage is in the 3.0-3.5 volt range, normally 3.3 volts. Optimum forward current is typically 20 mA; these ratings should be printed on the LED package.
In order to achieve this using a 12-volt power supply, you will need a 470-ohm resistor in line with the LED. If it's too bright, put in a larger resistor. If you put in a smaller resistor the LED will be much brighter, but won't last very long.
Hope this makes sense.
Chris
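A quick sketch checking that 470 ohm figure, using the 3.3 V / 20 mA ratings quoted above:

```python
# (12 - 3.3) / 0.020 = 435 ohms; 470 ohms is the next standard value up,
# which errs on the safe (slightly dimmer) side.
r_exact = (12.0 - 3.3) / 0.020
print(f"{r_exact:.0f} ohms exact -> use 470 ohms standard")
```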
mmartian22 wrote: I'm beginning to install LEDs in my locos. What I need to know is: I'm using 12 V LEDs for the rear and front lights, and I don't know what resistor is needed for them, using Digitrax decoders. Wattages? Ohms?
Here is another link. There is also an online resistor calculator.
http://led.linear1.org/1led.wiz
I normally use 750 ohms, since I run 14 volts for DCC and I like to keep the current a little under 20 mA.
http://led.linear1.org/
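Finally, a sketch of the margin the 750 ohm / 14 volt combination above gives; the 3.3 V white-LED drop is an assumed value:

```python
# 14 V DCC supply, assumed 3.3 V LED drop, 750 ohm resistor.
ma = (14.0 - 3.3) / 750.0 * 1000
print(f"{ma:.1f} mA")  # ~14.3 mA, comfortably under the 20 mA maximum
```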