Trains.com


Trying to Understand Resistors w/LEDs

1870 views
7 replies
  • Member since
    January 2001
  • From: US
  • 406 posts
Posted by donhalshanks on Saturday, September 5, 2009 4:26 PM

Thanks for the help, and now I know!

Hal

  • Member since
    February 2007
  • From: Christiana, TN
  • 2,134 posts
Posted by CSX Robert on Friday, September 4, 2009 5:24 PM
12 V supply - 3.2 V drop across the LED = 8.8 V dropped across the resistors. You can use a voltage divider calculator to find the drop across each one: the 50 ohm resistor drops 2.93 V and the 100 ohm resistor drops 5.87 V. Watts = voltage x current, and the current is 0.060 A (60 mA, the same through all three devices since they are all in series). Watts dissipated by the 50 ohm resistor = 2.93 x 0.060 = 0.176; by the 100 ohm resistor = 5.87 x 0.060 = 0.352. You could use a 1/4 watt for the 50 ohm and a 1/2 watt for the 100 ohm.
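[Editor's note: a minimal Python sketch checking the arithmetic above. The variable names are mine; the 12 V, 3.2 V, 60 mA, 50 ohm, and 100 ohm figures all come from this thread.]

```python
# Verify the voltage-divider and power figures quoted above.
V_SUPPLY = 12.0   # volts, DC supply
V_LED = 3.2       # volts, forward drop across the LED
I = 0.060         # amps (60 mA, same through all series components)

R1, R2 = 50.0, 100.0              # ohms, the two series resistors
v_resistors = V_SUPPLY - V_LED    # 8.8 V left to drop across the resistors

# In a series string, voltage divides in proportion to resistance.
v1 = v_resistors * R1 / (R1 + R2)  # ~2.93 V across the 50 ohm resistor
v2 = v_resistors * R2 / (R1 + R2)  # ~5.87 V across the 100 ohm resistor

# Power = voltage x current.
print(f"50 ohm:  {v1:.2f} V, {v1 * I:.3f} W")   # ~0.176 W -> a 1/4 W part is fine
print(f"100 ohm: {v2:.2f} V, {v2 * I:.3f} W")   # ~0.352 W -> a 1/2 W part is fine
```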
  • Member since
    February 2001
  • From: Poconos, PA
  • 3,948 posts
Posted by TomDiehl on Friday, September 4, 2009 3:28 PM

donhalshanks wrote:

Thanks for the references and help!  The calculators show that a 60 mA, 3.2 V LED on a 12 V DC supply requires a 150 ohm resistor rated at 1 W. Now, if I use a 100 ohm and a 50 ohm resistor in series with the LED (instead of a 150 ohm resistor), does each resistor need to be 1 W, or can the 100 ohm be 1 W and the 50 ohm 1/2 W? Not sure if the references showed me the answer. Of course, I'm looking for the answer to also give me the concept, so I can apply a rule to any two resistors I might put in series with an LED of any value, for whatever total resistance the calculators require.

Hope I'm not being a pest.  Thanks for your patience.

 Hal    

If you're connecting resistors in series to achieve a given total resistance, their wattage ratings should be the same, or at least higher than the circuit requires (the one you're talking about is around 0.75 watt). If space is an issue, picking up a single resistor of the right value would be a good idea.

Smile, it makes people wonder what you're up to. Chief of Sanitation; Clowntown
  • Member since
    July 2006
  • From: Vail, AZ
  • 1,943 posts
Posted by Vail and Southwestern RR on Friday, September 4, 2009 1:36 PM

Resistors in series will dissipate power in proportion to their value.

You can calculate the actual power that will be dissipated by any resistor pretty easily...

Either voltage x current (P = V x I), current squared x resistance (P = I^2 x R), or voltage squared divided by resistance (P = V^2 / R).
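[Editor's note: a minimal Python sketch showing that all three forms agree, using the 50 and 100 ohm resistors discussed in this thread; the loop and names are illustrative only.]

```python
# The three power formulas give the same answer for each series resistor.
I = 0.060  # amps through the series string (the 60 mA nominal LED current)
for r in (50.0, 100.0):
    v = I * r             # voltage across this resistor (Ohm's law)
    p_vi = v * I          # voltage x current
    p_i2r = I * I * r     # current squared x resistance
    p_v2r = v * v / r     # voltage squared / resistance
    print(f"{r:>5.0f} ohm: {p_vi:.3f} W = {p_i2r:.3f} W = {p_v2r:.3f} W")

# Note the 100 ohm resistor dissipates twice the power of the 50 ohm one:
# in series, power divides in proportion to resistance.
```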

I think most calculators give a conservative power rating, which is fine.  The only trouble with using a larger rating than required is that the resistor will be physically larger.

Jeff
But it's a dry heat!

  • Member since
    January 2001
  • From: US
  • 406 posts
Posted by donhalshanks on Friday, September 4, 2009 1:14 PM

Thanks for the references and help!  The calculators show that a 60 mA, 3.2 V LED on a 12 V DC supply requires a 150 ohm resistor rated at 1 W. Now, if I use a 100 ohm and a 50 ohm resistor in series with the LED (instead of a 150 ohm resistor), does each resistor need to be 1 W, or can the 100 ohm be 1 W and the 50 ohm 1/2 W? Not sure if the references showed me the answer. Of course, I'm looking for the answer to also give me the concept, so I can apply a rule to any two resistors I might put in series with an LED of any value, for whatever total resistance the calculators require.

Hope I'm not being a pest.  Thanks for your patience.

Hal

  • Member since
    February 2009
  • From: Enfield, CT
  • 935 posts
Posted by Doc in CT on Thursday, September 3, 2009 7:30 PM

Co-owner of the proposed CT River Valley RR (HO scale) http://home.comcast.net/~docinct/CTRiverValleyRR/

  • Member since
    February 2001
  • From: Poconos, PA
  • 3,948 posts
Posted by TomDiehl on Thursday, September 3, 2009 4:48 PM

To the first part of the question: yes, if they're tied in series, resistances add in a series circuit. The wattage rating needed is related to the current drawn in the circuit.

The purpose of the resistor is to drop the excess supply voltage so the LED only sees its rated voltage. If you know the voltage rating and current (mA) drawn by the LED, you can calculate its effective resistance: R = E/I, where resistance (in ohms) equals voltage (in volts) divided by current (in amps). For the correct calculation, remember that 1 mA is 1/1000 of an amp. The voltage drop across each component in a series circuit is proportional to its resistance. For example, with a 500 ohm LED in series with a 1500 ohm resistor and 12 volts across the circuit, the LED will drop 3 volts and the resistor will drop 9 volts.

To calculate the wattage of the resistor: power (in watts) equals current squared (in amps) times resistance (in ohms). Always go to the next higher wattage rating available, never lower.
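[Editor's note: a minimal Python sketch putting that whole recipe together, using the 3.2 V / 60 mA / 12 V figures from this thread. The list of standard wattage ratings is a common assortment assumed for illustration, not something from the posts.]

```python
# Pick a series resistor and its wattage rating for an LED.
V_SUPPLY = 12.0   # volts
V_LED = 3.2       # volts, LED forward drop
I_LED = 0.060     # amps (60 mA)

# Ohm's law: the resistor must drop the leftover voltage at the LED current.
r = (V_SUPPLY - V_LED) / I_LED   # ~146.7 ohms -> use the next standard value, 150
p = I_LED ** 2 * r               # power = current squared x resistance, ~0.528 W

# Always go to the next higher rating available, never lower.
standard_watts = [0.25, 0.5, 1.0, 2.0]  # assumed common assortment
rating = next(w for w in standard_watts if w >= p)

print(f"resistor: {r:.1f} ohms, dissipates {p:.3f} W, use a {rating} W part")
```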

Smile, it makes people wonder what you're up to. Chief of Sanitation; Clowntown
  • Member since
    January 2001
  • From: US
  • 406 posts
Trying to Understand Resistors w/LEDs
Posted by donhalshanks on Thursday, September 3, 2009 4:26 PM

Are resistors cumulative in a circuit?  For example, if I need a 150 ohm resistor in the circuit for an LED, will a 100 ohm and a 50 ohm resistor do the job?  As for watts, I'm a little unclear: do I need to be concerned with the wattage of a resistor in a circuit with an LED, as long as I have the right resistance and voltage and the mA of the LED?

Thanks for the help.  Hal
