Are resistors accumulative in a circuit? For example, if I need a 150 ohm resistor in the circuit for an LED, will a 100 ohm and a 50 ohm resistor do the job? As to watts... I'm a little unclear. Do I need to be concerned with what the wattage of a resistor is in a circuit with an LED, as long as I have the right resistance and voltage and the mA of the LED?
Thanks for the help. Hal
To the first part of the question: if they're tied in series, yes, resistance adds in a series circuit. The wattage rating you need is related to the current drawn in the circuit.
The purpose of the resistor is to drop the excess voltage so the LED only sees its rated voltage. If you know the voltage rating and current (mA) drawn by the LED, you can calculate its equivalent resistance: R = E/I, where resistance (in ohms) equals voltage (in volts) divided by current (in amps). For the correct calculation, remember that 1 mA is 1/1000 of an amp. The voltage drop across each of the components in a series circuit is proportional to the resistance of each component. For example, with a 500 ohm LED in series with a 1500 ohm resistor and 12 volts across the circuit, the LED will drop 3 volts and the resistor will drop 9 volts.
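If it helps to see the proportional-drop idea worked out, here's a quick Python sketch of the 12 volt / 500 ohm / 1500 ohm example above (treating the LED as a fixed resistance, the same simplification the example uses):

```python
# Series circuit: voltage drops divide in proportion to resistance.
supply_v = 12.0      # volts across the whole series string
r_led = 500.0        # LED modeled as a fixed resistance (ohms), per the example
r_resistor = 1500.0  # series resistor (ohms)

total_r = r_led + r_resistor
current_a = supply_v / total_r                    # I = E / R -> 0.006 A (6 mA)

led_drop = supply_v * r_led / total_r             # 3 V across the LED
resistor_drop = supply_v * r_resistor / total_r   # 9 V across the resistor

print(current_a, led_drop, resistor_drop)
```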
To calculate the wattage of the resistor, power (in watts) equals current squared (in amps) times resistance (in ohms). Always go to the next higher wattage rating available, never lower.
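Putting R = E/I and P = I²R together, here's a short sketch with made-up example numbers (a 12 V supply and a hypothetical 2 V, 20 mA LED, not values from any datasheet), just to show the steps:

```python
# Hypothetical example values -- substitute your own LED's ratings.
supply_v = 12.0    # supply voltage (volts)
led_v = 2.0        # LED forward voltage (volts), assumed for illustration
led_i = 0.020      # LED current (amps); 20 mA = 20/1000 A

r_needed = (supply_v - led_v) / led_i   # R = E / I -> 500 ohms
p_resistor = led_i ** 2 * r_needed      # P = I^2 * R -> 0.2 W

print(f"Resistor: {r_needed:.0f} ohms, dissipating {p_resistor:.2f} W")
print("Pick the next standard wattage above that, e.g. 1/4 W or 1/2 W.")
```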
Helpful links on LEDs and resistors:
http://www.kpsec.freeuk.com/components/led.htm
http://ledz.com/?p=zz.led.resistor.calculator
Co-owner of the proposed CT River Valley RR (HO scale) http://home.comcast.net/~docinct/CTRiverValleyRR/
Thanks for the references and help! The calculators show that a 60 mA, 3.2 V LED on a 12 V DC supply requires a 150 ohm resistor rated at 1 W. Now, if I use a 100 ohm and a 50 ohm resistor in series with the LED (instead of a single 150 ohm resistor), does each resistor need to be 1 W, or can the 100 ohm be 1 W and the 50 ohm 1/2 W? Not sure if the references showed me the answer. Of course, I'm looking for the answer to also give me the concept, so I can apply a rule to any dual combination I might put in series with an LED of any value to get the total resistance the calculators call for.
Hope I'm not being a pest. Thanks for your patience.
Hal
Resistors in series will dissipate power in proportion to their value.
You can calculate the actual power that will be dissipated by any resistor pretty easily: either voltage times current, current squared times resistance, or voltage squared divided by resistance (all three give the same answer).
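For the 100 ohm + 50 ohm pair at the 60 mA you mentioned, a quick check shows all three formulas agree:

```python
# Power dissipated by each series resistor, using the 60 mA from the thread.
current_a = 0.060            # 60 mA flows through everything in the series string
resistors = [100.0, 50.0]    # the two series resistors (ohms)

for r in resistors:
    drop_v = current_a * r        # voltage across this resistor (E = I * R)
    p_ei = drop_v * current_a     # P = E * I
    p_i2r = current_a ** 2 * r    # P = I^2 * R
    p_e2r = drop_v ** 2 / r       # P = E^2 / R
    print(f"{r:.0f} ohm: {p_i2r:.2f} W "
          f"(all three formulas agree: {p_ei:.2f}, {p_i2r:.2f}, {p_e2r:.2f})")
```

So the 100 ohm resistor sees about 0.36 W and the 50 ohm about 0.18 W; the dissipation splits in proportion to the resistance, just like the voltage drop.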
I think most calculators give a conservative power rating, which is fine. The only trouble with using a larger rating than required is that the resistor will be physically larger.
Jeff
But it's a dry heat!
If you're connecting resistors in series to achieve a given total resistance, each one's wattage rating should be at least as high as the power it will actually dissipate in the circuit (the circuit you're talking about draws around 0.72 W total, a bit over half a watt of which is in the resistors, split in proportion to their values). If space is an issue, picking up a single resistor in the right value would be a good idea.
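If you want a general rule you can apply to any combination, here's a rough sketch. The 2x safety margin and the list of standard ratings are just my own assumptions (a common rule of thumb), not anything from the calculators:

```python
# Given the LED current and any combination of series resistors, pick a
# standard wattage rating for each one (with a bit of headroom).
STANDARD_WATTAGES = [0.125, 0.25, 0.5, 1.0, 2.0]  # common ratings in watts

def ratings_for(series_ohms, current_a, margin=2.0):
    """Return (resistance, dissipation, suggested rating) for each resistor."""
    out = []
    for r in series_ohms:
        p = current_a ** 2 * r                    # P = I^2 * R
        rating = next((w for w in STANDARD_WATTAGES if w >= p * margin),
                      STANDARD_WATTAGES[-1])
        out.append((r, p, rating))
    return out

# The 100 + 50 ohm pair at 60 mA from the thread:
for r, p, rating in ratings_for([100, 50], 0.060):
    print(f"{r} ohm: dissipates {p:.2f} W -> use a {rating} W resistor")
```

With your numbers that comes out to 1 W for the 100 ohm and 1/2 W for the 50 ohm, which matches the combination you asked about.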
Thanks for the help, and now I know!