Resistors for LEDs

I'm beginning to install LEDs in my locos. I'm using 12-volt LEDs for the rear and front lights, driven by Digitrax decoders. What I don't know is what resistor is needed for them: what wattage, and how many ohms?

See the DCC wiki for more info: http://www.dccwiki.com/LED_Lighting_and_Resistors

Here is another link. There is also an online resistor calculator.

http://led.linear1.org/1led.wiz

I normally use 750 ohms, since I run 14 volts for DCC and I like to keep the current a little under 20 mA.

http://led.linear1.org/

rich

LEDs have a spec known as forward voltage; it describes the optimum voltage across the LED and is usually paired with an optimum forward current. This represents a point on the operating curve that is usually safe. For white LEDs, forward voltage is in the 3.0 - 3.5 volt range, normally 3.3 volts. Optimum forward current is typically 20mA; these ratings should be printed on the LED package.

In order to achieve this using a 12-volt power supply, you will need a 470-ohm resistor in line with the LED. If it’s too bright, put in a larger resistor. If you put in a smaller resistor the LED will be much brighter, but won’t last very long.
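The arithmetic behind that 470-ohm figure can be sketched in a few lines of Python (the 3.3 V forward voltage and 20 mA target are the figures from the post above; the 12 V supply is the assumed decoder output):

```python
# Series resistor for one LED: a sketch, not a definitive recipe.
V_SUPPLY = 12.0   # volts available from the supply (assumed)
V_FORWARD = 3.3   # typical white-LED forward voltage, per the post
I_TARGET = 0.020  # 20 mA target forward current

# Ohm's law applied to the voltage the resistor must drop:
r_exact = (V_SUPPLY - V_FORWARD) / I_TARGET
print(round(r_exact))  # 435 -> round up to the next standard value, 470 ohms
```

Rounding up to the next standard value (470 ohms) errs on the safe side: slightly less current, slightly dimmer, longer LED life.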

Hope this makes sense.

Chris

If you’re certain your LEDs are rated for 12 volts, you don’t need a resistor other than one of maybe 10 Ohms to protect the decoder against a power surge, because most decoders have an output of 12 Volts on their light function output.

EDIT: Sorry for the mistake. I meant to type 100 Ohms, not 10. In order to drop 2 Volts at 30mA, you would need 66 Ohms, but 100 Ohms would be better to protect the LED and decoder.

Thanks to betamax for pointing that out.

12-volt LEDs: are you referring to LEDs with resistors pre-soldered to them? LEDs are typically 2.5 V to 3.5 V. BTW, you should calculate using 14 volts, not 12 volts, for DCC.

In any case, most LEDs in DCC locos work well with a 1000-ohm, 1/4-watt resistor. Light output is not linear the way it is with a light bulb: using a lower resistor value in most cases increases the current without much difference in light, and any spike through an LED already running at maximum current may cause it to fail. I've had to replace a few under these circumstances; in each case the installer had used the resistor values recommended by Miniatronics and others.

That will work for about a second.

The whole point of the series resistor is to limit the current flow. Ten ohms is a waste of time, and of LEDs. LEDs are not light bulbs and do not share many characteristics with them either.

I had my multimeter on my DH123 yesterday and it was putting out 13.68 V on the lighting wires, so it sounds to me that the 14 V figure is correct. To remove doubt, go to the website above, do the math, and save your LEDs and decoders.

I have some 1.6 mm OD white LEDs that are rated for 3.2 volts and 20 mA; the calculated resistor is 470 ohms.

I used a 470-ohm resistor and a 12.2-volt battery and measured 18.5 mA with 3.2 volts across the LED.

I have some 5mm inverted cone white LEDs that measure the same.

I use 1/4 watt resistors. Below is a wattage calculator link.

http://www.anderson-bolds.com/calculator.htm
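As a sanity check on those bench numbers, the resistor's dissipation can be worked out with P = I²R (a sketch using the measured 18.5 mA and the 470-ohm resistor from the post above):

```python
# Wattage check for the bench measurement: is 1/4 W enough?
R = 470.0       # series resistor, ohms
I = 0.0185      # measured current, amps (18.5 mA)
p = I * I * R   # P = I^2 * R, power dissipated in the resistor
print(round(p, 3))  # ~0.161 W, comfortably under a 1/4 W (0.25 W) rating
```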

rich

So what you are saying is that 14 volts with 450 ohms would work for most apps? I was told that 12 volts with 750 ohms would work, but I wasn't sure.

Also, I will look at the links that were sent; they look quite helpful. Thanks, guys. When I started this project, I was told by several people to do it one way or another, which got me confused about it.

If you take RMC, they had a great series on DCC. I collected all the articles and compiled them into a little binder for reference. My point is, if it's possible, take a look back through those issues. I got a set of 3mm LEDs from Eberle Trains (now defunct, I think) that also came with 1K resistors. I think you'll do all right provided you know for sure the values of your components. Guessing in electronics kind of equates with smoke and mirrors. Good luck.

The values I previously mentioned were just workbench measurements. I would rather not run at the current limit of 20 mA, so I use 750 ohms in my engines just to be cautious.

Rich

You’re best off with 1k, 1/4-watt resistors. Many decoders produce a surge current for light bulbs when powering up (some do have a CV setting just for LEDs). This surge current can cause an LED to fail over time, even if you have the “correct” calculated resistor value. The 1k will protect the LED and extend its life.

All the calculations are correct in theory, but not necessarily right for the real world. The 1k gives you a good fudge factor and works with any LEDs in the 2 to 4 volt range without affecting the brightness much, and you don’t need to keep a whole bunch of different resistor values around.
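That fudge-factor claim is easy to check numerically. A short sketch, assuming a 12 V decoder output and sweeping the 2 to 4 V forward-voltage range mentioned above:

```python
# Current through a 1k series resistor for LEDs of varying forward voltage.
R = 1000.0  # the one-size-fits-all 1k resistor
for vf in (2.0, 3.0, 4.0):  # forward voltage sweep, volts
    i_ma = (12.0 - vf) / R * 1000.0  # current in milliamps
    print(f"Vf={vf} V -> {i_ma:.1f} mA")
# Current stays roughly 8-10 mA across the whole range: bright enough,
# and well under the 20 mA limit, leaving headroom for surges.
```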

I just happened to read this morning in the Soundtraxx manual that certain Soundtraxx decoders prefer incandescent bulbs in the 12-16 V range.

Here is info concerning this issue. I use this fix with LEDs in LC conversions for my HO steam engines.

http://www.tonystrains.com/technews/soundtraxx-lcleds.htm

http://www.members.optusnet.com.au/nswmn1/LEDs_DSDs.htm

http://www.members.optusnet.com.au/nswmn1/Lights_in_DCC.htm

rich

LEDs are not light bulbs. A light bulb wants a constant-voltage drive, and it will limit its current all by itself. LEDs want a constant 20 mA current and do nothing to limit that current. Think of an LED as a rectifier that just happens to emit light when it’s conducting. Rectifiers are either OFF or ON, like a switch. If you were to hook a switch from plus to minus on your power pack and turn it ON, the power pack would see a dead short circuit. Current through a short circuit goes to infinity; in the real world, something melts before you get very close to infinity. Hooking up an LED to power without a series resistor is just like closing a switch: you get a short circuit. LEDs are fairly tender, and they blow out in microseconds (faster than the eye can see), thus protecting the power pack at the cost of their own life. Very noble-minded components, LEDs are.

So, LEDs always need a series resistor. Find the needed value with Ohm’s law, R = V/I. I is always 20 milliamps (0.02 A) and V is the driving voltage from whatever you are powering the LED with. This calculation takes a shortcut: we are assuming that the voltage drop across the LED is zero.

If you want to be a little more accurate, subtract the voltage drop across the LED from the driving voltage. LEDs usually drop about 2.75 volts at full brightness. I don’t pay much attention to the ‘voltage rating’ of individual LEDs from the catalog or data sheet; it’s usually a worst-case number (“All my LEDs will be less than 3.5 volts at max current on a cold day”). When you measure the actual voltage drop across a glowing LED, it’s pretty close to 2.75 volts no matter who made the LED in question. If you are driving the LED from something like 12 or 14 volts, the LED voltage drop is not very significant.
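The two-step recipe above can be wrapped in a small helper (a sketch; the 2.75 V default drop and 20 mA current are the figures quoted in the post, and the function name is made up for illustration):

```python
# dstarr's recipe: shortcut version (ignore the LED drop) vs. the
# more accurate version (subtract ~2.75 V forward drop first).
def led_resistor(v_drive, v_led=2.75, i=0.020):
    """Series resistance by Ohm's law: R = (Vdrive - Vled) / I."""
    return (v_drive - v_led) / i

print(led_resistor(14.0, v_led=0.0))  # 700.0 ohms, the shortcut
print(led_resistor(14.0))             # 562.5 ohms, subtracting 2.75 V
```

Either answer gets rounded up to the next standard resistor value on hand, which is why the shortcut is usually good enough at 12-14 volt drive levels.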


Wow thanks guys. Lots of great info and links here.

Gonna be reading for a while I see :smiley:

Thanks again :slight_smile:

If I could just borrow this thread for a moment, on a similar LED question: I have a 3.3 V computer power pack that I am using to power my structure lights. I have one complex that so far has 12 white LEDs, a combination of 3mm and 5mm. I currently have them wired in parallel directly off the power supply. Someone said that I should have a limiting resistor for current protection. Is that a valid statement? If so, how large a resistor should I use? These LEDs have about 10 hours on them so far with no problems. Thanks

Terry from Florida