Looking for some LED help. I model in HO. I was wondering what the optimum voltage power should be resisted down to in order to maximize LED life and also get the best realism. I love LEDs, but most of you have forgotten more about LED lighting than I’ll ever know. Any suggestions? Please weigh in.
Most LEDs run at about 2-3 volts and no more than 20 mA. If you have a 12-15 V power source (like an old power pack, or a DCC decoder function output), then you should be safe with a 1000 ohm resistor. I have seen anywhere from 680 ohm to 1000 ohm used with most LEDs - exceeding the current rating of the LED is what kills them. I have never had an LED ‘burn out’, but exceeding that rated current limit will do them in. They are usually rated for 10,000 - 50,000 hours of life.
Jim
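Jim’s rule of thumb can be sanity-checked with Ohm’s law. This is a rough sketch; the 2.5 V forward drop is an assumed typical value, not a datasheet figure:

```python
# Sanity check of the "12-15 V supply with a 1000 ohm resistor" rule of thumb.
# The ~2.5 V forward drop is an assumption; check your LED's datasheet.
def led_current_ma(supply_v, resistor_ohm, forward_v=2.5):
    """Current through a single LED plus series resistor, in milliamps."""
    return (supply_v - forward_v) * 1000 / resistor_ohm

for supply in (12, 15):
    print(f"{supply} V, 1 kohm -> {led_current_ma(supply, 1000)} mA")
# 12 V gives 9.5 mA and 15 V gives 12.5 mA -- both well under the 20 mA limit.
```

So the 1000 ohm value keeps the LED comfortably inside its rating across that whole supply range.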
I agree with everything Jim says. All of my locos have LED headlights and ditch lights.
By the way, 10,000 hours of life is equivalent to a little over a year of constant use: if you left the LED on at full power, it would last a bit more than a year. Of course, most people don’t leave their locomotive headlights on constantly for over a year.
Jim is definitely right in everything he says. The other thing I would add is that although most LEDs run at about 2-3 volts, it’s more of a “depends which ones you get” situation. I am an electrical engineer, and I can tell you that you can get LEDs that will run at almost any voltage and handle a variety of currents. Digikey.com is a great place to get just about every electrical component you could ever want. They sell all sorts of sizes and shapes.
Another thing people tend to forget is that it takes a minimum of about 1.7 volts to turn on a basic LED. That also means there is a 1.7 V drop across the LED itself. In other words, two LEDs in series would need at least a 3.4 V source to turn on.
If you have not purchased anything yet, I would advise planning things around a 12 V or 18 V system. Those are common voltages in electronics, and a great many components are available in those ratings.
To answer the original question, I guess I would say the optimal voltage is the rated voltage. Every package should tell you what voltage the LEDs should be run at, and that is typically best. It can take more than one resistor to get the voltage drop you are looking for unless you know the current draw of the circuit, so I would advise a voltage divider made of two resistors: one in series, one to ground. This divides the voltage by the ratio of the resistors (as long as the load draws much less current than the divider itself). Sorry for the tangents.
Best of luck
Dave
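Dave’s two-resistor divider works out like this; the values below are illustrative assumptions, and note that the simple formula only holds when the load draws far less current than the divider itself (for a single LED, one series resistor is usually simpler):

```python
def divider_out(v_in, r_series, r_ground):
    """Unloaded voltage-divider output: V_out = V_in * R_ground / (R_series + R_ground).

    Only accurate when the load current is small compared to the
    current flowing through the divider itself.
    """
    return v_in * r_ground / (r_series + r_ground)

# Example: split a 12 V supply down to 3 V with a 3:1 resistor ratio.
print(divider_out(12, 3000, 1000))  # -> 3.0
```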
Since LEDs are current driven and not voltage driven, the supply voltage really doesn’t matter as long as you have the correct value resistor. For full brightness, the supply voltage has to be at or slightly above the LED’s rating; too close to the rating and you won’t be able to find a resistor of the proper value. Anything 5 volts and above will easily run LEDs, and resistors to limit the current to within the LED’s safe range are easily available.
Remember that the current rating for an LED is the MAXIMUM. Don’t size resistors based on that value - if your power supply actually peaks higher than its voltage rating (common for unregulated power supplies), you could actually be overdriving the LEDs. If the LED specification is, say, 20 mA, calculate the resistor using 10-15 mA, not 20. That way, if the supply voltage is a bit higher than expected, the LED is still safely protected.
–Randy
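Randy’s derating rule can be sketched as a resistor-picking calculation. The 2.5 V forward drop and the short list of standard values are assumptions for illustration:

```python
# Size the resistor for a derated target current (10-15 mA) rather than
# the 20 mA maximum. The 2.5 V forward drop is an assumed typical value.
E12 = [1000, 1200, 1500, 1800, 2200, 2700, 3300]  # common resistor values (ohms)

def pick_resistor(supply_v, target_ma, forward_v=2.5):
    """Smallest standard resistor that keeps current at or below target_ma."""
    r_min = (supply_v - forward_v) * 1000 / target_ma
    return next(r for r in E12 if r >= r_min)

# 14 V supply, 12 mA target: ideal R = 11500/12, about 958 ohm -> use 1000 ohm.
print(pick_resistor(14, 12))  # -> 1000
```

Rounding the resistor value up (never down) means a high-peaking supply still leaves the LED inside its safe range.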
LEDs do not have an “optimum voltage”. LEDs want a constant current, not greater than 20 mA. If you exceed 20 mA, the LED becomes a Darkness Emitting Diode (DED for short). 10 mA will light them up good and bright.
To a first approximation, the needed current-limiting resistor is given by Ohm’s law:
R = V / I, where V is the power supply voltage and I is the LED current (20 mA or less).
For a second approximation, let V equal the power supply voltage minus the voltage drop across the LED, which is about 2.5 volts.
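The two approximations above can be written out directly (the 2.5 V LED drop is the same assumed figure as in the text):

```python
def r_first(supply_v, current_ma):
    """First approximation: R = V / I (current given in mA)."""
    return supply_v * 1000 / current_ma

def r_second(supply_v, current_ma, led_drop=2.5):
    """Second approximation: subtract the ~2.5 V LED drop from the supply first."""
    return (supply_v - led_drop) * 1000 / current_ma

# 12 V supply at 10 mA:
print(r_first(12, 10))   # -> 1200.0 ohms
print(r_second(12, 10))  # -> 950.0 ohms
```

The second approximation gives a slightly smaller resistor because part of the supply voltage is already dropped across the LED itself.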
An LED best practice that I’ve not seen mentioned in this thread.
Use 1 resistor per LED.
You can do all the aforementioned calculations or you can keep it simple just as Selector explained. For 99.99% of our MR needs, the following is all you need to know:
I base everything on 12-14 volts, whether it’s a decoder* or power pack (for lighting buildings, etc). Add a 1000 ohm, 1/4 watt resistor in series on one leg of each LED. Hook up all LEDs in parallel. I have yet to have any fail this way. Simple.
Best practices: Use heat shrink tubing to insulate all solder joints. If the LED doesn’t light up at first, flip the leads.
*skip the resistor if the decoder has LED outputs.
Modelmaker - on the heat-shrink tubes, is a hair dryer sufficient to give enough heat to shrink it down?
Generally, no. But you don’t need a heat gun, either. Instead of a simple soldering iron, which keeps heating perpetually while it’s plugged in, get an inexpensive temperature-controlled soldering station. Mine was under $50. This helps for many reasons, mainly that it doesn’t get insanely hot and let the tip oxidize quickly. Clean tip = faster heat transfer = better joints and less chance of overheating or melting anything. And when you dial it back, it’s hot enough to shrink the tubing without melting it and turning the tip into a big plastic mess.
–Randy
I’ve been working on a signal bridge that involves a total of 10 LEDs in green, yellow and red. After a lot of frustration, I discovered that it was not my wiring that was flaky, it was my circuit. Putting a red LED in parallel with a green one, it turned out, caused the red one to work and the green one to stay dark. Once I established that, I re-designed the circuit to all-series connections, and all the LEDs worked just fine.
Moral: Wire LEDs in series, not in parallel.
Oh, yeah. Test, test, test. I set up a terminal block on my workbench. I connected a couple of resistors, and I run clip-leads to a 9-volt battery. As I solder each new connection, I can immediately check out the wiring right there on the bench. I bought a set of 5 color-coded clip-leads at Radio $hack, which have proven very useful in this kind of testing.
Your problem isn’t due to parallel or series wiring; your problem was incorrect resistor values for two different LEDs. Red LEDs are much more efficient than green, and current will ALWAYS take the path of least resistance. Assuming you had the same value of resistor on both the red and green LEDs, with the red being more efficient, the bulk of the current will flow through the red LED, starving the green LED of enough current to illuminate it. If you had either increased the resistor on the red LED or decreased the resistor value on the green one, they would work fine in parallel … the trick is finding the balance point of the resistors unless you have exact specifications for the LEDs in question.
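The clamping effect described here can be shown with a toy calculation. The forward voltages are typical assumed values (roughly 1.8 V for red, 2.2 V for green), not datasheet figures:

```python
# Two bare LEDs wired directly in parallel share one node voltage.
# The diode with the lower forward voltage conducts first and clamps
# that node, so the other LED never reaches its turn-on threshold.
# Forward voltages below are typical assumptions, not datasheet values.
VF_RED, VF_GREEN = 1.8, 2.2

node_v = min(VF_RED, VF_GREEN)  # red clamps the shared node
green_on = node_v >= VF_GREEN
print(f"node clamped at {node_v} V; green LED on: {green_on}")  # green stays dark
```

With its own properly sized resistor in front of each LED, each branch sets its own current and the clamping problem disappears.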
I consistently wire two white LEDs in parallel with a single resistor for headlights. I also run strings of LEDs in series with NO resistors (seven ~2.2 V LEDs add up to roughly the 14.4 V supply) in structures. I even have LEDs that I purposely short out so they turn off when another LED is turned on (the cab interior light shorts off when the headlight is turned on). After 15+ years of operation, I have yet to have any fail! Granted, none of the aforementioned designs are “best practices”, but unconventional methods DO work as well.
Mark.
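Mark’s resistor-less series string works because the forward drops add up to roughly the supply voltage, leaving almost no excess to push the current up. A quick headroom check (2.2 V per LED is an assumed typical drop):

```python
def string_headroom(supply_v, led_count, forward_v=2.2):
    """Voltage left over after a series string of LEDs takes its drop."""
    return supply_v - led_count * forward_v

# Seven ~2.2 V LEDs against a 14.4 V supply: the combined drop slightly
# exceeds the supply, so the string runs dim but safe with no resistor.
print(round(string_headroom(14.4, 7), 1))  # -> -1.0
```

If the headroom came out strongly positive, the string would need a resistor after all, since any excess voltage would drive the current up sharply.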
I should have been more specific, I was speaking of “like” LEDs, not different colored ones with different values.