This question’s for you electronics-savvy guys out there. I’m looking for a little help connecting a series of LEDs for lighting the inside of a building. I read the articles in MR on it in Nov. 09 and May 08 a dozen times, but my brain just freezes up on anything electrical. So I’m asking if there’s a simple formula for putting a series of LEDs together. I want about 6 LEDs in a series, and I’m using 12 to 14 volts DC. Can anyone tell me what size resistor I’ll need? Is there a simple (and I mean simple) formula when it comes to resistor size? Thanks for any help!
This question would probably be better off in the electronics/dcc forum, but I’ll give it a shot. I am not 100% positive, but I do not believe you can “daisy chain” LEDs. I would simply run a pair of power bus wires inside the structure and hook them to the bus. It would be fairly easy to do, since you wouldn’t need anything heavier than 22-gauge wire for a power bus. This way you would only need one resistor to knock down the voltage.
Personally, I’ve never understood why people want to put LEDs in series. It’s so simple to wire them in parallel. However, here’s the formula for determining the size of a resistor for LEDs: (supply voltage minus LED voltage) divided by LED current in amps.

Example: operating an LED on 12 volts. The voltage for an LED is usually 2 volts, but some may be more or less. The current for an LED is usually 20 milliamps (0.02 amps), but again some may be more or less, and 15 milliamps is better for the LED and doesn’t make a noticeable difference in brightness. So our calculation is 12 minus 2 equals 10, divided by 0.02 equals 500. That’s not a standard resistor value, so we go for the next largest standard value, which is 560 ohms. 1/4 watt resistors are just fine. Important: EACH LED needs its own resistor.

Now, to put the LEDs in series, you need to add the current of all the LEDs. Say you want 6 LEDs in series. That’s 0.12 amps. For a variation, I’ll take your 14 volts. 14 minus 2 equals 12, divided by 0.12 equals 100, which is a standard resistor value. Using the formula for power dissipated in the resistor (either voltage times current or current squared times resistance) we come up with 1.44 watts, so you would need a 2 watt resistor. That’s a second disadvantage to putting LEDs in series: you need a higher wattage resistor. The first disadvantage is that if a connection opens up anywhere or an LED blows, all the LEDs go out, just like the old Christmas tree lights. Well, that’s my opinion for what it’s worth.
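The per-LED part of that calculation (which is the correct part for parallel wiring) can be sketched in a few lines of Python. This is just a sketch using the numbers from the post above (12 V supply, 2 V LED drop, 20 mA); the standard-value list is the common E12 series:

```python
# Size a current-limiting resistor for a single LED:
#   R = (supply voltage - LED forward voltage) / LED current
# Then round UP to the next standard (E12-series) value.

E12 = [100, 120, 150, 180, 220, 270, 330, 390, 470, 560, 680, 820]

def led_resistor(v_supply, v_led, i_amps):
    """Return (ideal ohms, next-larger E12 standard value)."""
    ideal = (v_supply - v_led) / i_amps
    # Scan E12 values across decades for the first one >= ideal.
    for decade in (1, 10, 100, 1000):
        for base in E12:
            std = base * decade / 100  # the E12 list above is the 100-820 decade
            if std >= ideal:
                return ideal, std
    return ideal, None

ideal, standard = led_resistor(12, 2, 0.02)
print(ideal, standard)  # 500.0 ideal -> 560 ohm standard value
```

This matches the worked example: 12 minus 2 equals 10, divided by 0.02 equals 500, rounded up to the 560 ohm standard value.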
This is NOT correct. The currents do NOT add when things are put in series; the voltages do add.
People avoid putting LEDs in series for several reasons, as Paul mentioned; but the big reason is that you have to know the voltage drop of each LED precisely, and that is something the manufacturer does not control well.
Run them in parallel, deal with each LED individually. It is a much more robust situation.
The LEDs can be put in series if you know what voltage it takes to power each one. If it takes 2 volts to start each LED, and the limit is 3 volts (which is pretty common), then put 6 LEDs in series with your 12-14 volts, and you won’t need resistors.
Or you can handle them individually, with a resistor on each one. If an LED ever fails, it will be a lot easier to find it than with series LEDs.
Simple reason - to minimize current requirements. Six LEDs wired in parallel drawing 10 mA each consume 60 mA; six LEDs wired in series drawing 10 mA each consume 10 mA. The current is the same throughout a series circuit.
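As a quick sanity check of that arithmetic (assuming the 10 mA per LED figure from the post, and working in milliamps):

```python
# Supply current for 6 LEDs at 10 mA each, parallel vs. series.
per_led_ma = 10
n = 6
parallel_draw_ma = n * per_led_ma   # each LED is its own branch, so currents add
series_draw_ma = per_led_ma         # one current flows through the whole string
print(parallel_draw_ma, series_draw_ma)  # 60 10
```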
You don’t have to know the voltage drop any more precisely than when wiring up one LED.
There are only two real problems with wiring LEDs in series: first, if one blows, the whole string quits working (not as much of a problem with LEDs, because of their long life, as it is with bulbs); second, you cannot easily adjust the brightness of individual LEDs (you can dim one individually by placing a resistor in parallel with the LED you want to dim). Since all LEDs placed in series run at the same current, you generally want to use the same type of LED; otherwise you will likely get variations in brightness.
Here is a neat LED series/parallel array wizard where you give your source voltage, the forward voltage drop of your LEDs, the current you wish to run them at (which is usually LESS than the peak current listed in an LED’s specs), and the number of LEDs you want, and it will give you a series/parallel circuit for driving the LEDs (if the total voltage drop of the LEDs is less than the supply voltage, it will give you a series circuit).
CSX Robert and Nigel, you’re right. It’s the voltage drop that’s added, not the current. My mistake. I knew that, my mind just wandered when I made the post and I confused the two factors.
I’d like to thank all those who replied, with an extra thanks to CSX Robert for that link to the LED wizard. Like I said, I’m inept at many things, electronics being up there near the top, so I’ll take everyone’s suggestion and wire them in parallel. Thanks again.
LEDs can be put in series, and it’s usually done for the same reason that incandescent lamps are put in series: to use a power supply with an output several times the voltage rating of the bulb or LED. The only difference with LEDs is that you need to observe polarity and use a DC power supply. With a lot of LEDs or lamps you’ll have to be concerned with current draw, and one thing not mentioned above: if you wire all the LEDs in parallel, you’ll need a resistor for each one, which means you need to add up the current draw of every LED/resistor branch to find out whether you’re going to overload your power supply.
When I built my 9 stall roundhouse, I used 3 volt LEDs, 3 tied in series, each group tied in parallel, and all driven by a 9 volt wall wart power supply from a defunct rechargeable drill.
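That roundhouse arrangement can be checked the same way. This sketch assumes details the post doesn’t give: one LED per stall (9 total, so 3 strings of 3) and roughly 20 mA per string:

```python
# Series/parallel array check: strings of three 3 V LEDs on a 9 V supply.
v_led = 3
leds_per_string = 3
n_strings = 3          # assumed: 9 LEDs total, one per stall
i_string_ma = 20       # assumed current per string, in milliamps

string_drop = leds_per_string * v_led      # drops add within each string: 9 V
total_current_ma = n_strings * i_string_ma # parallel strings add their currents
print(string_drop, total_current_ma)  # 9 60
```

Each string drops the full 9 volts, which is why this works straight off the wall wart, and the supply only has to source the sum of the string currents.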