Light bulbs, resistors, and volts

Who can remind me of a formula by which I could match light bulbs and resistors in order to drop voltage to safe levels?

E.g. I have 1.5V bulbs and a 4.5V power supply. What value in ohms – and watts – should I use in series with the bulb?

It all depends on how much current your bulb draws.

To figure out the value of a dropping resistor so you can operate a light bulb from a voltage that's higher than the bulb is rated for, do this. Subtract the bulb's voltage from the power supply voltage, then divide that answer by the bulb's current in amps. Most bulbs will have their current expressed in milliamps (thousandths of an amp); to convert milliamps to amps, simply move the decimal 3 places to the left.

To use your example, 4.5 - 1.5 = 3. I don't know how much current your lamps draw, so let's pretend it's 30 milliamps. That translates to 0.030 amps. 3 divided by 0.03 is 100, so you'd use a 100 ohm resistor.

To figure out what wattage the resistor needs to be, either square the current and multiply by the resistance (in ohms), or multiply the voltage drop (that's 3 in your case) by the current (in amps). It works out to 0.09 watts, so a 1/4 watt resistor will be fine.

In this case the resistor value worked out to be one of the standard values, but if it doesn't, use the next largest standard value.
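If it helps, here's the same arithmetic as a little Python sketch. The 4.5 V supply and 1.5 V bulb come from the question above; the 30 mA bulb current is just the pretend figure from my example, so swap in your bulb's real rating.

```python
def dropping_resistor(supply_v, bulb_v, bulb_ma):
    """Return (resistance in ohms, resistor dissipation in watts)."""
    drop_v = supply_v - bulb_v    # voltage the resistor has to drop
    amps = bulb_ma / 1000.0       # milliamps -> amps (decimal 3 places left)
    ohms = drop_v / amps          # Ohm's law: R = V / I
    watts = drop_v * amps         # P = V * I (same answer as I squared times R)
    return ohms, watts

ohms, watts = dropping_resistor(4.5, 1.5, 30)
print(ohms, watts)  # roughly 100 ohms and 0.09 W, so a 1/4 W resistor is fine
```

If the ohms figure isn't a standard resistor value, round up to the next standard value, as above.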

Hope this helps.

That should do it – thanks, SeaMonster

If you know what voltage a bulb is rated for but don't know how much current it draws, use a multitester connected in series with the bulb. In other words, one lead of the tester goes to the power source (set to the same voltage as the bulb's rating) and the other lead goes to the bulb.

Here are some links that may help you.

http://www.hydrogenappliances.com/ohms_law/ohmslawcalculator.html

http://www.gbronline.com/radioguy/ohmslaw.htm

You guys rule. Thanks.

Seamonster, that was handy. In measuring electric power, neither the current in amperes nor the pressure in volts alone tells us the amount of power in a circuit at any moment. A combination of the two does tell us, very simply, because volts x amperes = watts:

3 volts x 120 amps = 360 watts
6 volts x 60 amps = 360 watts
12 volts x 30 amps = 360 watts
60 volts x 6 amps = 360 watts
120 volts x 3 amps = 360 watts
360 volts x 1 amp = 360 watts

A 1 hp motor is also a 746 watt motor, so if you think about it in model train talk, 1 1/2 watts would equal about 1/500th hp.

Al
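Al's table is easy to check in code. This is just a sketch of the volts x amps = watts combinations he lists, plus the horsepower conversion (1 hp = 746 W):

```python
# Each (volts, amps) pair from Al's table should multiply out to 360 watts.
combos = [(3, 120), (6, 60), (12, 30), (60, 6), (120, 3), (360, 1)]
for volts, amps in combos:
    print(f"{volts} V x {amps} A = {volts * amps} W")

# 1 hp = 746 W, so 1.5 W works out to roughly 1/500th of a horsepower.
print(746 / 1.5)  # about 497, i.e. close to 500
```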

Yup, voltage, current and resistance are all closely entwined with each other, and it's that combination which determines the power, and it's the power which does the work. It wouldn't do much good to apply 12 volts to a model locomotive if you could only get 10 mA (1/100 of an amp) out of it. If you think about it, an electric arc welder uses a voltage which is less than we use on our models (at least as far as my very limited knowledge of welding goes), yet it is capable of delivering a huge amount of current, which allows it to cut and melt metal. Not much voltage multiplied by lots of current equals lots of power. Of course, there are many, many other factors that get into this, which are way beyond the scope of this forum to discuss, and way more knowledge than the average modeller needs.