Let me start by saying that I am no electronics expert. There was a time when I didn’t know the difference between a resistor and a diode.
That said, I use resistors extensively on my layout to control LED lighting on signals and control panel lights.
For the most part, I use resistors that I purchase in 5-packs from Radio Shack.
All of my signals, searchlights and dwarfs alike, are from Tomar Industries.
All of my control panel LEDs are from Miniatronics.
When I first got into this back in 2004, I was advised to use resistors ranging from 330 ohms to 1500 ohms: the higher the resistance, the dimmer the LED; the lower the resistance, the brighter the LED.
That was the extent of my knowledge until I recently bought my latest searchlight signal and found that its LED was much brighter than previous LEDs with the same value resistor.
So, yesterday, I wound up experimenting with higher-value resistors from Radio Shack, including 2.2K, 3.3K, 3.9K, 4.7K, and 5.9K ohms.
Previously, I might have thought that a 5.9K ohm resistor would essentially prevent the LED from lighting at all. But it turns out that even with a 100K resistor, an LED will still light, albeit very dimly.
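From what I've read, the pattern above is just Ohm's law applied to the dropping resistor: the current through the LED is roughly the supply voltage minus the LED's forward drop, divided by the resistance. Here is a rough sketch I put together to see the numbers. I'm assuming a 12 V supply and a 2 V forward drop, since neither is something I actually measured, so treat the values as illustrative only:

```python
# Rough estimate of LED current using Ohm's law: I = (Vs - Vf) / R.
# ASSUMPTIONS (not measured): 12 V supply, ~2 V LED forward drop.
SUPPLY_V = 12.0
LED_FORWARD_V = 2.0

def led_current_ma(resistor_ohms):
    """Approximate current through the LED, in milliamps."""
    return (SUPPLY_V - LED_FORWARD_V) / resistor_ohms * 1000

for r in (330, 1500, 5900, 100_000):
    print(f"{r:>7} ohms -> {led_current_ma(r):6.2f} mA")
```

With these assumed values, 330 ohms gives about 30 mA and 100K gives about 0.1 mA, which would explain why the LED still glows faintly even at very high resistance. I've also read that typical small indicator LEDs are rated for something like 20 mA maximum, which may be why lower resistances shorten their life, but I'd welcome correction on that.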
Now I worry that these LEDs will have a shorter life if the resistance is too low.
Can the experts weigh in here and give us electronics novices a primer on the proper use of resistors?
Rich