I’ve decided to try my hand at wiring LEDs as turnout indicators on my layout fascia. I know the resistor value I need, but in shopping for them I’ve found two types - “metal film” and “carbon film” resistors. Is there a difference in function, and is one type preferable for what I’m doing? Thanks.
I believe the metal film ones offer better thermal stability, but for LED circuits I’m sure that standard carbon-film resistors will work just fine for you. And they should be cheaper and easier to obtain.
Metal film resistors also have a tighter tolerance… they are usually more “dead on” resistance-wise (1% typical) than a carbon film resistor (5% typical). Either way, your LEDs won’t notice the difference.
For this application, don’t waste the money on metal film resistors. If you were building a precision instrument or something that needs to be dead on, like an RF oscillator circuit, then you’d want the more precise components.
This is also why, when calculating a resistor value for an LED, you don’t calculate for the LED’s maximum rated current, but rather for somewhere in the middle. That way the tolerance is meaningless: if the resistance comes out a bit high, the LED will still light; if it comes out a bit low, the current still won’t exceed the LED’s rating.
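To make the middle-of-the-range idea concrete, here’s a quick sketch of the usual series-resistor calculation. The 12 V supply, 2.0 V forward drop, and 20 mA maximum are assumed example values, not figures from this thread - plug in your own numbers:

```python
def led_resistor(v_supply, v_forward, i_target):
    """Series resistor value (ohms) for a target LED current (amps)."""
    return (v_supply - v_forward) / i_target

# Example (assumed values): 12 V supply, 2.0 V LED forward drop,
# 20 mA maximum rating. Aim for roughly half the max, i.e. 10 mA:
r = led_resistor(12.0, 2.0, 0.010)
print(r)  # 1000.0 ohms - a standard 1 kilohm part
```

At 10 mA instead of 20 mA, even a resistor that’s 5% off still leaves the current comfortably inside the LED’s rating.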
5% tolerance resistors, including the Carbon-film type, are the norm. I haven’t seen a resistor listed from one of my suppliers with a tolerance worse than that in years. Even Radio Shack is stocking 5% parts.
As I recall, metal film resistors are rated for higher-wattage applications, which lends them better stability in most circuits because they are less likely to heat up as much. For LED applications, quarter- or half-watt carbon resistors should be more than adequate, and lower in cost.
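You can double-check the wattage point with the dissipation formula P = I²R. A quick sketch, again using assumed example values (10 mA through a 1 kilohm resistor, as in a typical 12 V panel-indicator circuit):

```python
# Power dissipated in the series resistor: P = I^2 * R
# (assumed example values, not figures from this thread)
current = 0.010     # amps (10 mA through the LED and resistor)
resistance = 1000   # ohms (1 kilohm series resistor)

power = current ** 2 * resistance
print(power)  # 0.1 watts
```

That 0.1 W is well under a quarter-watt (0.25 W) rating, so a 1/4 W carbon film part has plenty of margin for fascia indicators.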