So I’m working on a little project that involves a bunch of LEDs and I ran into something I didn’t understand. I had always understood that LEDs effectively had no resistance and had a maximum current they would pass before frying. For most that’s about 20-30 mA. This is why whenever you hook up an LED you put a resistor in series. It all follows Ohm’s law very easily.
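To make that concrete, here’s a quick throwaway Python sketch of the usual series-resistor math. The 5 V supply, 2.0 V drop, and 20 mA target are just example numbers I picked, not from any particular datasheet:

```python
# Series resistor for a single LED: drop the leftover voltage across R
# so the LED only sees its rated current. Example numbers only.
supply_v = 5.0      # supply voltage (assumed)
led_vf = 2.0        # LED forward voltage (typical-ish for a red LED)
target_i = 0.020    # desired current in amps (20 mA)

resistor = (supply_v - led_vf) / target_i   # Ohm's law on the leftover voltage
print(f"Series resistor: {resistor:.0f} ohms")   # -> 150 ohms
```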
The trouble came yesterday when I noticed a circuit diagram that uses 6 LEDs in series, direct from the power supply to ground with no resistors. With what I knew, if that power supply could push more than 6 × 20 mA = 120 mA the LEDs should blow up. Obviously they weren’t, so what is going on?
Here’s what I’ve come up with, and I’d appreciate it if you would tell me if I am wrong.
LEDs have a fairly unique (I think) property called forward voltage drop. An average LED always drops 2.0 volts no matter how much you put in or what else falls on the other side of the circuit. An LED has two important ratings: voltage and max current. An average LED might be 2.0 V and 20 mA. The LED does not conduct until the forward voltage reaches the voltage rating, and then at that voltage the LED will draw the current rating. So, if you feed an LED exactly 2.0 V it will draw 20 mA no matter how much the power supply is willing to provide. So, with those numbers we can calculate that the LED has an effective resistance of 100 ohms. That’s the part I am not sure I am right on 🙂
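Here’s that back-of-the-envelope calculation in Python, just to show my arithmetic. Treating the LED as a plain resistor is my own assumption here, which is exactly the part I’m unsure about:

```python
# "Effective resistance" of the LED at its rated operating point.
# This treats the LED like a plain resistor, which is the assumption
# I'm not sure about -- real diodes aren't linear like this.
led_vf = 2.0        # rated forward voltage
led_i = 0.020       # rated current, 20 mA

effective_r = led_vf / led_i    # Ohm's law: R = V / I
print(f"Effective resistance: {effective_r:.0f} ohms")   # -> 100 ohms
```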
With that in mind, if we give the LED 5.0 V, assuming its internal resistance doesn’t change, then the current is going to be higher. We get 50 mA, which is past the LED’s maximum current and into the territory where it gets damaged.
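Again in throwaway Python, here’s what that fixed-resistance assumption predicts as the supply voltage goes up. The 30 mA ceiling is just the top of the 20-30 mA range I mentioned earlier:

```python
# Current vs. supply voltage if the LED really were a fixed 100 ohm resistor.
# Purely illustrative -- it follows from my assumption above, not from a datasheet.
effective_r = 100.0     # ohms, from the 2.0 V / 20 mA point
max_i = 0.030           # rough damage threshold (30 mA)

for supply_v in (2.0, 3.0, 4.0, 5.0):
    current = supply_v / effective_r
    status = "OK" if current <= max_i else "fried"
    print(f"{supply_v:.1f} V -> {current * 1000:.0f} mA ({status})")
# 2.0 V -> 20 mA (OK), 3.0 V -> 30 mA (OK), 4.0 V -> 40 mA (fried), 5.0 V -> 50 mA (fried)
```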
The problem comes when you provide the LED with more voltage than its forward voltage drop. When the current gets to ground the voltage has to be 0, so if you provide more voltage than the LED is willing to drop, it’s going to have to use more current to make up the difference. Also throw into the mix that an LED’s resistance goes down as current goes up, so the increase is exponential instead of linear. I haven’t been able to find much information on that yet, so I’ll just let it stand.
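The closest thing I’ve found to a formula for that exponential behaviour is the standard diode equation, so here’s a rough sketch using it. The ideality factor, the thermal voltage, and the way I back-fit the saturation current to the 2.0 V / 20 mA rating are all my guesses for a generic LED, and it ignores the LED’s internal series resistance, so treat the exact numbers with suspicion:

```python
import math

# Shockley diode equation: I = Is * (exp(V / (n * Vt)) - 1)
# n (ideality factor) and Vt (thermal voltage at room temp) are assumed values;
# Is is back-fitted so the curve passes through the 2.0 V / 20 mA rating.
n = 2.0                 # assumed ideality factor for an LED
vt = 0.02585            # thermal voltage at ~300 K, in volts
i_rated, v_rated = 0.020, 2.0

i_sat = i_rated / (math.exp(v_rated / (n * vt)) - 1)   # fitted saturation current

for v in (1.9, 2.0, 2.1, 2.2):
    i = i_sat * (math.exp(v / (n * vt)) - 1)
    print(f"{v:.1f} V -> {i * 1000:.1f} mA")
# Roughly: 1.9 V -> 3 mA, 2.0 V -> 20 mA, 2.1 V -> 138 mA, 2.2 V -> 957 mA
```

The point isn’t the exact values, just that a tenth of a volt extra across the LED multiplies the current several times over.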
So, what I think is going on is that if the LED voltage drops all add up to the total voltage supplied, then the voltage at ground is 0 (as it should be) and the LEDs don’t have to consume more current than they want to. If the supply varies by a volt here or there, it’s okay because the LEDs have a little headroom in how much current they can handle.
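Putting numbers on that with the same effective-resistance picture as above (the exponential curve earlier suggests the real story is less forgiving, but this is the version of the math in my head, and the 12 V supply is just what six 2.0 V drops add up to):

```python
# Six LEDs in series, still pretending each one is a fixed 100 ohm resistor.
# If the supply matches the sum of the forward drops, the current lands at 20 mA;
# an extra volt spreads across all six, so the current only creeps up a little.
led_r = 100.0           # assumed effective resistance per LED
num_leds = 6
max_i = 0.030           # rough 30 mA damage threshold

for supply_v in (12.0, 13.0, 14.0):
    current = supply_v / (num_leds * led_r)
    status = "OK" if current <= max_i else "fried"
    print(f"{supply_v:.1f} V -> {current * 1000:.1f} mA ({status})")
# 12.0 V -> 20.0 mA (OK), 13.0 V -> 21.7 mA (OK), 14.0 V -> 23.3 mA (OK)
```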
Does that make any sense? I hope it does, ’cause I feel like I *get* it now 🙂