
GPIO drive high Vf LED w/o current limiting resistor?

Specifically, sinking a Cree C503B green LED (Vf 3.2 V at 20 mA) from a GPIO configured for high drive, without a current-limiting resistor, from VDD 3.6 V?

Figure 23, “GPIO drive strength vs Voltage, high drive, VDD = 3.0 V”, of the nRF52832 product spec seems to show that the voltage drop across the GPIO rises with current. My reasoning is that the circuit would stabilize at about a 0.5 V drop across the GPIO, 3.1 V across the LED, and 13 mA of current. I know an LED should be driven from a constant-current source; is the GPIO in some sense regulating the current, at least in this circuit? I am not an electrical engineer.
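To sanity-check that reasoning, here is a rough operating-point sketch. All component values are assumptions for illustration: the GPIO output is modeled as VDD behind a linear output resistance (the ~38 Ω figure is simply 0.5 V / 13 mA back-calculated from the estimate above, not a datasheet value), and the LED as an ideal diode above a knee voltage.

```python
# Back-of-the-envelope operating point for GPIO -> LED with no resistor.
# All values are illustrative assumptions, not datasheet figures.

VDD = 3.6        # supply voltage (V)
R_GPIO = 38.0    # assumed effective GPIO output resistance (ohm), ~0.5 V / 13 mA
VF_KNEE = 3.1    # assumed LED knee voltage (V)
R_LED = 0.0      # assumed LED dynamic resistance (ohm); 0 = ideal diode above knee

def operating_point(vdd, r_gpio, vf_knee, r_led):
    """Intersect the GPIO load line with a piecewise-linear LED model."""
    i = (vdd - vf_knee) / (r_gpio + r_led)
    if i < 0:            # supply below the LED knee: no current flows
        return 0.0, vdd
    v_led = vf_knee + i * r_led
    return i, v_led

i, v_led = operating_point(VDD, R_GPIO, VF_KNEE, R_LED)
print(f"LED current ~{i*1000:.1f} mA, LED voltage ~{v_led:.2f} V")
```

With these assumed numbers the sketch lands near the 13 mA estimate in the question, which is exactly why the answer below warns about the 15 mA pin limit.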

  • It's not a very good idea. High drive will still limit the current, although much less so than low drive; it will let you pull about 15 mA through the pin. That's OK as long as nothing else is pulling current through other pins. If something is, you're exceeding the total GPIO current budget of 15 mA and bad things are likely to happen.

    If you really want to drive a 20 mA LED, you should just put a driver circuit on it; an SOT-23 MOSFET is plenty good enough, plus a current limiter (unless your power supply is sufficiently close to the 3.2 V Vf anyway). The nRF series aren't like the chips they put into Arduinos, which are designed to drive things directly; the nRF is low power and doesn't have a big fat power bus in there.
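For reference, the usual series-resistor sizing behind that "current limiter" remark is one line of arithmetic. This is a hedged sketch with assumed numbers (3.6 V supply and the LED's rated 20 mA, both from the thread), not a recommendation for specific parts:

```python
# Classic LED series-resistor sizing: R = (VDD - Vf) / I_target.
# Values are illustrative assumptions taken from the thread.

def led_resistor(vdd, vf, i_target):
    """Series resistance and resistor power dissipation for a target LED current."""
    r = (vdd - vf) / i_target
    p_resistor = (vdd - vf) * i_target
    return r, p_resistor

r, p = led_resistor(3.6, 3.2, 0.020)   # 20 mA rated current
print(f"R = {r:.0f} ohm, dissipating {p*1000:.1f} mW")
```

With only 0.4 V of headroom the resistor comes out around 20 Ω, which also shows how sensitive the current is to Vf spread when the supply sits this close to Vf.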

  • I should explain more. I don't want to drive 15 mA, only 1 or 2 mA (that provides all the brightness I need).

    You might exceed limits when a GPIO load is low impedance. This LED (green, high brightness) is high impedance and drops 3.2 V. If the GPIO drops more than 0.4 V (at, say, a few mA), then the LED turns off, since the voltage across the LED is less than Vf? You could say it is a special case, a hack. Am I analyzing the circuit properly? Or does the circuit oscillate at such a high frequency that it doesn't produce any light?

    The typical Vf for this LED is 3.2 V, and the max Vf is 4.0 V. I will need to qualify the LEDs to make sure each instance's actual Vf is below 3.2 V. I should probably go with a resistor/MOSFET design just to ensure that all my board instances have the same brightness.

    I also don't want a resistor to limit current (a resistor wastes power as heat?) or board space for a MOSFET. I am probably over-optimizing: the wattage through the resistor is small compared to the wattage through the LED, and the board space is no big deal.

    I want blue/green: more visible to night vision. It produces one candela of brightness (in a 15 degree cone, at 2 mA), visible a mile away in the dark? I want that with as little energy as possible.

  • I think what you are saying might work in theory. The voltage drop across the LED and the GPIO's internal voltage drop will reach some sort of equilibrium, and the LED will draw a current that coincidentally happens to be within the range you need. However, this is definitely not how the GPIO is intended to be used, so if you choose to go down this path you are doing so at your own discretion.

    Anyway, as you say, a resistor wastes power as heat, but if you don't use an external resistor, the exact same power is wasted within the nRF52 instead. Power = voltage * current; it doesn't matter whether the voltage drop occurs over an external or internal resistance, or over an LED. Hence, if board space isn't important to you, I would recommend using an external resistor.
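That point can be made concrete with the thread's own numbers (3.6 V supply, 3.2 V Vf, a 2 mA target; all taken as assumptions for illustration):

```python
# Whether the 0.4 V excess is dropped in an external resistor or inside the
# GPIO driver, the dissipated power is the same: P = V_drop * I.
vdd, vf, i = 3.6, 3.2, 0.002    # assumed values from the thread
p_drop = (vdd - vf) * i          # wasted in the resistor OR in the pin driver
p_led = vf * i                   # dropped across the LED itself
print(f"drop: {p_drop*1e3:.2f} mW, LED: {p_led*1e3:.2f} mW")
```

The waste term is under a milliwatt, an order of magnitude below what the LED itself dissipates, which supports the "probably optimizing too much" remark above.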

  • I was actually considering using PWM on the LED to limit the current, without a resistor, like a switching power supply: slightly more efficient, less heat. Isn't the real limit not current but a thermal limit, i.e. the chip can't dissipate the heat? My internal electrical model of a GPIO is a MOSFET, but a poorly characterized one. That's why some suggest using an external MOSFET: it is more fully characterized and you know what you can safely do with it. I'm not sure Fig. 23 corresponds to a simple MOSFET model.
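One caveat with the PWM idea, sketched below with assumed numbers: PWM reduces the average current, but during each "on" interval the pin still carries whatever peak current the circuit sets (taken here as ~13 mA from the estimate in the question). Whether that is acceptable depends on whether the relevant limit is thermal (average) or instantaneous (peak), which this sketch cannot answer.

```python
# PWM scales the *average* LED current, but the *peak* during the "on"
# portion is unchanged. Values are illustrative assumptions from the thread.
peak_ma = 13.0          # assumed peak current with the pin driven high (mA)
target_avg_ma = 2.0     # desired average current (mA)
duty = target_avg_ma / peak_ma
print(f"duty cycle ~{duty*100:.0f}% gives ~{target_avg_ma:.0f} mA average, "
      f"but the pin still sees ~{peak_ma:.0f} mA peaks")
```

So a duty cycle around 15% would hit the 2 mA average, while each pulse still pushes the pin near its rated limit.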

  • I'm not familiar with the magic and physics going on inside the MOSFETs, but what I do know is that we have rated the maximum output current to 15mA and as a Nordic employee I cannot endorse using our SoCs outside of our recommended operating conditions. Doing so will degrade the lifetime of the SoC or even break it.

    I'd advise you to do some current measurements if you go for the first solution (no PWM or resistor), to confirm that the current is actually within limits. Note, though, that I'm pretty sure there can be small variations in Vf from diode to diode, which can cause significant variations in current with this setup. It will also cause significant differences in the brightness of the LED.
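The Vf-variation concern can be roughly quantified with the same crude linear model used earlier in the thread (an assumed ~38 Ω effective GPIO output resistance, i.e. 0.5 V / 13 mA; illustration only, not a characterized value):

```python
# Sensitivity of LED current to Vf spread with no series resistor,
# using a crude linear GPIO model (assumed ~38 ohm output resistance).
R_GPIO = 38.0   # assumed effective GPIO output resistance (ohm)
VDD = 3.6       # supply voltage (V)

def current_ma(vf):
    """Approximate LED current (mA) for a given forward voltage."""
    return max(0.0, (VDD - vf) / R_GPIO) * 1000

for vf in (3.0, 3.1, 3.2, 3.3):
    print(f"Vf = {vf:.1f} V -> ~{current_ma(vf):.1f} mA")
```

Under this model a 0.1 V shift in Vf moves the current by roughly 2.6 mA, i.e. a large fraction of a 2 mA target, which is why diode-to-diode spread translates into visible brightness differences.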

