Sun Feb 25, 2018 3:07 pm

standard "formula" for calculating the voltage limiting resistor for any led powered from any voltage is:

Subtract the forward voltage of the LED from the voltage you are using to power it; this gives you the theoretical voltage over the resistor.

Now apply Ohm's law, R = U/I (many nations use the letter V instead of U, so R = V/I, where V = voltage in volts, R = resistance in ohms, and I = current in amperes).

In your case the supply voltage must be significantly (a few volts at least) larger than the forward voltage of the LED, so a 2.92V LED cannot be reliably powered from 3.3V, and you should use 5V.

The series resistance should then be (if you really want to feed the LED 160mA (0.16A), which is a bit much, but possible):

5.00-2.92 = 2.08 V and I = 0.16A so R would be 2.08 / 0.16 = 13 Ohm

The 13 ohm resistor has to be able to handle a heat dissipation of P = I x V = 0.16 x 2.08 = 0.3328 Watt, so it will have to be quite a large resistor to cope with a third of a watt. I would recommend a resistor rated for at least half a watt.
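The two calculations above (series resistance and resistor power dissipation) can be sketched in a few lines of Python, using the numbers from this example:

```python
# Series resistor and its power dissipation for an LED
# (values taken from the worked example above)
supply_v = 5.00   # supply voltage in volts
led_vf = 2.92     # LED forward voltage in volts
led_i = 0.16      # desired LED current in amperes (160 mA)

v_resistor = supply_v - led_vf   # voltage over the resistor
r = v_resistor / led_i           # Ohm's law: R = V / I
p = v_resistor * led_i           # dissipated power: P = V * I

print(f"R = {r:.0f} ohm, P = {p:.2f} W")  # R = 13 ohm, P = 0.33 W
```

Swap in your own supply voltage, forward voltage, and current; the same two lines of arithmetic apply to any LED.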

Also you will need a transistor to switch the current on/off. That transistor must be able to switch 0.16A, and be controlled by a 3.3V control signal that draws at most, say, 10mA (a guesstimate; I think the maximum for a single GPIO is 16mA).

Its series (base) resistor can be calculated from 3.3V minus the base-emitter forward voltage (0.7V typically), so the voltage over the base resistor is 3.3 - 0.7 = 2.6V, and R = V/I = 2.6 / 0.01 = 260 Ohm (the nearest equal or higher standard value would be 270 Ohm, and yes, 470 Ohm would also be okay).
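The base resistor is the same Ohm's-law step again; a quick sketch with the figures above:

```python
# Base resistor for the switching transistor
# (figures from the post: 3.3V GPIO, 0.7V Vbe, 10 mA base current)
gpio_v = 3.3    # GPIO high-level voltage in volts
vbe = 0.7       # typical base-emitter forward voltage in volts
base_i = 0.01   # chosen base current in amperes (10 mA)

r_base = (gpio_v - vbe) / base_i   # (3.3 - 0.7) / 0.01

print(f"R_base = {r_base:.0f} ohm")  # R_base = 260 ohm
```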