Greetings All!
I have an electrical question that I just can't figure out the answer to.
I have a 5 volt lead coming out of a power supply.
I am trying to power an LED that runs at 20 mA, at 1.7-2.4 volts.
I am trying to drop the 5 volts down to roughly that range. So I went to Radio Shack and bought a pack of 100 ohm, 1/4 watt resistors. I then soldered two of them together in series to give me 200 ohms of resistance. Maybe a bit much, but I will be able to tell real quick if it is going to be too dim or not.
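As a quick worked check of that 200 ohm choice, here is a minimal Python sketch using the supply voltage and the LED's forward-voltage range given above; nothing beyond Ohm's law is assumed:

```python
# Quick sanity check: expected LED current with the 200 ohm series pair
# on the 5 V supply, over the LED's stated 1.7-2.4 V forward-voltage range.
V_SUPPLY = 5.0      # volts, from the post
R_SERIES = 200.0    # ohms, two 100 ohm resistors in series

for v_forward in (1.7, 2.4):
    current_ma = (V_SUPPLY - v_forward) / R_SERIES * 1000
    print(f"Vf = {v_forward} V -> about {current_ma:.1f} mA")
# Prints roughly 16.5 mA and 13.0 mA -- safely under the 20 mA rating.
```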
My problem is this: when measuring the voltage coming out of the supply, I am getting 5.03 volts DC.
When I then connect the wire to the end of the resistor pair, I still get 5.03 volts DC on the other side of the resistors. I am more than slightly confused, and pretty freaking new to electrical work.
Wire - 100 ohm resistor - 100 ohm resistor - test meter = 5.03 volts
Wire - test meter = 5.03 volts
When I set my multimeter to measure resistance, it does indeed show roughly 200 ohms, but when I put it in line with the power, it's not dropping the voltage. Am I just missing something really obvious?
Thanks, guys, for any insight!
OK, after further testing, I am seeing that it is indeed lowering the current flowing out, but not the voltage. Is it critical that I get the voltage down to 1.7-2.4 V, or is it only the current I need to be concerned with? As long as it is within 20 mA, am I golden? (The LED specifies 20 mA.)
Good questions. First, the reason you measured the same voltage on both sides of the resistor is that you had no load on it. The impedance (resistance) of the meter is extremely high, so we can treat it as infinite, meaning it draws *no* current. Now, if we draw no current through a resistor, what is the voltage across it? By Ohm's law the drop across the resistor is V = I × R. We know R, and we assume I (the current draw) is zero, so V = 0 × R = 0: no voltage is dropped across the resistor, and the readings on both sides are equal.

Many people don't grasp that current is needed to drop voltage, so don't feel bad or anything. People often say voltage is dropped on resistors, but they fail to recognize that current is required to do so.
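To make that concrete, here is a minimal Python sketch of the divider formed by the series resistors and the meter. The ~10 megohm meter input impedance is a typical DMM figure and my assumption, not something stated in the thread:

```python
# Why the meter reads the full supply voltage: the series resistors and
# the meter form a voltage divider, and the meter's input impedance is
# enormous compared to 200 ohms. The ~10 Mohm figure is a typical DMM
# value and an assumption, not something measured in the thread.
V_SUPPLY = 5.03     # volts, as measured
R_SERIES = 200.0    # ohms, the two resistors in series
R_METER = 10e6      # ohms, assumed meter input impedance

current = V_SUPPLY / (R_SERIES + R_METER)   # ~0.5 microamps
v_across_resistor = current * R_SERIES      # ~100 microvolts
v_at_meter = V_SUPPLY - v_across_resistor   # ~5.03 V, the full supply

print(f"current through resistors: {current * 1e6:.2f} uA")
print(f"drop across 200 ohms:      {v_across_resistor * 1e6:.0f} uV")
print(f"meter reading:             {v_at_meter:.4f} V")
```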
You have to remember that LEDs are current controlled, NOT voltage controlled. The voltage across a diode stays roughly the same unless you force an extreme amount of current through it (it will blow up first). This is why LEDs need resistors: to limit the current, not to drop voltage levels. Think of the LED as a small battery with a known voltage. Say your power source is at 12 V and the LED has a forward voltage of 2 V; the difference between the two is 10 volts, so if you connect them with just a wire, the current will be:
I = V / R
I = 10 / R, but with just a wire R is extremely low, approaching zero, so I approaches infinity... a great way to burn up an LED.
Now let's rearrange Ohm's law: V = I × R
R = V / I. We know our voltage difference, and we know the current we want (20 mA):
R = 10 / 20E-3 = 500 ohms
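To put the whole calculation in one place, here is a minimal Python sketch of the formula above. The power-dissipation check and the function name are my own additions, and the 2 V forward voltage for the 5 V case is an assumed mid-range value from the 1.7-2.4 V spec:

```python
# The resistor calculation above, wrapped up so both examples can be
# checked. The power-dissipation line (P = V * I across the resistor)
# is an added step; the function name is illustrative, not standard.
def led_resistor(v_supply, v_forward, i_led):
    """Series resistor value and its power dissipation for an LED."""
    r = (v_supply - v_forward) / i_led    # R = V / I
    p = (v_supply - v_forward) * i_led    # power burned in the resistor
    return r, p

# The 12 V example from the post:
r, p = led_resistor(v_supply=12.0, v_forward=2.0, i_led=20e-3)
print(f"R = {r:.0f} ohms, dissipating {p * 1000:.0f} mW")   # 500 ohms, 200 mW

# The original 5 V case, assuming a mid-range 2 V forward voltage:
# (5 - 2) / 0.02 = 150 ohms, so the 200 ohm pair is a safe,
# slightly conservative choice (about 15 mA).
r5, p5 = led_resistor(v_supply=5.0, v_forward=2.0, i_led=20e-3)
print(f"R = {r5:.0f} ohms, dissipating {p5 * 1000:.0f} mW")  # 150 ohms, 60 mW
```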
Hope that helps...
Thank you, you cannot believe how much that helped. Being so new to electrical work, I am catching on, but for the life of me I could not find anything about what you said regarding the concept that current is needed to drop the voltage. I honestly have not found that anywhere, and I have been searching all morning, everything from resistors not dropping voltage to resistor calculators, mostly sitting here going "there must be something I'm just not getting!"
You provide a wonderful fountain of information, and for this I am seriously, honestly grateful!
That actually makes perfect sense too, now that you've explained it!
Thank you!