I just can't win....
Either I write out the assumptions and derivations, or I skip them to minimise criticism - and I prefer the latter. If people (or rather the OP) want to know, they can simply ask. (If they aren't educated enough to understand, or to know their "right to request" more info... that's not MY problem.)
I sound like a politician by replying "I'm glad you asked me that question...". But (1) I'm genuine - especially since you are the OP, & (2) I'm not sure "how glad" (LOL) - will I remember how, or find errors, or reduce my "new" answer to under half the verbiage etc? (Ok, not the latter because I gave the Executive answer. Phew!)
First - the assumptions [and extra (trivial?) info italicised in square brackets]:

Assume our system varies from 12V to 14.4V - the "nominal" system voltage and the normal "maximum" charging voltage of a 12V vehicle. [For design purposes, I use a range of 8V to 16V. Under 12V doesn't worry us in this case, but we could use 16V.]

A 12V system is nominally "rated at" 13.8V - the float voltage of a 12V lead-acid battery (at 25°C). [Ideally: charge at 14.4V till full, then drop to the float voltage of 13.8V.]

12V components are normally "rated" at 13.8V [so is your bulb 6W @ 12V or 6W @ 13.8V?].

Use the worst-case scenario, but be practical [ie, be aware that real component ratings vary (eg, 6W = 5W to 7W), and do a final sanity check in case "reality" might be a lot cheaper].

Ohm's Law & power formulae: V=IR & P=VI [hence P=I²R = V²/R, thus R = V²/P (where V² = V×V = "V-squared", also written VV, V**2 or V^2 etc; same for I²). And I'll use "R" instead of Ω.]
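Those formulae, as a quick illustrative sketch in Python (the function name is my own, just restating R = V²/P):

```python
# Ohm's Law and the power formulae: V = I*R, P = V*I,
# hence P = I*I*R = V*V/R, thus R = V*V/P.

def resistance_from_rating(volts, watts):
    """Resistance implied by a power rating: R = V^2 / P."""
    return volts ** 2 / watts

# A "6W @ 12V" bulb, treated as a fixed resistance:
print(resistance_from_rating(12, 6))  # 24.0 (ohms)
```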

City lights might be dimmed "normal" bulbs or an under-stressed bulb, hence the resistor is in SERIES with the bulb. [Hence it is not 14V across the resistor! If it were, P=V²/R = 14×14/2.4 = 81.666... = 82W. And, being in series, your lamp failed when the resistor blew.]
Now, the calcs.... finally (not)
The bulb could be 24R - ie, 6W @ 12V => 12²/6 = 144/6 = 24R.
Bulb plus series resistor = 24 + 2.4 = 26.4R.
From this we get our first sanity check (aka approximation)...
Because the resistor is 1/10th the bulb resistance, it will dissipate 1/10th the bulb wattage - ie, 0.6W. [2.4R/24R = 0.1; 0.1 × 6W = 0.6W]
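That 1/10th shortcut, sketched in Python (assuming the fixed 24R bulb model above):

```python
r_bulb = 12 ** 2 / 6        # 24.0 ohms - the 6W @ 12V bulb as a fixed resistance
r_series = 2.4              # the added series resistor
ratio = r_series / r_bulb   # ~0.1, ie the resistor is 1/10th the bulb resistance
p_resistor_approx = ratio * 6.0   # ~0.6 W - 1/10th of the bulb's 6W
print(round(ratio, 3), round(p_resistor_approx, 3))  # 0.1 0.6
```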
... and the extra bit & more calcs....
Normally I would leave it there. With topinstaller200's reply of 0.5W I feel sane enough, but would call it 1W. [Resistors being, eg, 1/4W, 1/2W, 1W, 2W, 5W, 10W, 20W.]
[Although the added resistor reduces the lamp current (hence overall current, hence overall power), the current only drops by about 10% (2.4R against 24R). Power goes as current squared though, so the resistor's dissipation is nearer 17% below 0.6W, ie ~0.50W - right on the 0.5W limit => a 1W resistor. Of course the share is actually 1/11th, from 10+1=11 (2.4 in 26.4 is one-eleventh; ie, 1 in (1+10) = 1 in 11; or 2.4+24 = 2.4(1+10) etc).]
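Computing that 12V case exactly rather than via the shortcut (a sketch, same fixed-resistance bulb assumption):

```python
v = 12.0
r_bulb, r_series = 24.0, 2.4
i = v / (r_bulb + r_series)              # 12 / 26.4 ≈ 0.4545 A
p_resistor = i ** 2 * r_series           # ≈ 0.496 W - right on the 0.5W mark
p_total = v ** 2 / (r_bulb + r_series)   # ≈ 5.45 W; the resistor's share is 1/11th
print(round(p_resistor, 3))              # 0.496
```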
But being pedantic, I try other values and methods too....
Let's try 14.5V.
P=VV/R = 14.5x14.5/26.4 = 210.25/26.4 = 7.96W = 8W.
The resistor dissipates 2.4/26.4 of this = 8W/11 = 0.72W
Hence my "hence 26.4R total = 8W @ 14.5V which means 0.7W from the resistor."
Or from the 1:10 approximation, 8W/10 = 0.8W - still above 0.5W and not "borderline" with 1W.
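The 14.5V case, sketched the same way:

```python
v = 14.5
r_total = 24.0 + 2.4                  # 26.4 ohms, bulb plus series resistor
p_total = v ** 2 / r_total            # 210.25 / 26.4 ≈ 7.96 W
p_resistor = p_total * 2.4 / r_total  # the resistor's 1/11th share ≈ 0.72 W
print(round(p_total, 2), round(p_resistor, 2))  # 7.96 0.72
```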
If the bulb were a fixed resistance of 24R, being 6W @ 12V [it isn't - its resistance increases with increasing voltage, hence the "thermal self-limiting" and why bulbs (and many other things) don't blow despite their much lower resistance when cold], that means 0.5A @ 12V [I=P/V = 6W/12V = 0.5A.]
As to my "But if 0.5A bulb current assumed at 13.8V, then resistor wattage is 0.49W - too close to 1W?", I was being conservative and assuming 0.5A @ 13.8V - that 13.8V being the lowest "maximum" voltage that any vehicle should have. (Certainly voltages above a full-battery voltage of up to 12.8V are assumed!)
But that is flawed and a bit complex - I think I did something like "if 12V/6W means 0.5A, and that is the current @ 13.8V"... then dropping the current by 1/10th means 0.45A => 0.45×0.45×2.4 = 0.486W ≈ 0.49W.
My "too close to 1W?" should be "too close to 1/2W?". So there is my d'oh typo.
The above "0.5A @ 13.8V" was also my way of allowing for higher voltages (13.8 + 10%), but a better method is to assume a 6W bulb @ 13.8V => 13.8²/6 = 31.74 = ~32R.
14.4V across (31.74+2.4)R => 0.42A, hence ~0.43W (I²R) from the resistor. Hence a 1/2W resistor.
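That "better method" (taking the 6W rating at 13.8V), sketched:

```python
r_bulb = 13.8 ** 2 / 6           # ≈ 31.74 ohms - 6W rated at 13.8V
r_series = 2.4
i = 14.4 / (r_bulb + r_series)   # ≈ 0.42 A at the 14.4V charging maximum
p_resistor = i ** 2 * r_series   # ≈ 0.43 W => a 1/2W resistor
print(round(r_bulb, 2), round(i, 2), round(p_resistor, 2))  # 31.74 0.42 0.43
```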
There endeth the lesson.
But I'll continue anyhow....
I could have said at the start regarding...
fiat1980spyder wrote:
Watts=Volts squaRED / ohms : 14.00volts squaRED / 2.4ohms=11.6watts Can't be right? |
...isn't right.
[14.00 volts squared / 2.4 ohms is NOT 14×2/2.4 = 28/2.4 = 11.6. It is 14×14/2.4 = 196/2.4 = 81.666 ≈ 82W.
That's err#1.
Err#2 is that the resistor is not in parallel with the bulb (hence seeing the full 14V), but in series. Think logically - how would a resistor (a restriction) alongside a component (parallel to a water sink) reduce the current flow through that component? It wouldn't - the restriction or tap has to be in line!]
But what's the fun in that?
Besides, I want others not to make the same mistakes I have made time & time again. I mean, the error you made. (Sorry, now I am guilty of transference too! But geez, how many times? D'oh!!)
And this way I show how such calcs are done.
And for most experienced people (does the name "topinstaller200" suggest something?), that is the "correct" answer - ie, 0.43W, and that means a 0.5W resistor.
Even though 0.43 is close to 0.5, that is at 14.4V, which is the "theoretical real maximum" that a vehicle should see. And in practice - especially older cars - many don't reach 14.4V even with light loads (no pun).
In summary, the 1/2W answer assumes the normal real-world convention that "12V/6W" means 6W @ 13.8V.
And that is probably correct.
And why choose 1 Watt?
I'd be tempted to use a 1W resistor unless there was a reason not to - size, cost, robustness etc.
You might be able to see if the original was a 1/2 or 1 Watt resistor by its size. (2W and above are usually square instead of round, and white ceramic.)
If it was a 1/2W resistor, then is that why it blew? Then again, it has lasted a long time.
Alas, I too know of big variations in car voltages. And as you see from the "squared" relationship, voltage changes have a BIG effect on power. (Hence, for example, a 20% increase in voltage from 12.0V to 14.4V means a 44% increase in power!) [0.5W/0.43W is 1.16 => a 1.08 increase in voltage (the square root of 1.16) => 1.08 × 14.4 = 15.5V. Remember I said I design for up to 16V (even though some vehicles can exceed that after cranking!)?]
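That headroom check - the voltage at which the 0.43W load reaches the 0.5W rating - sketched:

```python
import math

v_ref, p_ref = 14.4, 0.43   # resistor dissipation from the calc above
p_limit = 0.5               # the resistor's power rating
# P scales with V squared, so V scales with the square root of P:
v_limit = v_ref * math.sqrt(p_limit / p_ref)
print(round(v_limit, 1))    # 15.5 (volts)
```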
And we only have about 10% headroom in our Wattage.
What if the bulb is 6.6W? I think a 10% tolerance on light bulbs is not too conservative.
Besides, with a 1W resistor you could use a 10W bulb (if they are more common etc).
Why the resistor?
I assume the resistor is there to limit inrush current and hence extend lamp life (since 10% external resistance has little effect on brightness (unless halogen)), but maybe not. Maybe it is there to preserve incomes.
If only the resistor failed to short-circuit mode, so your light would continue operating....
The final say?
From experience I know what a pain the so called "constant voltage" 12V system is.
Try designing for 8V to 16V operation - that's a 4:1 power dissipation ratio!
Though that power dissipation range is overcome by modern "constant power" loads, there is still a 2:1 input current ratio from 8V to 16V. Even fusing becomes tricky - a 1,536W input amplifier is 192A/128A/96A at 8V/12V/16V respectively (ie, same 2:1 range as the input voltage). Then add 30%-whatever derating when driving in hot weather, high altitudes, etc.
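Those amplifier figures (the 1,536W input example above), sketched:

```python
p_in = 1536  # W - a "constant power" input load
for v in (8, 12, 16):
    # I = P/V: the input current scales inversely with supply voltage
    print(v, "V ->", p_in / v, "A")   # 192.0, 128.0 and 96.0 A
```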
Even 10 or 12V to 16V is tricky. And how many batteries stay above 12V under high loads?
So, 0.5W resistor for a 0.43W load.....?
But with likely wiring and switching voltage drops etc - quite likely fine. (Not in my 1965 vintage car - it produces a modern 14.4V and has minimal distribution resistance.)
So there endeth the ramble.
Thank you for your question.
Next...?