Ok so I have a spare car amplifier lying around, and I can also get an AC-to-12V DC power converter rated 0-29A. I was wondering if this power converter would be "strong" enough to run the amplifier. I figured with the extra amp lying around I could wire it up to my TV and make a "homemade" surround sound system out of the extra junk I have. I'm not sure how many amps an amplifier requires; to me that's its purpose, to increase amperage, right?
I'm assuming 208W/2 = 104 WPC means you're assuming it's a 2-channel amplifier, correct? If it's a 4-channel amp then it would be 52 WPC, etc.
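For reference, here's a quick back-of-the-envelope sketch of where a ~208W figure and those per-channel numbers could come from. The ~60% class-AB efficiency is my assumption for illustration, not something stated in the thread:

```python
supply_voltage = 12.0   # V, from the converter
supply_current = 29.0   # A, the converter's max rating
efficiency = 0.60       # assumed typical class-AB amplifier efficiency

input_power = supply_voltage * supply_current    # 348 W available from the supply
audio_power = input_power * efficiency           # ~208.8 W of audio output

for channels in (2, 4):
    print(channels, "channels ->", audio_power / channels, "W per channel")
```

That lands at roughly 104 WPC for a 2-channel amp and 52 WPC for a 4-channel one, matching the figures above.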
So assuming a 2-channel amp and the 104 WPC limitation, I wouldn't want to set the amp to put out a constant 100W because of the fluctuations, right? That would be the max output, so I would want to set it somewhere around 80W RMS?
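One way to quantify "somewhere around 80W": backing off about 1 dB from a 104 W ceiling lands close to that. The 1 dB margin below is a value I picked to illustrate the headroom arithmetic, not a recommendation from the thread:

```python
max_wpc = 104.0      # per-channel ceiling from the supply budget
headroom_db = 1.0    # assumed margin below max output

# Power ratio in dB: P2/P1 = 10^(dB/10)
safe_wpc = max_wpc * 10 ** (-headroom_db / 10)
print(round(safe_wpc, 1))   # ~82.6 W, in the ballpark of the 80 W suggestion
```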
Already figured into the calculations. Peak power generally comes from within the amplifier, and I'm still going to stick with the 100 WPC safe range. 20 watts per channel will make zero audible difference anyway.
Unless you are running sine-wave power (but who listens to sine waves, anyway?), there will be no WAY for you to "set the amp to put out a constant 100W". Music doesn't behave like that.
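That point can be made concrete with crest factor (the peak-to-RMS ratio). A sine wave's crest factor is sqrt(2), about 3 dB, while music runs far higher; the 12 dB used below is an assumed illustrative value, not a measurement:

```python
import math

sine_crest_db = 20 * math.log10(math.sqrt(2))   # ~3.01 dB for a pure tone
music_crest_db = 12.0                           # assumed typical for music

# Fraction of peak power the amp actually delivers on average:
avg_fraction = 10 ** (-music_crest_db / 10)
print(round(sine_crest_db, 2), round(avg_fraction, 3))
```

So an amp hitting 100 W peaks on typical music averages only a handful of watts, which is why a "constant 100 W" never happens outside of test tones.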
-------------
It all reminds me of something that Molière once said to Guy de Maupassant at a café in Vienna: "That's nice. You should write it down."
Sorry, I used the wrong wording there. I didn't mean a constant 100W; what I meant is adjusting the gain to make it louder, etc. Wouldn't the amplifier start clipping earlier than it normally would?
Although I guess it doesn't really matter; in the end all I have to do is just adjust the gain like I normally would...