I was wondering if there was any way to use a DMM to get a rough idea of the true wattage an amp is putting out.
For example if I have an amp that states it can put out 200rms @ 4 ohms, is there any way I can use a DMM to verify this?
Yeah - I was going to say the voltage and current must be at the same frequency.
And same phase!
It's almost easier driving resistors in a water tank and measuring the temperature rise.....
So without access to any equipment other than a DMM, and not needing to be exact, 2 doors down on the left will do.
If I take the amp and run a 100 Hz test tone into a known 4 ohm load, could I take V²/4 and get in the general neighborhood?
All I am trying to figure out is if an amp I have is pushing 100 watts or 200 watts or somewhere in the middle.
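That V²/R back-of-the-envelope estimate can be sketched in a few lines (the voltage reading and load value below are hypothetical, and this assumes a steady sine test tone into a purely resistive dummy load with a true-RMS reading):

```python
def estimated_power(v_rms: float, load_ohms: float) -> float:
    """Rough output power for a sine into a resistive load: P = V^2 / R."""
    return v_rms ** 2 / load_ohms

# Hypothetical example: the DMM reads 28.3 V RMS across a 4 ohm dummy load.
# sqrt(200 W * 4 ohm) is about 28.3 V, so this lands right around 200 W.
print(estimated_power(28.3, 4.0))
```

This only tells you the general neighborhood, which is the stated goal; it says nothing about distortion at that level.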
You will have to use a Dummy Load, or have an impedance bridge to determine the impedance of your driver at 100Hz. A 4 ohm driver will not be 4 ohms at all frequencies.
If you have the test tone, measure both voltage & current.
The RMS power will be V×I times the average-to-RMS conversion factor - i.e., if the DMM is average-responding, then Prms = V × I × π²/8 (ignoring phase due to inductance etc).
(You can calc the impedance as V/I)
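A quick sketch of both calculations above, assuming an average-responding meter and a sine test tone (the readings are hypothetical, and phase is ignored as noted):

```python
import math

def power_from_average_readings(v_avg: float, i_avg: float) -> float:
    """Prms = V_avg * I_avg * pi^2/8 for a sine, when the DMM is
    average-responding rather than true-RMS (phase ignored)."""
    return v_avg * i_avg * math.pi ** 2 / 8

def impedance(v: float, i: float) -> float:
    """Load impedance magnitude at the test frequency: Z = V / I."""
    return v / i

# Hypothetical average-responding meter readings at 100 Hz:
v, i = 25.5, 6.3
print(power_from_average_readings(v, i))  # watts
print(impedance(v, i))                    # ohms
```

If the meter is true-RMS, drop the π²/8 factor and just multiply V×I directly.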
Consider this, if you will, before putting all your effort into testing "to get in the general neighborhood": most of what you listen to is powered with under 1 watt. Outputs in the hundreds of watts are so brief, in real life, that you can rig up testing guesswork all day long and never find the true RMS output... which means it doesn't make a bit of difference whether the amplifier reaches 100 watts or pushes out 3 dB more at 200 watts, as long as it is providing what you want it to. If it's a good amp, you will know. If it's junk, you will know.
-------------
Build the box so that it performs well in the worst case scenario and, in return, it will reward you at all times.