Unless you are using some really horrid material as a conductor (i.e. wire), you should not be able to accurately measure any voltage drop at the load connection due to the size of wire used. However, water (current) will flow faster through a 2" pipe than through a straw. Current and voltage ARE DIFFERENT.
I am unsure who "commonly refers" to "power hungry" and "voltage hungry" as the same thing, but I can certainly state they are not engineers, because those are different things.
A load (amplifier + subwoofer) may require more current than a given source (battery + alternator) is capable of supplying, which will cause a voltage drop to occur.
However, I assume by "power hungry" you are referring to the amplifier's "need" for current, and the charging system's inability to provide the required amount, resulting in voltage drops. If so, this happens because the amount of amplification desired exceeds what the vehicle was prepared for: the charging system cannot supply the amplifier with enough current for it to deliver the power the subwoofer draws.
I would assume by "voltage hungry" you are referring to the voltage drop caused by the insufficient source. As the voltage drop increases, the amplifier is deemed more "voltage hungry"?
All of this can be settled by simple Ohm's law and the power equation (funny how many people forget that the principles of electronics were developed MANY years ago and are still, and will remain, applicable in today's devices). Rule to remember... POWER IN = POWER OUT + POWER LOSSES!!

The amount of power OUT from your amplifier is determined by the amount of power (voltage and current) that it is given (the power from the battery and alternator minus the power the car itself uses). Power = Voltage * Current. If you measure the voltage of your battery while the car is running (i.e. 13.8V) and know that your vehicle's alternator is rated at 100 Amperes, the MAXIMUM RMS power output of your amplifier is 13.8V * 100A = 1380W. This value is valid before any other current draws are considered (lights, AC, etc.). If your headlight bulbs are rated at 40W each, that works out to I = 40W / 13.8V ≈ 3A of draw per bulb, and so on with each load in the car. If you can successfully total up your vehicle's average current consumption, subtract it from the current output of the alternator; this leaves you with the reserve, the additional current available to supply additional loads (i.e. 20A of reserve current @ 13.8V gives 276W RMS).
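The power-budget arithmetic above can be sketched in a few lines of Python. The alternator rating and system voltage match the example numbers; the list of accessory loads is made up purely for illustration:

```python
# Hypothetical numbers matching the example: a 13.8 V charging
# system with a 100 A alternator, minus some accessory loads.
SYSTEM_VOLTAGE = 13.8       # volts, measured with the engine running
ALTERNATOR_RATING = 100.0   # amperes

# P = V * I: total power the charging system can supply
total_power = SYSTEM_VOLTAGE * ALTERNATOR_RATING  # 1380 W

# Accessory loads in watts (made-up illustrative values:
# two 40 W headlight bulbs plus other electronics)
accessory_watts = [40, 40, 60, 120]

# I = P / V: total current the accessories draw
accessory_current = sum(w / SYSTEM_VOLTAGE for w in accessory_watts)

# Reserve current (and power) left over for the amplifier
reserve_current = ALTERNATOR_RATING - accessory_current
reserve_power = reserve_current * SYSTEM_VOLTAGE

print(f"Total supply:   {total_power:.0f} W")
print(f"Accessory draw: {accessory_current:.1f} A")
print(f"Reserve:        {reserve_current:.1f} A -> {reserve_power:.0f} W")
```

Note that reserve power is just total supply minus the accessory wattage, so the per-load current conversion is only needed when your loads are specified in amperes rather than watts.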
These values assume 100% efficiency of your amplifiers (i.e. if your vehicle has 20A of reserve at 13.8V and your amp is 87% efficient, you will actually get about 240W RMS of output to your subwoofer) and do not account for peak current supplies (as may be available from your battery).
When you connect five 1200W RMS amps to a car that has 20A of reserve current, you will still only be capable of a combined output of 276W RMS.
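The efficiency point and the "more amps doesn't help" point can be shown together. The 20A reserve, 13.8V, and 87% efficiency figures come from the example above; the five-amp split is just to illustrate that the supply, not the amplifier rating, is the limit:

```python
# Example figures from above: 20 A of reserve at 13.8 V,
# feeding an amplifier that is 87% efficient.
SYSTEM_VOLTAGE = 13.8
RESERVE_CURRENT = 20.0
AMP_EFFICIENCY = 0.87

# Power available at the amplifier's input terminals
input_power = RESERVE_CURRENT * SYSTEM_VOLTAGE   # 276 W

# Power actually delivered to the subwoofer after amplifier losses
output_power = input_power * AMP_EFFICIENCY      # about 240 W

# Five 1200 W-rated amps sharing the same 20 A reserve still
# split the same 276 W of input power between them.
num_amps = 5
per_amp_input = input_power / num_amps           # 55.2 W each

print(f"Input power:  {input_power:.0f} W")
print(f"Output power: {output_power:.0f} W")
print(f"Per-amp input with {num_amps} amps: {per_amp_input:.1f} W")
```

The rated power of the amplifiers never enters the calculation; the source's reserve current is the ceiling.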
ANALOGY: You turn your sink faucet on all the way and it takes 20 seconds for your sink to fill up. If you split the output of the faucet into 5 hoses all draining into the sink, how long will it take the sink to fill? If you answered 20 seconds, you are getting the idea. The combined water flowing out of all the hoses will remain the same because of the constraints on water pressure at the supply and the pipe sizes in the house (i.e. all source issues).
Hopefully someone is paying attention! :-D