All people are unreasonable. But IMO you are not being stubborn.
Quite the opposite - you have provided info which could be useful, i.e. data, facts, etc.
It seems certain your alternator is under-powering, but you need to improve (or check) your cabling first - that alone might solve your problem. With modern cars, a cable upgrade is more likely to solve under-power problems.
And get a voltmeter directly across the battery - not from the cig-socket.
Resistance is Futile. (All puns intended.)
Accept the above.
And read on for attempted explanation/s below if you dare.
But warning - it is 100F and 3AM here.
The voltmeter needs to be across the battery - not other circuits.
There could be voltage drops between the battery and the cig-socket, especially when other loads are turned on if they share some cabling (ie, ground, and any +12V common path).
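To put rough numbers on that shared-path effect, here's a quick sketch (Python; all figures are assumptions for illustration, not measurements from your car):

```python
# Sketch: why a cig-socket reading under-reads the real battery voltage
# when other loads share part of its wiring. Numbers are assumed.

def socket_reading(v_batt, shared_current, shared_resistance):
    """Voltage seen at the socket = battery voltage minus the drop
    caused by OTHER loads' current through the shared path."""
    return v_batt - shared_current * shared_resistance

v_batt = 13.8        # volts at the battery posts (assumed)
r_shared = 0.05      # ohms of shared +12V and ground path (assumed)
for load_amps in (0, 5, 10):   # other loads sharing that path
    print(load_amps, "A ->", round(socket_reading(v_batt, load_amps, r_shared), 2), "V")
```

With just 50 milliohms of shared path and 10A of other loads, the socket reads 0.5V below the battery - which is why the meter belongs across the battery posts.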
Not that a multimeter is suitable as a permanent voltmeter, but it is a very useful piece of equipment. They are usually way under $20 and you can test voltages, resistance & continuity (fuses, bulbs) etc.
Otherwise there are various voltmeters - in-dash analog or digital. I bought a cute in-line unit for accessories (fridges etc) for $20, but butchered it and set it into my dash. I wish it were backlit, but it's fine for now - though I have tape over the last digit; 13.3V is enough digits - no bobble. (See it at ABR-Sidewinder - it's worth it just for the battery "voltage vs charge" diagram at the bottom of the linked page!)
A voltmeter only needs thin wires. They consume stuff all current - even powered meters (like the in-line LCD type linked above) are usually under 10mA - though I do suggest an on-off switch (mine is switched on by an ignition relay).
Your "all-on" voltage drop over time indicates insufficient charging.
This is likely to be an undersized alternator. Or rather, not enough output at the RPM you are doing.
But it could simply be poor cabling.
Traditionally, with resistive loads, good cabling would only overcome the undercharging in borderline cases. (If cable resistance was high, the load got less voltage and hence drew less current, etc.)
But these days due to "constant power loads", the lower the voltage, the higher the current. (eg - assume a 30W HID - that's 2A@15V or 3A@10V). So poor cabling can have a huge effect.
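That 30W HID arithmetic can be sketched in a couple of lines (Python, using the same assumed 30W figure as above):

```python
# A constant-power load draws MORE current as its supply voltage drops,
# so poor cabling makes things worse, not just proportionally worse.

def load_current(power_w, volts):
    """Current drawn by a constant-power load: I = P / V."""
    return power_w / volts

for v in (15.0, 12.0, 10.0):
    i = load_current(30.0, v)          # the 30W HID example from the text
    print(f"{v} V -> {i:.1f} A")       # matches the 2A@15V / 3A@10V figures
```

More current through the same cable resistance means a bigger drop, which means even more current - the opposite of how a plain resistive load behaves.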
Not that the above matters - the first step should always be to minimise cable/path losses.
Opinion:
Optional reading cum ramble follows...
I find it amazing when people know they have bad voltage drops but they get bigger batteries etc.
If that did somehow increase the power to the load, it would also increase the losses.
EG - if originally we had 100 Watts of power lost (heat in the cables) and we somehow delivered 50% more power to the load, we'd then have at least 150W of cable losses - and since cable heating scales with the square of the current, 50% more current actually means 2.25 times the losses, i.e. 225W.
Maybe that's enough to melt the ground cables. (That's a fun sight - the ground straps melt, the alternator loses its battery reference (or there is a big voltage drop across the little remaining engine-to-chassis or battery-negative connection), so the alternator increases its output voltage, and then the dash, audio, DVD, screens etc melt too - or just smoke if they are too cool to melt LOL!)
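The "losses grow faster than the load" point comes straight from P = I²R. A minimal sketch, with assumed numbers (the 0.01 ohm path and 100A are illustrative only):

```python
# Cable heating goes as the SQUARE of the current, so pushing more
# current through the same cables multiplies the losses.

def cable_loss(current_a, r_cable_ohms):
    """Heat dissipated in the cable: P = I^2 * R."""
    return current_a ** 2 * r_cable_ohms

r = 0.01                  # ohms of cable/ground path (assumed)
i0 = 100.0                # amps that give roughly 100W of loss
print(round(cable_loss(i0, r)))        # -> 100 (watts)
print(round(cable_loss(1.5 * i0, r)))  # -> 225 (watts): 2.25x, not 1.5x
```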
But surely - minimise the losses first, THEN upgrade alternators etc. I release my handbrake - I don't get a bigger engine to overcome it.
Maybe skip this too...
I'll occasionally use my (multi) meter to check the voltage between the engine-body/chassis; engine-battery; & battery-body/chassis ground paths (with engine charging; lights on/off etc) as a check on cable quality.
I did the same recently for the +12V path to my aux battery and found a 0.5V drop in a 6" cable section. I promptly replaced that!
And I will have a few extra engine-chassis ground straps in case one breaks, as well as to reduce resistance. Excluding possible "ground loop" issues, there is no reason not to have "excessive" ground strapping. And usually there are plenty of places to run extra engine-to-body straps.
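To see why that 0.5V drop in 6" of cable was worth replacing: a measured drop plus the current through it gives the section's resistance (R = V/I). The 20A current here is an assumed figure, since the original current wasn't stated:

```python
# Infer a cable section's resistance from a measured voltage drop.
# The 20A figure is an assumption for illustration.

def implied_resistance(v_drop, current_a):
    """Ohm's law rearranged: R = V / I."""
    return v_drop / current_a

r = implied_resistance(0.5, 20.0)     # the 0.5V, 6-inch example above
print(f"{r * 1000:.0f} milliohms")    # -> 25 milliohms
```

25 milliohms is enormous for six inches of decent copper cable - a pretty clear sign of corrosion or a bad crimp.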
Alas, I am the one stubbornly blabbing on...
But your battery is discharging.
Get a multimeter to check where the big volt-drops are occurring (alt to battery; alt or battery to lamps or amps; ground straps; etc).
Or compare alternator +12V to engine block/head voltage, across battery terminals voltage, and amp +12V to gnd voltage (with amp running).
I'd want under 1V difference, but for high-current loads, even 2V can be difficult to achieve.
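One sanity check when doing those comparisons: the individual section drops should add up to the total alternator-to-load difference. A small sketch with assumed numbers:

```python
# The drops around the circuit should sum to the total difference
# between alternator output and what the load actually sees.
# All figures below are assumed, for illustration only.

drops = {
    "alt to battery +": 0.3,   # volts (assumed)
    "battery to amp +": 0.4,   # volts (assumed)
    "ground path":      0.3,   # volts (assumed)
}
total = sum(drops.values())
print(round(total, 1))         # -> 1.0 (volts) - right at the 1V target
```

If your measured sections don't add up to the end-to-end difference, you've missed a lossy connection somewhere.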
FYI - assuming a normal lead-acid battery in a temperate climate, charging voltage should NOT exceed 14.4V long term.
13.6V seems low. Most equipment ratings are at 13.8V - a common "minimum" set voltage for alternators.
(And unless it's an alternator with only a charge-lamp output (D+) (excluding of course its standard "heavy" power output B or B+), it's easy to trick the alternator into delivering a higher voltage.)