You're preaching to the teachers...
But I see some things differently - eg, an ammeter tells me the current being discharged but gives no indication of "rate" with respect to the battery's capacity. (The same applies to charging, and is more obvious with end-of-life batteries.)
However, a voltmeter tells me roughly how much of its capacity remains.
Re spare capacity, IMO so what? Even if someone knew their alternator's output at that RPM and deducted their load current (through another ammeter?), what would it tell them - is the spare capacity enough, or not enough?
A voltmeter tells you, and it's simply two fine wires to the battery. There is no need for ammeters with high-current shunts. And no calculations.
Battery recharge current quickly decays in normal use. Even my peaks of 45A (after minutes of cranking) were below 10A within 60 seconds.
Most batteries will not take "all spare capacity" available from the alternator. Exceptions can include high initial charge currents as above, or older or worn alternators at low RPM.
Vehicle system voltages have nothing to do with the SOC of a battery - except of course when discharging beyond the surface charge...
Most systems rise immediately to 14.2-14.4V.
Older systems used 13.8V and - especially with external or electro-mechanical voltage regulators - were more prone to voltage dipping at low RPM.
The whole system is geared to voltage as required by the battery. Voltage generally has the greatest effect on battery lifetime, and it is controlled via simple & fast regulation.
Essentially battery current is irrelevant... or at least it has been considered such throughout automotive history. It is only in recent times that manufacturers have imposed battery charge-current limiting. For traditional and especially AGM batteries that's simply squeezing out an extra percentage of life (the common exceeding of the battery spec's max recharge current was not detrimental enough to impact warranties & probably only shortened battery life by 5% or so), whereas current limiting for modern >200V vehicle battery systems is a different issue altogether.
The only people I know that have justified ammeters are like me - experimenters or designers that need to test or measure systems. But it's only temporary & only used for sizing purposes; they are essentially useless as any sort of performance or problem indicator.
Such meters were designed to measure the hot-side current by default, since dedicated GNDs could not be guaranteed (and were uncommon anyhow).
The historical vehicles with ammeters that I know of measure only the battery current - usually on the hot side due to multiple GNDs. I can't think of any that measure alternator output, except for alternator testers.
They're great for certain discharge measurements - like knowing the discharge current from my (isolated) secondary battery to my 12V fridge. But even then a battery protector (low voltage disconnect) should be installed before worrying about that. (Yet again, voltage sensing does the protection.)
An ammeter will tell you the battery is at 100% SOC when it drops to float current - ie, typically under 2A for automotive batteries.
Otherwise it tells you the current charge rate, so marry that to the battery's recharge tables, or else assume "flat" behavior based on its C10 or C1 rating.
I did the latter after my last alternator failure - albeit back-to-front: I estimated the load (engine & headlights etc - assume 15A); the battery was 38AH (probably at the C1 rate since it was a UPS AGM); I calculated a 2 hour reserve if it was at 100% of rated capacity (after all, it was only 13 years old!).
One hour to get home, so I should get there with half my reserve to spare.
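The reserve estimate above is just back-of-envelope arithmetic; here it is as a sketch (the 15A load and 38AH rating are the assumptions stated above, and I've derated the on-paper figure to the 2 hours I actually used):

```python
# Back-of-envelope reserve estimate (figures from the trip above).
load_amps = 15.0      # engine + headlights etc - an estimate, not measured
capacity_ah = 38.0    # UPS AGM, probably rated at about the C1 rate

reserve_hours = capacity_ah / load_amps
print(round(reserve_hours, 1))  # 2.5 - on paper; call it 2 hours derated
```

The derating is a judgment call: a 13-year-old AGM at roughly a 0.4C discharge rate won't deliver its full rated AH.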
Assume 0.1V drop per 10% SOC (conservative).
IGN on & crank and watch the (battery=system) voltage drop to ~12.2V.
So 12.2V is my "100% SOC" voltage reference. It should be 11.7V in 1 hour...
I considered the voltage stable after a few minutes... good - no evidence of a weak or dead battery... yet!
The rest of the trip saw the voltage drop as hoped. (Battery voltage was ~12.2V the next morning confirming a discharge of ~0.5V.)
An ammeter would have told me (accurately) what my load was but nothing about the battery's real SOC nor "crash imminent" rapid voltage decay.
Alas, that segued...
Resolution tends to be an ammeter issue. On the older common 30A or 60A ammeters it was difficult to judge float currents. (And what if it's 2A but at 15.5V?)
Newer electronic displays tend to overcome that if they have enough segments.
Voltage, by contrast, is simply a 3-digit display covering at least 8V-16V.
When applied to batteries, ammeters are merely "predictive". IE if the battery is within specs (& at 25C or 50C etc) then that current for 1 minute equates to c% of capacity recharge.
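That "predictive" arithmetic is just charge returned over an interval as a fraction of rated capacity. A sketch, with illustrative numbers (the 10A and 38AH figures are examples, not from any spec sheet, and this ignores charge efficiency losses):

```python
# Sketch of the "predictive" ammeter arithmetic: amp-hours returned
# over an interval, expressed as a percentage of rated capacity.
# Ignores charge (coulombic) efficiency, so it overstates real recharge.

def recharge_percent(current_a, minutes, capacity_ah):
    ah_returned = current_a * (minutes / 60.0)
    return 100.0 * ah_returned / capacity_ah

print(round(recharge_percent(10.0, 60, 38.0), 1))  # 26.3 - % per hour at 10A
```

Which is exactly why it's only predictive: it assumes the battery is in-spec and accepting that current usefully, which the ammeter itself can't confirm.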
The physical test is still the cell voltage - unless a hydrometer can be used.
For those that like the lights, or already have ammeter/s, or have a specific reason, ammeters are fine. Maybe use one to measure the HU or amp load, or lights(?!)...? (I'd consider that - or a fridge - if I had a 2-in-1 meter, but otherwise I'd probably not use the ammeter - not with a shunt.)
But to fit an ammeter to a non-metered vehicle, or to think you are somehow getting valuable information from it...? Make sure you get a voltmeter first!
Else do like me - use DC current transducers; just loop the cable - no need to cut & join. And they're zero-Ohm to boot! But use the split DC supply versions for meters.
But let's see if pityocamptes has anything to reply.