Using Amp hours as a measure of energy is what makes things confusing.
Amps are a measure of current.
Watts are a measure of power, not energy - watt hours are the actual measure of energy.
With lead acid/AGM batteries, the Ah capacity is directly related to the rate of discharge.
A battery rated at 100Ah @ 5A (0.05C) discharge delivers only 82Ah @ 20A (0.20C) discharge.
Keep in mind that the LA/AGM battery should not be discharged over 50% if you want the battery to last - so you are really talking 50Ah @ 5A (600Wh) or 41Ah @ 20A (492Wh). Also, at 50% discharge, the voltage will be down to 12-12.2V.
Ref Trojan 31-AGM
https://www.trojanbattery.com/pdf/AG...tLineSheet.pdf
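The rate-dependent numbers above can be worked through with a few lines of Python - a minimal sketch, using the 100Ah/82Ah ratings from the Trojan sheet and the 50% discharge rule of thumb:

```python
# Usable energy from a lead acid / AGM battery at different discharge rates.
# Capacity ratings (100Ah @ 5A, 82Ah @ 20A) are from the Trojan 31-AGM sheet;
# the 50% depth-of-discharge limit is the longevity rule of thumb above.
NOMINAL_V = 12.0   # LA/AGM nominal voltage
MAX_DOD = 0.50     # don't discharge past 50%

def usable_wh(rated_ah: float) -> float:
    """Usable watt hours, given the Ah rating at a specific discharge current."""
    return rated_ah * MAX_DOD * NOMINAL_V

print(usable_wh(100))  # 5A (0.05C) rating -> 600.0 Wh
print(usable_wh(82))   # 20A (0.20C) rating -> 492.0 Wh
```

Same battery, same state of charge - pull the energy out four times faster and you get about 18% less of it.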
With a lithium battery, the discharge voltage stays above 12.8V through most of the discharge cycle, well past 50%.
The link below has charts showing voltage under a 25A (0.25C) load.
As a result, the lithium battery has more watts available for a longer period. The discharge graph shows that it does not hit 12V until about 90% discharged - then it drops quickly.
https://www.solacity.com/how-to-keep...tteries-happy/
Point being, a LA/AGM amp hour rating is a lot of smoke and mirrors. As uncle_bob said - "Simply to be utterly confusing ....."
It does not have to be so.
amp hours x nominal voltage = watt hours
A lead acid / AGM battery is 12V nominal. A lithium battery is 12.8V nominal.
So a 100Ah battery is:
LA/AGM: 100Ah x 12V = 1200Wh (50% discharge = 600Wh usable)
LFP: 100Ah x 12.8V = 1280Wh (80% discharge = 1024Wh usable) - most LFP can be safely discharged more.
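Those two lines can be reproduced with the amp hours x nominal voltage formula - a quick sketch, where the 50% and 80% usable fractions are the rules of thumb stated above:

```python
def usable_wh(capacity_ah: float, nominal_v: float, usable_fraction: float) -> float:
    """Energy (Wh) = Ah x nominal V, scaled by the safe depth of discharge."""
    return capacity_ah * nominal_v * usable_fraction

la_agm = usable_wh(100, 12.0, 0.50)  # lead acid / AGM, 50% rule
lfp = usable_wh(100, 12.8, 0.80)     # lithium (LFP), 80% rule

print(la_agm)  # 600.0 Wh usable
print(lfp)     # 1024.0 Wh usable
```

Same "100Ah" label on the box, but the lithium battery delivers roughly 70% more usable energy.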
There is more usable energy in the lithium battery than the LA/AGM simply because it maintains a higher voltage throughout the discharge cycle. It can be mostly discharged, and can remain partially discharged indefinitely, without damage to the battery. And because lithium is about 99% energy efficient, virtually every watt hour that goes in is stored for future use.