In another discussion, charge efficiency came up in the context of programming battery monitor settings for best accuracy. I strongly disagreed with a linked article that said you should program a fixed charge efficiency based on the battery manufacturer's figure. My reasoning is that charge efficiency is not constant over a recharge; it goes from very high to very low during a single charge, so depending on how deep the discharge was, you get a different average charge efficiency every time you charge. This is where the inaccuracy in the monitor's SOC reading comes in on the recharge side, and it is why getting a regular full charge on the batteries is so important: a full charge recalibrates the 100% SOC point and zeroes the amps in/out.
That said, this is more about how you can use the knowledge of how the charge efficiency varies to your own benefit by making some relatively easy changes in power use at charging.
Here is a tech paper I dug out of an old discussion on this forum that describes testing done on wet cells to see how much the charge efficiency changes over a single charge cycle.
This is targeted toward solar use, but also does a good job of showing CE variations. Some quotes from the conclusions section:
These tests indicate that from zero SOC to 84% SOC the average overall battery charging efficiency is 91%, and that the incremental battery charging efficiency from 79% to 84% is only 55%.
Charge efficiencies at 90% SOC and greater were measured at less than 50% for the battery tested here, requiring a PV array that supplies more than twice the energy that the load consumes for a full recovery charge.
I will also point out that the acceptance (amps taken) of the batteries also drops dramatically above 90% SOC, so the last 10% can take as long to charge as the previous 90% did.
If your batteries are below 70-80% SOC, the charge efficiency is most likely going to be in the 90% range, so only a little can be saved by charging choices. But if the batteries are in the 90%+ SOC range, where the CE is under 50%, you can save half of the recharge amp hours and time used to recover the energy. As the article mentions, if you have to supply twice as much energy as you used, you may not have enough solar capacity and time to do it, and you net out a loss for a portion of that power used.
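To put numbers on that, here is a quick back-of-the-envelope sketch. The CE figures come from the paper quoted above; the 20 Ah load is just an illustrative number:

```python
# Rough illustration of how charge efficiency (CE) changes the amp-hours
# a charge source must deliver to replace what the loads consumed.
# CE values are from the quoted test paper; the 20 Ah load is made up.

def recharge_ah(used_ah, charge_efficiency):
    """Amp-hours the charger must supply to return used_ah to the battery."""
    return used_ah / charge_efficiency

used = 20.0  # Ah drawn from the battery (illustrative)

low_soc = recharge_ah(used, 0.91)   # below ~80% SOC, CE around 91%
high_soc = recharge_ah(used, 0.50)  # above ~90% SOC, CE under 50%

print(f"Below ~80% SOC: supply {low_soc:.1f} Ah to replace {used:.0f} Ah")
print(f"Above ~90% SOC: supply {high_soc:.1f} Ah to replace {used:.0f} Ah")
```

Same 20 Ah used, but the charger has to make roughly 22 Ah in one case and 40 Ah in the other.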
Many people already do all, most, or some of these tweaks, mostly to save generator run time, so this is certainly not revolutionary stuff, but an understanding of why and how it happens may allow even better results.
Probably the most common and useful thing to do involves high-draw, short-duration loads like microwave runs or hair dryers. You will use less total power, and likely less charge time, if you run the generator or van engine while the power is being used rather than running off battery power and recharging afterward. If your batteries are under 80% or so, you would save about 10% in energy, and probably even more than that in charging time, depending on battery acceptance. If the batteries are above 80%, you will need twice the energy replaced, and even more than double the charge time to do it, due to low acceptance. What is surprising, I think, is just how inefficient it gets to use battery power and then replace it when the batteries are in the higher SOC range where many of us prefer to be, since staying there makes it easier to get to a full recharge to help battery life and monitor accuracy.
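As a rough sketch of that microwave example: all the numbers here (wattage, runtime, battery voltage, inverter efficiency, CE values) are illustrative assumptions, not measurements.

```python
# Sketch of the "run the generator while the microwave runs" point:
# what a 10-minute 1500 W load costs if it comes off the battery instead.
# All numbers below are illustrative assumptions.

WATTS = 1500.0        # microwave draw (assumed)
MINUTES = 10.0
VOLTS = 12.5          # nominal battery voltage (assumed)
INVERTER_EFF = 0.85   # typical inverter efficiency (assumed)

# Ah pulled from the battery if the load runs on the inverter
ah_used = WATTS * (MINUTES / 60.0) / (VOLTS * INVERTER_EFF)

# Ah the charge source must then produce, by SOC band (CE per quoted paper)
for soc_band, ce in [("below ~80% SOC", 0.91), ("above ~90% SOC", 0.50)]:
    print(f"{soc_band}: {ah_used:.1f} Ah used -> {ah_used / ce:.1f} Ah to recharge")
```

Running the generator for those same 10 minutes instead skips both the inverter loss and the recharge penalty entirely.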
Another thing that some or many do is choose charging methods to suit the situation. For instance, if you need full batteries by nighttime and don't have enough solar to get them there by itself, run the generator or van charging as early in the charge cycle as possible, when the CE and acceptance are highest, to minimize the needed run time. Once you have made up the capacity that the solar lacks, shut the generator or van off and let the solar do the lower-efficiency, lower-acceptance part of the cycle, since it will have much lower output anyway in almost all cases.
If you are driving that day and want to get to full, it is best, if practical, to get the drive in early so the SOC and acceptance reach a level that gives the solar time to finish a full charge, even though that finish happens at low CE and acceptance. That way you got high CE out of the large-output charge source (driving) and didn't waste solar time charging when the batteries couldn't take all the amps it could provide.
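The "big charger early, solar for the top" split can be sketched with a toy model. The CE and acceptance numbers per SOC band are illustrative assumptions loosely based on the quoted paper; real batteries and chargers vary:

```python
# Toy model of "run the big charge source early, let solar do the top."
# Per-band CE and acceptance figures are illustrative assumptions.

# (soc_band_top, charge_efficiency, max_acceptance_amps)
BANDS = [(0.80, 0.91, 100.0),   # bulk: high CE, battery accepts big current
         (0.90, 0.55, 40.0),    # CE and acceptance falling off
         (1.00, 0.45, 15.0)]    # topping: low CE, low acceptance

def hours_to_charge(capacity_ah, start_soc, end_soc, charger_amps):
    """Hours a charger needs to move the battery from start_soc to end_soc."""
    hours = 0.0
    soc = start_soc
    for band_top, ce, accept in BANDS:
        if soc >= band_top or soc >= end_soc:
            continue
        top = min(band_top, end_soc)
        ah_into_battery = (top - soc) * capacity_ah
        amps = min(charger_amps, accept)        # battery limits current at high SOC
        hours += (ah_into_battery / ce) / amps  # CE inflates Ah the charger must make
        soc = top
    return hours

# 200 Ah bank: generator/engine does the high-CE bulk, solar grinds out the top
gen = hours_to_charge(200, 0.50, 0.80, 100)   # 50% -> 80% on a 100 A charger
solar = hours_to_charge(200, 0.80, 1.00, 15)  # 80% -> 100% on 15 A of solar
print(f"Generator (50->80%): {gen:.1f} h, solar (80->100%): {solar:.1f} h")
```

The point the model illustrates: the generator's run is short because it works where both CE and acceptance are high, while the slow, inefficient top-off costs only sunshine.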
Boats tend to take some of this to a bit of an extreme, which is hard on batteries for a lot of them, but necessary to limit generator and/or engine running and fuel use while still having enough electrical power. Many boats cycle their batteries only in the high-CE, high-acceptance range of 20-80% SOC, so they get the most recovery of SOC per unit of charging-source energy that they can. The operators know it is hard on the batteries and will shorten their life considerably, but that is better than the alternatives for them.
As mentioned, nothing revolutionary here for many campers, but I have been surprised how many times I have been told by folks in campgrounds that they run their generator just before bedtime to "top off" the batteries. If they really are topping off the batteries, they are at the single most inefficient, slowest point of the charge cycle. As long as they would have enough power for the night, it would seem much better to run that generator in the morning while other power (cooking, TV, charging gadgets) is being used, both to keep that power from coming out of the batteries and to do their "topping" at the same time. Of course, the term could mean something else to them, and they may really be at a lower SOC and charging efficiently.
The lithium/AGM hybrid systems that are now showing up here take all this to a new level by using a fast-charging lithium battery to harvest power quickly from the generator or van engine, and then using that battery to top off the slow, low-CE portion of the AGM recharge. They appear to reduce generator run time a whole bunch compared to an AGM-only setup.