As for another optimizer option, a "center linearity plot around 0dB" mode would be nice to fine-trim the gain factor at resolutions where the normal RMS minimizers no longer find an improvement. It seems perfectly suited for automation, given the systematic way one works this out manually, as described in the "dB for gain change values" topic mentioned previously.
Maybe with a parameter specifying at which bit level the plot should intersect (be fitted to) the 0dB line, plus an "auto" setting where a valid bit-range "window" is applied at the bit depth where the data density is highest (or something like that). A sketch of what I mean follows.
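Purely for illustration, here is a minimal sketch of how such a trim could be computed, assuming the linearity plot is available as arrays of bit levels and dB deviations. All names here are hypothetical, not the tool's actual API:

```python
import numpy as np

def auto_window(bit_levels, width=4.0, bins=64):
    """Pick the `width`-bit window where the plot has the most data points."""
    counts, edges = np.histogram(bit_levels, bins=bins)
    span = max(1, int(round(width / (edges[1] - edges[0]))))
    sums = np.convolve(counts, np.ones(span), mode="valid")  # sliding-window sums
    i = int(np.argmax(sums))
    return edges[i], edges[i + span]

def center_gain_trim(bit_levels, deviation_db, window=None):
    """Gain trim (dB) that centers the linearity curve on 0dB inside `window`.

    bit_levels   -- x-axis of the linearity plot, in bits
    deviation_db -- y-axis: deviation from perfect linearity, in dB
    window       -- (lo, hi) bit levels to fit to, or None for "auto"
    """
    if window is None:
        window = auto_window(bit_levels)
    lo, hi = window
    mask = (bit_levels >= lo) & (bit_levels <= hi)
    # Shift the gain so the mean deviation inside the window becomes zero.
    return -float(np.mean(deviation_db[mask]))
```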
EDIT:
The scenario:
"Match" arrived at a gain factor but I want, for example, the curve fit to 0dB at the 14...18dB bit levels, so about a gain change of -0.00009dB.
The reasoning: when examining static distortion and small amounts of dynamic compression/expansion, it is often best to align the gains so they match at the quiet, low-level section(s) of the (broad-band) signal, where ill effects will usually be small (unless dealing with heavy crossover distortion etc.).
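In terms of the sketch above (again, with hypothetical names; `match_gain_db` stands for whatever gain "Match" arrived at), that scenario would be the manual-window case:

```python
# Fit the linearity curve to 0dB over the 14...18 bit levels
trim_db = center_gain_trim(bit_levels, deviation_db, window=(14.0, 18.0))
corrected_gain_db = match_gain_db + trim_db  # e.g. a trim on the order of -0.00009dB
```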