Clock drift is not an issue, as the DAC and ADC are in a single device fed from the same independent internal master clock (the interface runs USB audio in asynchronous isochronous mode, so it is free-running rather than slaved to the host).
Gain drift is, and block-averaging will help reduce it (though @pkane might implement correction of slow gain drift in DeltaWave; he might also be able to undo simple -- and, in this case, microscopic -- linear transfer-function differences caused by different cable capacitances and different time-of-flight values).
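To illustrate what such a slow-gain-drift correction could look like, here is a minimal sketch (my own, not DeltaWave's actual algorithm): estimate a least-squares gain per block between reference and capture, interpolate it to per-sample resolution, and divide it out. The function name, block size, and signal names are assumptions for illustration.

```python
import numpy as np

def correct_gain_drift(ref, cap, block=65536):
    """Divide out a slowly varying gain, estimated block by block.

    Illustrative sketch only -- assumes ref and cap are already
    time-aligned and that the gain varies slowly versus the block size.
    """
    n = min(len(ref), len(cap))
    gains = []
    for i in range(0, n - block + 1, block):
        r = ref[i:i + block]
        c = cap[i:i + block]
        # Least-squares gain of this block: g = <r, c> / <r, r>
        gains.append(np.dot(r, c) / np.dot(r, r))
    gains = np.array(gains)
    # Interpolate block gains back to per-sample resolution
    centers = np.arange(len(gains)) * block + block / 2
    g = np.interp(np.arange(n), centers, gains)
    return cap[:n] / g
```

With a long enough capture, this removes a linear gain ramp almost entirely except near the edges, where the block-gain estimate is extrapolated as a constant.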
Heavy block-averaging has its own problems, though. It also reduces loosely correlated random/stochastic effects -- not as strongly as uncorrelated noise, but still significantly -- so these can go unnoticed unless we also inspect the noise distribution, sample by sample. An example is the excess current noise of some resistor types: it is best isolated by inspecting the noise distribution in the residual (compared against an equal-value but quieter resistor), where one finds the noise modulated by the momentary signal level.
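The averaging effect is easy to demonstrate numerically: coherently averaging N repeated captures shrinks uncorrelated noise by roughly sqrt(N), while anything correlated across captures survives in full. The capture count, sample count, and noise level below are arbitrary illustration values.

```python
import numpy as np

rng = np.random.default_rng(0)
n_caps, n_samp = 64, 4096
signal = np.sin(2 * np.pi * np.arange(n_samp) / 128)

# Each capture: the same signal plus independent (uncorrelated) noise
captures = signal + 0.1 * rng.standard_normal((n_caps, n_samp))
avg = captures.mean(axis=0)

resid_single = np.std(captures[0] - signal)  # ~0.1 (one capture)
resid_avg = np.std(avg - signal)             # ~0.1 / sqrt(64)
```

A signal-correlated noise component (like the resistor excess noise above) would not shrink this way, which is exactly why the residual's noise distribution has to be inspected sample by sample rather than only after heavy averaging.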
I used DeltaWave a while back to analyze a few balanced interconnects I had on hand. There were measurable null differences, although at -100 dB or lower:
https://www.audiosciencereview.com/...es-make-a-difference-a-null-test-result.7738/
DeltaWave has progressed since then: variable group delay and frequency-response errors can now be measured and corrected. I've just added a more perceptually weighted metric to DeltaWave that should help determine the audibility of the difference. I would guess that anything with an RMS null of -100 dB is extremely unlikely to be audible under normal listening conditions. Since my audio interface (Apogee Element24) isn't as low in noise and distortion as the RME, it would be interesting to see your tests and to run the captures through the latest version of DW.
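For reference, an RMS null figure of the kind quoted above can be computed as the RMS of the difference relative to the RMS of the reference. This is a bare-bones sketch of that idea, not DeltaWave's actual metric (which includes alignment, drift correction, and weighting):

```python
import numpy as np

def rms_null_db(ref, cap):
    """RMS of the difference, in dB relative to the RMS of the reference.

    Assumes ref and cap are already time- and gain-aligned.
    """
    diff = cap - ref
    return 20 * np.log10(np.sqrt(np.mean(diff ** 2)) /
                         np.sqrt(np.mean(ref ** 2)))

# Illustrative check: a capture differing only by noise 100 dB down
rng = np.random.default_rng(1)
ref = np.sin(2 * np.pi * np.arange(48000) / 100)
cap = ref + np.sqrt(np.mean(ref ** 2)) * 1e-5 * rng.standard_normal(48000)
null = rms_null_db(ref, cap)  # ~ -100 dB
```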