This looks like a controversial experiment... Are you sure this is Motu's problem? Core Audio is a bit sloppy (((
It doesn't look like a limitation of the measurement setup; in theory it correctly reflects the difference between the Mac clock and the Motu clock.
REW seems to apply some filtering to the calculation, so the indicated fluctuation could be smoother than the real one, but the fact that it fluctuates remains (I am writing a C script to measure this outside of REW, though).
I checked the BlackHole source code, and it uses mach_absolute_time, which is the most accurate timer on macOS. Furthermore, it does not seem to make any dynamic adjustments when requesting samples. In any case, even if it did, the fluctuation would also appear with the built-in speaker, but it doesn't.
On the other hand, how Motu handles USB timing is unknown. It is not a given that it corresponds directly to the clock of the AD/DA stages.
But if it does not, the Core Audio clock drift correction principle would be slightly flawed. Or rather, it would not be advisable to select the MK5 as clock source in that case.
We will clearly not hear a few ppm of drift in an audio signal; the jitter audibility threshold is supposed to be around 20 ns (though it depends on the jitter type), and here I get about 0.01 ns if I calculate correctly... so that's irrelevant.
But technically it leaves me perplexed.
I can't determine which configuration is the most reliable this way... I would need a good ADC like the Cosmos (though I don't know its phase noise) or an oscilloscope to verify with greater certainty. But it seems too strange to me that the Mac's built-in speaker clock is more stable than the Motu's.
Maybe @mdsimon2 can do a check?