Let’s be a bit more careful about how we frame this discussion. I was at pains to say neither was “better”. The claim was simply that digital was higher fidelity, i.e. produces lower noise and distortion.
And I said it depends on what goes on in your digital and analog paths and how exactly those are implemented. It is not possible to make blanket statements like "digital has higher fidelity".
Since you are making the extraordinary claim that analogue is capable of higher fidelity (is that actually what you’re suggesting btw? I’m still not 100% sure), I think it is you who should provide the supporting evidence.
I fail to see anything extraordinary in it, since neither one is defined in any way. For sure, any semi-decent $100 analog mixing console has better fidelity than the 8/16-bit fixed-point mixing done in trackers, for example.
Do a normalized, equal-level mix of four 16-bit samples in integer arithmetic and you have lost two bits of precision: the sum of four 16-bit values needs 18 bits, so scaling each channel by 1/4 to keep the result in range discards its two least significant bits.
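The arithmetic can be sketched in a few lines of Python (the sample values are arbitrary, chosen only to make the rounding visible):

```python
# Equal-level integer mix of four 16-bit samples, next to a float reference.
# Scaling each channel by 1/4 (>> 2) keeps the sum in 16-bit range but
# truncates the two least significant bits of every channel.

def mix4_int16(samples):
    """Normalized equal-level mix in pure integer arithmetic."""
    return sum(s >> 2 for s in samples)

def mix4_float(samples):
    """The same mix carried out in floating point."""
    return sum(s / 4.0 for s in samples)

samples = [12345, -23456, 7, 32767]          # arbitrary 16-bit sample values
error = mix4_float(samples) - mix4_int16(samples)
# error is the quantization noise the integer path adds to this one output sample
```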
However, I will take up your challenge when I have some time, and use a DAW (Ableton Live, which, BTW, is not especially renowned for its audio quality) to mix together 32 separate channels of discrete tones and see what noise and distortion the mixdown produces.
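As a rough numeric stand-in for that experiment (pure Python, not Ableton; the tone frequencies, sample rate, and buffer length are illustrative assumptions), one can mix 32 tones at float32 precision and compare against a float64 reference:

```python
import math
import struct

# Mix 32 discrete tones at float32 precision and compare the result against
# a float64 reference mix of the same tones.

SR = 48000
N = 2048
freqs = [220.0 * (i + 1) for i in range(32)]   # 32 harmonically spaced tones

def f32(x):
    """Round a Python float (float64) to float32 precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

err = 0.0
for n in range(N):
    t = n / SR
    acc64 = 0.0   # float64 reference mix
    acc32 = 0.0   # mix accumulated at float32 precision
    for f in freqs:
        s = math.sin(2 * math.pi * f * t) / 32   # equal-level, normalized
        acc64 += s
        acc32 = f32(acc32 + f32(s))
    err = max(err, abs(acc64 - acc32))
# err stays well below one 16-bit quantization step (1/32768, about 3e-5),
# which is the sense in which a plain float mixdown is essentially transparent
```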
Again, mixing is an extremely simple DSP process and doesn't demonstrate the potential problems that appear in the more complex algorithms and scenarios you encounter in real-world cases. You need to start playing with typical production tools like IIR EQs, reverb FIRs, compressors and such, put a bunch of those in a chain on every one of those channels, and then mix the final result.
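To see why recursive processing is harder on fixed point than a plain mix, here is a hedged sketch comparing a one-pole IIR lowpass run in Q15 fixed point against float64. The coefficient, test signal, and truncating requantization are illustrative assumptions, not any particular product's implementation:

```python
import math

# A one-pole IIR lowpass, y[n] = a*x[n] + (1 - a)*y[n-1], run once in float64
# and once in Q15 fixed point with a truncating shift after each multiply-
# accumulate.  The recursion feeds every rounding error back into the state.

Q = 15                      # Q15: 1.0 is represented as 1 << 15
a = 0.01                    # narrow lowpass; small coefficients stress Q15
a_q = round(a * (1 << Q))   # quantized coefficient
b_q = (1 << Q) - a_q        # quantized (1 - a)

def lp_float(x):
    y, out = 0.0, []
    for s in x:
        y = a * s + (1 - a) * y
        out.append(y)
    return out

def lp_q15(x):
    y, out = 0, []
    for s in x:
        y = (a_q * s + b_q * y) >> Q   # Q30 product truncated back to Q15
        out.append(y)
    return out

x_f = [math.sin(2 * math.pi * 100 * n / 48000) for n in range(4800)]
x_q = [round(s * (1 << Q)) for s in x_f]

yf = lp_float(x_f)
yq = [v / (1 << Q) for v in lp_q15(x_q)]
err = max(abs(p - q) for p, q in zip(yf, yq))
# err here exceeds a full 16-bit LSB: one cheap recursive filter already costs
# more precision than a simple normalized mix
```

Chain a handful of such stages per channel, as a real production would, and those errors compound before the mixdown ever happens.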
In addition, you typically have at least two chained stages with something like a 24-bit intermediate pipeline: the recording/mixing stage (from a desk) feeding the mastering stage (possibly another desk or a DAW).
However, software DAWs running on computers also likely use higher resolution than digital mixing desks and other digital hardware products.