I wish someone would explain / rationalize / pontificate on how a digital (USB) stream can have jitter. It doesn't make sense at all. An analog music stream coming off a DAC might present jitter, but it seems the USB stream is either intact or not (and if it isn't intact, the digital data is resent to the DAC).
I listened to a certain YouTuber named Hans X who goes on and on about jitter and cheap power supplies. I bought an expensive switched power supply ($90) for a Pi streamer, and my ears can't hear one iota of difference between that and the same streamer with a $5 supply.
Perhaps equipping my streamer with $500 virgin-copper Mystery RCA cables would make this jitter more apparent..... /s
Short version: if the DAC sounds or measures significantly different with any in-spec source or power supply, it's an indication that the designer hasn't done a good job, and I'd be looking for a better-designed DAC, not a change of source or power supply. Note that 'sounds' means with the usual controls to avoid your mind playing tricks.
Jitter is variation in the timing of the sample data. If it is still present when the samples are converted from digital to analog, there will be 'jitter artifacts' in the analog output. The jitter tests here look for these artifacts in the output of the DAC when it is fed a specific test signal.

With S/PDIF and Toslink the source defines the clock rate, and data is sent in a continuous sequence with a greater or lesser amount of jitter. In this case a good DAC will use various processing methods to remove the jitter before conversion.

USB is somewhat different: the data is split into chunks which are sent to the DAC at intervals and buffered, then clocked into the converter at a rate generated inside the DAC. Any jitter here is created inside the DAC itself.

Most DACs tested here show good jitter rejection on all inputs, so they're pretty much unaffected by the quality of the source so long as it's within spec. I think Archimago did some tests at the DAC output with a variety of sources to demonstrate this. Every so often there's one that measures badly on one or all inputs, and those may be audibly different depending on the source.
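To make "timing error becomes analog error" concrete, here's a toy Python sketch. It is not any DAC's actual processing, and the 1 ns RMS jitter figure and 12 kHz tone are purely illustrative assumptions: it evaluates a sine at ideal sample instants and at jittered instants, and the difference is the kind of noise/sidebands the jitter tests are looking for.

```python
import math
import random

fs = 48_000      # sample rate, Hz
f = 12_000       # test tone, Hz (assumed; jitter error grows with frequency)
sigma = 1e-9     # 1 ns RMS of random timing jitter (assumed figure)
N = 8192

random.seed(0)
acc = 0.0
for k in range(N):
    t = k / fs
    ideal = math.sin(2 * math.pi * f * t)
    # Same signal, but sampled slightly early or late each time
    jittered = math.sin(2 * math.pi * f * (t + random.gauss(0, sigma)))
    acc += (jittered - ideal) ** 2
rms_err = math.sqrt(acc / N)

print(f"RMS error from 1 ns of jitter: {rms_err:.2e}")
```

The error scales roughly as 2·pi·f·sigma, which is why jitter tests use a high-frequency tone and why even nanosecond-level timing errors are measurable, if usually far below audibility.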
Power supplies are a separate issue entirely. Most desktop DACs have multiple regulators and filters internally. If they're fed from a single supply (USB or barrel connector), then one of the internal regulators is probably a switching one, used to generate a negative voltage rail for the output stage. For the most part, if they've been well designed, they'll measure the same with the power supply they came with or any other decent power supply. Designers are well aware of the noise on USB power rails, and should design accordingly. Archimago did some testing of noise on various USB ports, and of the effect, if any, that this had on the DAC output.

Note the reference to a decent power supply, though: there are power supplies on the market that shouldn't be (safety, RFI issues, etc.), and they aren't always easy to spot. Dongle-style DACs are a bit different - they may not have the space for good regulation or filtering, and can be more sensitive to USB power variations.
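For a rough sense of scale on why a well-designed DAC shrugs off supply noise, here's a back-of-envelope calculation. The 100 mV ripple and 80 dB rejection figures are assumptions for illustration, not measurements of any product:

```python
ripple_in_mv = 100.0  # assumed ripple/noise on a cheap 5 V USB supply
psrr_db = 80.0        # assumed regulator + circuit supply rejection at that frequency

# Convert dB of rejection back to a linear attenuation factor
ripple_out_mv = ripple_in_mv * 10 ** (-psrr_db / 20)

print(f"{ripple_out_mv * 1000:.1f} uV reaching the audio circuitry")  # 10.0 uV
```

Ten microvolts against a ~2 V full-scale output is around -106 dB, which is why swapping an in-spec supply on a competently designed desktop DAC usually changes nothing measurable.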
I suspect grounding issues (loops, leakage currents) may be responsible for audible noise that's misattributed to other things. Changing a power supply might inadvertently fix this in some cases.
Jitter is to do with variation in time between data packets arriving. This can lead to packet loss. If you had a problem with it, you'd know about it (choppy audio etc.).
Not unless someone's redefined it while I wasn't looking. It sounds like you're talking about an interrupt latency problem in the computer - where the computer takes too long to respond when the DAC requests the next chunk of audio data, and the DAC runs out of buffered data before it gets delivered.