
Where does EMI/EMC, leakage current, common mode noise and other considerations belong on the "audibly the same" discussion?

I agree with that and have asked Amir to put 20-year-old audibility lines on the appropriate graphs in some of my other posts. He does call it out when bad test data approaches the audible range, but because it rarely does, that is uncommon. SINAD, for example, becomes a concern somewhere below 70 dB, but rarely is any modern device that low. I cannot tell you what study that threshold was derived from, and that is a general omission at ASR.


Measuring the leakage current magnitude is fairly straightforward. Product safety laws (obsessively) regulate the maximum amount of "touch current", but that is for human safety.
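As a rough sketch of where that leakage typically comes from: in Class II gear it is usually dominated by the Y-capacitor in the mains filter, whose reactive current is simply I = V · 2πfC. A quick back-of-envelope in Python (the 2.2 nF value is a typical illustrative choice, not from any specific product):

```python
import math

def y_cap_leakage_ma(v_rms: float, freq_hz: float, cap_farads: float) -> float:
    """RMS current (mA) through a line-to-chassis capacitor: V * 2*pi*f*C."""
    return v_rms * 2 * math.pi * freq_hz * cap_farads * 1000

# Typical 2.2 nF Y-capacitor on 230 V / 50 Hz mains
print(f"{y_cap_leakage_ma(230, 50, 2.2e-9):.3f} mA")  # ~0.159 mA
```

That lands comfortably under the common 0.5 mA touch-current limit, which is exactly the point: the limit is set for safety, not for audio.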

Measuring how a particular device is affected would be doable in a controlled setting; but attempting to then predict how it would be with an endless variety of other equipment connected to it would be hideously complicated. And in the end, not sure it will amount to any real benefit.

The engineer in me says a certain level of assured immunity should be mandated by standards. The realist in me says that'll never happen.
The accountant in me says that would drive up the cost of everything. The consumer in me says I'm still ticked off that you can buy a mainstream device, use it with mainstream equipment as instructed, and still have noise-related issues in 2025.
 
Well done OP, you have used a USB isolator to remove the (slightly) audible noise from a mouse being operated.

Now explain how a meter or so of mains cable achieves this.

"There is an intangible sense of musical flow. The more microscopic my mind wanted to get in deciphering it, the more baffling the way Leviathan evokes aural magic directly connecting to the right part of the brain. It was not about separating the left and right hemispheres of the brain, but, strange as it may sound, it was about synchronizing the juxtaposition. A rare encounter that definitely deserves high praise."

and

"The unbridged music energy and power balance continued into the low-frequency response, where the Leviathan not only maintained the low-frequency content with authority but also enhanced the spectral sound quality due to the Pysho acoustics (proper lower foundation extends the upper ranges). Not in an artificial way, but in a fatigue-free, natural way, while guilelessly emphasizing the micro and macro details."

Are all of these incredible audible outcomes simply achieved by "filtering out" RFI?

REVIEW of Stage III Concepts Leviathan Power Cable here.


Or are we not discussing the effects of "high-end" mains cables any more....
Please go elsewhere with your disingenuous strawman.

In the case you are not being disingenuous; please purchase the above cables and ship them to me so I may scientifically characterize them.
Are you fine with the industry standard test method of IEC 60939-3, or are you an "American only" connoisseur who insists on using the obsolete MIL-STD-220B?
Is the upper frequency limit of 3.0 GHz of the ENA network analyzer sufficient, or should I also use a VNA to measure up to 18 GHz?
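For what it's worth, the insertion-loss figure that both of those test methods ultimately report falls straight out of a network analyzer's S21 measurement. A minimal sketch, with made-up S21 values rather than data from any real cable or filter:

```python
import math

def insertion_loss_db(s21: complex) -> float:
    """Insertion loss in dB from a measured S21 (linear, possibly complex)."""
    return -20 * math.log10(abs(s21))

# Made-up S21 readings at a few spot frequencies (not from any real device)
for freq_mhz, s21 in [(1, 0.98), (30, 0.5), (300, 0.05 + 0.02j)]:
    print(f"{freq_mhz:>4} MHz: {insertion_loss_db(s21):5.1f} dB")
```

The standards mostly differ in fixturing and source/load impedance, not in this arithmetic.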

The cable appears to have ferrites. I must confess I can only test for saturation up to around 750 Watts. Going by the cable's girth, I am sure that is not a sufficient amount.
May I outsource that measurement?
 
The consumer in me says I'm still ticked off that you can buy a mainstream device, use it with mainstream equipment as instructed, and still have noise-related issues in 2025.
Possibly you may have noise-related issues. It isn't a given, and things go well in many situations.

Some gear is more 'sensitive' to it than other gear; it depends on the ground-plane wiring (as well as routing) and the actual design itself, specifically the almost unavoidable coupling between digital and analog ground. These are usually coupled (cheap to make), and not every designer does this equally well. A few designs galvanically separate the digital and analog sections, but this is expensive. On top of that, the casing and where it is connected to the PCB, the source, and the power supply may all play a role.
This also doesn't mean expensive designs have done it properly either.

It's complicated, and it simply means that in some cases one may have to troubleshoot and perhaps buy some gadgets to make things work properly.
 
Please go elsewhere with your disingenuous strawman.

A straw man fallacy occurs when someone distorts or exaggerates another person’s argument, and then attacks the distorted version of the argument instead of refuting the original point. By using a straw man, someone can give the appearance of refuting an argument when they have not actually engaged with the original idea.

What is the strawman in my post? I'm simply asking for an explanation of the significant audible effects reported by the reviewer. Your foray into this forum was apparently to forward the idea that a meter of mains cable can make an audible difference to the music reproduced by a high-fidelity sound system. You have asserted that "common mode" noise is a significant issue that can be addressed by using a "high-end" mains cable. So far, the only detail you have supplied is the reduction of switching noise by using a USB isolator in your sound system.

Obviously I know I'm not going to get an explanation; magic-cable apologists always fall back on the "trust your ears" gambit, which simply leads down the rabbit hole of subjectivism. That's probably why this thread was started: there's no steam left in the thread asserting that mains cables cannot influence the operation of an audio amplifier, because the electronic reality is abundantly clear. No matter what EMI and RFI boogeymen you invoke, their effect (if there is any) is so microscopic that it is utterly inconsequential. None of your responses in that thread proved otherwise.
 
I apologize if I incorrectly assumed it was directed to me.
It wasn't, since you clearly are not. Also why I carefully used the word "they" rather than "you" in an (apparently unsuccessful) attempt to ensure it wasn't misinterpreted.
 
I have not had such issues with audio gear except for some amps I used in a previous residence, where the problem was traced to improper grounding that a cable installer left behind. Is it really that big an issue for many? How can you test for the various problems specific households may pose?
 
The problem with PCs is where they are grounded (I mean where in the case, and what kind of case they use).
The nicer the case (aluminium, anodized, etc.), the more susceptible to problems they may be.

Half of my friends using such cases (me too) have solved the issue just by scraping the paint at the points where the PSU bolts to the case and using plain silver screws instead of the nice black-painted ones. Yes, sometimes it is that simple. Some sane cable routing helps too, as do PSUs with modular cables.

BUT, I absolutely agree that DACs should include galvanic isolation; the cost now simply cannot justify cutting these corners.
I can play around with the cables and the spaghetti and move the various interference products up or down by 20-30 dB at will while measuring, just from external factors, and that on a dedicated line. I have seen friends' rooms with shared lines where an LED or a motor somewhere in the house (3 rooms away!) just stopped some (admittedly cheap) DAC's playback with a pop.

Is it so difficult for audio devices to be well thought out and immune? It's a genuine question. The cost can't be that much (I exclude the half-baked ones and the cheapest).

Edit: an old trick I have learned for gauging the health of the grounding is measuring the interconnected devices while they are OFF (the whole system, end to end). If the noise floor there is nice, it will be way nicer when on. Don't ask me the science behind that; I have no idea.
 
The question of audibility from stray emissions is multivariate and beyond the scope of a single thread IMO.
I agree. Since every environment is different (even two apartments in the same block), singular approaches are unlikely to be universally effective.

I think @HaveMeterWillTravel's original posts ask interesting questions. Seeing things from a systemic point of view does call into question whether reviews of an individual device can ever safely describe its behaviour when inserted into many different systems in different noisy environments. However, what we do have is decades of real-world measurements where the dog doesn't bark and there are no measurable issues.
 
Of course, every environment is unique, and each combination of components can vary, but that doesn’t mean we can’t test the immunity or contribution of specific elements.

For example, consider a PC connected to a DAC via USB, with the DAC then connected to an active speaker or amplifier.

Most PCs have some level of noise on their USB output, and some PCs are grounded while others are not. What we don’t know about your specific PC is the USB port’s impedance to ground and the amount of noise present on the USB ground. Additionally, we don’t know the speaker/amplifier’s impedance to ground or its common-mode rejection characteristics—though we should know this for the gear being tested.

For a DAC, we can measure its impedance (versus frequency) from the digital input to the analog output.
A similar measurement can be applied to an interconnect, where both common-mode and differential-mode impedances can be quantified.

Some DACs have significantly higher impedance between their input and output than others.
In systems where a PC has a low-impedance USB ground, this can affect overall system noise.

Interconnects or "cables" exhibit transfer impedance, common-mode impedance, and differential-mode impedance, all of which can be measured.

With this data, while we can’t make absolute statements about the noise (since we don’t know the magnitude or impedance of the noise source), we can make relative statements about whether a cable or DAC performs better or worse, and estimate how much noise is present in the system.

Since we can never know the exact magnitude of noise and impedance for every USB port, I would suggest we assume and test for worst-case scenarios rather than best-case ones.

For example, by many standards, Class II appliances with a touch current of 0.5 mA are considered nominal. So, assuming a USB source with high impedance to ground, the influence of a 0.5 mA current should be taken into consideration.
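To make the worst case concrete: if the full 0.5 mA leakage current has to return through the interconnect's ground/shield, the hum voltage it develops is simply I × Z. A minimal Python sketch (the shield impedances are invented illustrative values, not measurements of any real cable):

```python
def worst_case_hum_uv(i_leak_ma: float, z_shield_ohm: float) -> float:
    """Worst case: the full leakage current returns via the interconnect's
    ground/shield, developing a hum voltage in series with the audio signal.
    mA * ohm = mV, so multiply by 1000 to get microvolts."""
    return i_leak_ma * z_shield_ohm * 1000

# Hypothetical 0.5 mA Class II leakage through interconnect shields of
# 0.05 ohm (sturdy shield) and 1 ohm (thin shield / poor contact)
for z in (0.05, 1.0):
    print(f"Z = {z:>4} ohm -> {worst_case_hum_uv(0.5, z):.0f} uV of mains-frequency hum")
```

Relative to a 2 V RMS full-scale output, 25 µV is about -98 dB while 500 µV is only about -72 dB, which is the sense in which relative statements about cables and DACs are possible even without knowing the actual noise source.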
 
A straw man fallacy occurs when someone distorts or exaggerates another person’s argument, and then attacks the distorted version of the argument instead of refuting the original point. By using a straw man, someone can give the appearance of refuting an argument when they have not actually engaged with the original idea.

What is the strawman in my post? I'm simply asking for an explanation of the significant audible effects reported by the reviewer. Your foray into this forum was apparently to forward the idea that a meter of mains cable can make an audible difference to the music reproduced by a high-fidelity sound system. You have asserted that "common mode" noise is a significant issue that can be addressed by using a "high-end" mains cable. So far, the only detail you have supplied is the reduction of switching noise by using a USB isolator in your sound system.

Obviously I know I'm not going to get an explanation; magic-cable apologists always fall back on the "trust your ears" gambit, which simply leads down the rabbit hole of subjectivism. That's probably why this thread was started: there's no steam left in the thread asserting that mains cables cannot influence the operation of an audio amplifier, because the electronic reality is abundantly clear. No matter what EMI and RFI boogeymen you invoke, their effect (if there is any) is so microscopic that it is utterly inconsequential. None of your responses in that thread proved otherwise.
You, sir, are either bordering on the delusional or delight in engaging in pointless sophistry.
The only individual here talking about Sankara stones is you.
Begone foul apparition.
 
Thanks to @sam_adams I found this with much pertinent information. The question of audibility from stray emissions is multivariate and beyond the scope of a single thread IMO.
Hence why I choose to consider a path from the opposite direction: Start with the known and well established/documented cases where the effects are audible and work backwards.

I almost instinctively separate out the cases of incompatibility between pieces of common audio equipment from those involving a very specific, system-external 'aggressor' (recent thread about dimmable LEDs).

The latter would be mostly pointless as there will never be any meaningful change without the proper regulations mandating EMC testing.

The former? That is where I think things may be more applicable to determine if a product is well engineered.
 
Is it so difficult for audio devices to be well thought out and immune? It's a genuine question. The cost can't be that much (I exclude the half-baked ones and the cheapest).
I think being interoperable with mainstream equipment (electrically, not meaning, say, digital encoding schemes) would be a net benefit to the consumer.

For example, measuring the CMRR of inputs. A question would be: at what level do you test? The most straightforward way would be to conduct a survey of existing equipment.

Not that I am necessarily proposing it be done. It's more about raising awareness and possibilities.
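For what it's worth, the figure of merit itself is straightforward: CMRR is the ratio of differential gain to common-mode gain, quoted in dB. A minimal sketch with invented gains (not measurements of any real input stage):

```python
import math

def cmrr_db(a_diff: float, a_cm: float) -> float:
    """Common-mode rejection ratio: differential gain over common-mode gain, in dB."""
    return 20 * math.log10(a_diff / a_cm)

# Hypothetical balanced input at unity differential gain, where a 1 V
# common-mode test signal leaks through at 100 uV vs 10 mV
print(f"{cmrr_db(1.0, 100e-6):.0f} dB")  # 80 dB
print(f"{cmrr_db(1.0, 10e-3):.0f} dB")   # 40 dB
```

The hard part isn't the arithmetic; it's agreeing on the test level, frequency, and source impedance, which is exactly where a survey of existing equipment would help.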
 
Is it so difficult for devices like audio to be well thought and immune?It's a genuine question.Cost can't be that much (I exclude the half baked ones and the cheapest) .
It is not really rocket science, but it requires knowledge and awareness on the side of the circuit designers -- one needs to think of complete user scenarios, not only the device in question in isolation under perfect lab conditions. And making excellent signal integrity cost-efficient is also not exactly trivial.
 
It is not really rocket science, but it requires knowledge and awareness on the side of the circuit designers -- one needs to think of complete user scenarios, not only the device in question in isolation under perfect lab conditions. And making excellent signal integrity cost-efficient is also not exactly trivial.
Given that the user scenario for a DAC is 99.9% to be connected to a PC/laptop, and given that those are all over the place, the logical thing to do is isolate this side.
OK for the cheapest bus-powered ones (though even those can have provision to be powered by a spare charger or something), but going up...
 
USB 2.0 HS iso chips are still at ~$5. And a low-leakage DC/DC is also at least ~$5. That is significant BOM cost for low-cost products.
OK, now a (maybe dumb) question I always had:
As far as integrating them goes, wouldn't it make more sense to isolate the internal I2S lines directly?
Cost aside, and I'm probably far behind in terms of evolution, but with something like the SI8065AA or a similar modern part with a high data rate? To my non-technical mind it seems like a more holistic solution.
 
As mentioned before, a complete galvanic isolation of the USB port is only about a ~5€ chip (for a DAC with an external PSU).
As Sokel mentioned, it's also possible to isolate only the DAC chip.

However, in many cases, full galvanic isolation is not required.
It would be sufficient to have a relatively high common-mode impedance between the USB (GND) and the analog-out GND.

This isolation can be implemented at two points: between the incoming USB and the internal digital logic, or between the internal digital logic and the DAC’s analog part.

Ideally, both.

Perfect isolation would mean "infinite impedance for all frequencies."
However, a more practical solution would be "high impedance (compared to ground loop and analog interconnect impedance) at the noisy frequencies, such as 1 kHz for USB."

Unfortunately, some DACs connect them directly together.
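The "high impedance compared to ground loop and analog interconnect impedance" point can be sketched as a simple series voltage divider: common-mode noise on the USB ground divides across the DAC's isolation impedance, the interconnect ground, and the amp's path to earth, and only the share across the interconnect ground sums into the audio. All the values below are invented for illustration:

```python
def interconnect_noise_uv(v_cm_mv: float, z_dac: float, z_ic: float, z_amp: float) -> float:
    """Series ground loop as a voltage divider: the common-mode noise voltage
    (mV) splits across the DAC's USB-to-analog impedance, the interconnect
    ground, and the amp's earth path; return the interconnect's share in uV."""
    return v_cm_mv * z_ic / (z_dac + z_ic + z_amp) * 1000

# Hypothetical 100 mV of common-mode noise, 0.1 ohm interconnect ground,
# 1 ohm amp-to-earth path; compare a direct connection vs 10 kohm isolation
for z_dac in (1, 10_000):
    print(f"Z_dac = {z_dac:>6} ohm -> {interconnect_noise_uv(100, z_dac, 0.1, 1):.1f} uV")
```

With these made-up numbers the directly-connected case leaves millivolt-ish noise on the interconnect ground, while a few kilohms of common-mode impedance at the noisy frequencies knocks it down to the microvolt level, which is why perfect isolation is not required.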
 