
Why an AC Power Cord cannot make a difference

They could, certainly, if they contain filters, are of inadequate gauge, or have specific ferrites fitted to them.
That, however, does not mean it will or can lead to audible issues.

All speaker and interconnect cables have different measured performance, which can even be system dependent; that does not mean it will always lead to audible issues.

Just stuffing a mains cable with a bunch of ferrites (for different attenuation in different spectra, for instance) will surely show up on an EMI receiver/analyzer, but I would not recommend doing that simply to 'avoid' possible problems.
They are expensive and most likely aren't needed anyway. When they are needed (for compliance), they will probably already be on the supplied cable(s).

Agreed that a lot of gear will (most likely) not comply with regulations. It could comply when used as intended but may not when connected to other gear or when using incorrect cables, power sources, etc.
And agreed, this whole thing is far from as simple as it seems, but the reality is that suitable mains cables rarely have any effect at all, not even a measurable one on the mains itself.
 
I agree: if you load the supply, you will see more ripple, but as far as mains artefacts go, you'd really need to be putting the PS under stress. This scenario is also very unlikely when we're considering domestic HiFi. I think the average is around 7-10 watts for normal listening, so most decent amplifiers will realistically be idling, let alone a high-end monster monoblock!

View attachment 421171
Phwoooaaarrr!
... but never forget Class A ;)
(or space heaters, depending upon one's perspective)
 
Yes, it causes a difference

Is that difference good/bad?

Does the $500 cable vs a $2 cable have that difference?

Most of the time we don't even get past the first question, let alone the second.
 
Of course not. But they block RF, both coming out of a device and ingressing into it. And RF contaminating a device can lead to demodulation, which can make the disturbance audible.
Can - of course, in theory.

I've yet to see it demonstrated in real-world use, though: a case where a ferrite has stopped RFI from being audible.
 
I mentioned the power supply's DC side, as this is the only connection the amp has to the mains. If there could be any influence from a mains cable or mains filter, it must show there. If not, there cannot be any audible difference. You cannot ignore this logic.
I'm quite sure that with all these expensive mains cables you will see no improvement and no "better sound" or whatever you call it.

I build improved power supplies for preamps and RIAA amps. I needed quite some time and experimentation until I got clean DC from them without any noise on top. If you don't believe it, put your analog scope in AC mode and increase the sensitivity. At some point you will see all kinds of spikes and other noise components dancing on top of the DC. Try to remove these! Different capacitors, coils and resistors can make them disappear, but this is not as easy as some here may expect. Just a huge cap doesn't help at all. What is interesting is that even caps of the same value have a different influence on the suppression, depending on how they are constructed. One hint: it's not the large caps that eliminate this noise, but the combination of different small values.
I did this by experimenting with a huge stash of different makes and types of parts. So there is no rule for how to do it; just try and see.
The noise in part makes its way even past the usual voltage regulators, while cleaning up the unregulated DC removed it from the regulated DC as well.

Was it audible? The problem is that you cannot pinpoint which part was responsible for the improvement. Any DC cleaning effort includes some increase in filter capacitance, so the reduced output resistance may have had more influence than the reduction in noise.
 
What you write is only right as long as the PS is not loaded. As soon as you place a load at the output, you will be able to see all kinds of noise dancing on the DC. The DC will not be a flat line any more. Of course, you can only see this with an oscilloscope in AC mode, preferably an older analog one. If a mains cable could influence the sound of an amplifier, this noise would be the first indicator of something changing. It is the only direct connection the electronics have to the outside mains. I'm quite sure this will not happen, so no mains cable will be able to improve an amp's sound.
Of course, you could use a resistive cable and find an improvement by swapping in a normal one without this restriction. So there are all kinds of ways to cheat in a public demonstration.

About my screened mains wire: this is to keep influence from the AC wires away from sensitive outside cables in direct proximity, not to improve the amp from the mains side!
It's called ripple voltage, and a regulator can easily remove it. It's possible for RF to get through the rectifier circuit as well, and that might affect what the supply powers. A regulator would not have a fast enough response to undo that.
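A back-of-envelope sketch of that ripple, using the standard full-wave rectifier approximation (the reservoir cap discharges at the load current between charging peaks, which arrive at twice the mains frequency). The load currents and capacitance below are illustrative values, not from any particular amplifier:

```python
# Approximate peak-to-peak ripple on an unregulated full-wave supply.
# Illustrative numbers only, not a measurement of any real unit.

def ripple_pp(load_current_a, mains_hz, cap_f):
    """Peak-to-peak ripple voltage: V ~= I / (2 * f * C)."""
    return load_current_a / (2 * mains_hz * cap_f)

# Lightly loaded vs. driven hard, same 10,000 uF reservoir cap, 50 Hz mains:
print(f"{ripple_pp(0.1, 50, 10e-3):.2f} Vpp at 0.1 A")   # 0.10 Vpp
print(f"{ripple_pp(2.0, 50, 10e-3):.2f} Vpp at 2.0 A")   # 2.00 Vpp
```

This is also why the ripple only "appears" once the supply is loaded, as discussed above: it scales directly with load current.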
 
Power supply noise is also decreased by the PSRR of the circuit and by decoupling/bypass caps in the circuit. So some noise from the PS is not a concern if the circuit is designed properly. And even if the PS is perfectly clean, RF noise can still get onto the rails, hence the use of decoupling caps right at the IC pins.
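As a rough illustration of how much PSRR alone buys you (the 60 dB figure below is a generic op-amp-class number for illustration, not a measured spec):

```python
# Residual supply noise at the output after PSRR attenuation.
# PSRR is a voltage ratio expressed in dB: attenuation = 10^(-dB/20).

def noise_after_psrr(supply_noise_v, psrr_db):
    return supply_noise_v * 10 ** (-psrr_db / 20)

# 1 mV of rail noise through 60 dB of rejection -> 1 uV at the output
print(f"{noise_after_psrr(1e-3, 60) * 1e6:.1f} uV")   # 1.0 uV
```

Note that PSRR typically falls with frequency, which is part of why local decoupling at the IC pins still matters for RF.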


Interesting to see TI change their recommendation from multiple bypass caps (to cover a large bandwidth) to just a single cap.
 
The classic example is an active cell phone near an audio component or one of the connected cables, back in the day of 2G/GSM with TDMA, which created RF bursts at 4.615 ms intervals (216 Hz "ticker noise").
 
A well-designed amplifier must have no issues and no audible effects from the speakers even if you put a cell phone on top of it and let it receive or make calls. The same applies if the phone is near signal cables. At least, that is what I want to be fulfilled.
 
The classic example is an active cell phone near an audio component or one of the connected cables, back in the day of 2G/GSM with TDMA, which created RF bursts at 4.615 ms intervals (216 Hz "ticker noise").
Yep - I remember that from cell phones in cars causing the radio to go nuts. Other edge cases are people who live with kW-or-higher radio transmitters on the other side of the garden fence.

I'm still unconvinced that a ferrite bead on a power lead would prevent that from happening.
 
Class I and II refer to the appliance, with the chassis (and the clearance/insulation) and so on. The power supply alone is not classified by these classes.
But an audio component can use an external Class II power supply, to avoid the need for safety testing.
 
In the distant past, some printers and desktop computers came with shielded power cords, often with a ferrite choke. This was so the unit could pass the interference emissions tests.
A linear power supply can radiate noise via the power cord: switching noise from the bridge rectifier, often at maybe 6 kHz. A resistor/capacitor snubber across each bridge diode can reduce this noise.
A poorly designed switch-mode power supply can radiate power-line noise at a much higher frequency.
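For anyone curious how such a snubber is sized, here is the common "match the ringing impedance" rule of thumb. The leakage inductance and ringing capacitance below are invented example values, not measurements of any real transformer or diode:

```python
# Quick-and-dirty RC snubber sizing for a rectifier diode.
# L_leak and C_ring are assumed illustrative values; in practice you
# would measure the ringing frequency on a scope and work backwards.
import math

L_leak = 50e-6    # transformer leakage inductance (H), assumed
C_ring = 200e-12  # diode junction + winding capacitance (F), assumed

f_ring = 1 / (2 * math.pi * math.sqrt(L_leak * C_ring))
R_snub = math.sqrt(L_leak / C_ring)   # resistor matching the ringing impedance
C_snub = 3 * C_ring                   # snubber cap a few times the ringing C

print(f"Ringing at ~{f_ring/1e3:.0f} kHz; try R = {R_snub:.0f} ohm, C = {C_snub*1e12:.0f} pF")
```

The snubber dissipates the energy of the ringing each time a diode commutates, which is what keeps it off the power cord.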
 
If diode noise was making it onto the DC rails, you'd have a really bad power supply: either faulty or just poorly designed. I don't know another way to describe how the AC energy goes into the transformer, then into the rectifier bridge, then gets smoothed out by the caps and presents at the output terminals as DC. The output energy is completely different from the input energy; it has been PROCESSED. It's a one-way street: AC is transformed into DC, and the amplifier cannot be influenced by the AC or any slight defect in the AC waveform because it's GONE! In a properly designed linear DC power supply, no elements of the AC waveform will affect the DC at the output. This is why a meter or two of mains cable or a fuse cannot influence the performance of the audio amplifier. If you can't understand this basic principle, you have no business speculating or commenting on issues of an electrical nature.

View attachment 421138
If this were actually true, there would be no purpose for the various EMC conducted immunity test methods, such as IEC 61000-4-6, to exist.
Higher-frequency EMI will just cruise on through the above circuit and contaminate the +VE, GND, and -VE rails. Even with proper bypass capacitors, your typical op-amp can easily suffer RF rectification, which can easily be audible.

I tend to only look at power supplies which bear the CE or TUV stamp. Why? They must comply with EN 55032, which mandates that those products meet a certain level of EMC performance. Unfortunately, the US is way behind the rest of the 1st world on this.

Generally, the typical place to mitigate the issue is proper EMI/RFI filtering right at the power inlet port. In certain cases a ferrite may be required on the power cord itself. If that is the case, the manufacturer should either integrate it into the cable or supply the proper external one. But end users randomly adding their own ferrites tends not to work out so well.
 
Class I and II refer to the appliance, with the chassis (and the clearance/insulation) and so on. The power supply alone is not classified by these classes.

You will find that this information will clear out many pages worth of confusion in this thread

This is incorrect. There are many, many power supplies designed for integration into an "appliance" which have Class I or II ratings.
Why? A product which uses an already-certified power supply often does not require a retest. While undergoing CE, TUV, etc. safety certification, the manufacturer can take "credit" for the part's existing certification/approval to show compliance with 60601-1, EN 60601-1, ES 60601-1.
Selecting parts which are themselves already compliant and approved can make a massive difference in the time and cost of certification and approval.

Here is but one example:

Given that it's a through-hole PCB mounted part, it's clearly not typically used externally.
 
It may be apocryphal, but one of the early proponents of devices to lift RCA cables off the floor actually achieved a real and audible effect. However, instead of thinking about possible EMI interference, such as the building mains running right under his floor, he went down the path which leads to the Sankara stones.
 
Interesting to see TI change their recommendation from multiple bypass caps (to cover a large bandwidth) to just a single cap.
This is because real components have package parasitics which start to dominate above the self-resonant frequency.

By adding different values in parallel, you can target different ripple/noise bands.

(plot from http://electronics.stackexchange.com/questions/129888/parallel-capacitor-conundrum)

Very large C values will help provide the additional current peaks for deep bass beats but will do almost nothing at higher frequencies, such as the TDMA bursts mentioned earlier.
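A quick sketch of why this happens, using the series R-L-C model of a real capacitor from the plot above. The ESR/ESL figures are generic datasheet-style values chosen for illustration, not taken from the TI note:

```python
# |Z| of a "real" capacitor modelled as a series R-L-C (ESR + ESL + C).
# Above self-resonance the ESL term dominates and impedance rises again.
import math

def z_mag(f, C, esr, esl):
    """Magnitude of series R-L-C impedance at frequency f (Hz)."""
    w = 2 * math.pi * f
    return math.hypot(esr, w * esl - 1 / (w * C))

# 1000 uF electrolytic (high ESR/ESL) vs 100 nF ceramic (low ESR/ESL)
for f in (100, 10e3, 1e6, 100e6):
    z_big   = z_mag(f, 1000e-6, esr=0.05, esl=20e-9)
    z_small = z_mag(f, 100e-9,  esr=0.01, esl=1e-9)
    print(f"{f:>12.0f} Hz:  1000uF -> {z_big:10.3f} ohm,  100nF -> {z_small:10.3f} ohm")
```

The electrolytic wins easily at 100 Hz, but at 100 MHz its parasitic inductance makes it roughly an open circuit compared with the small ceramic, which is exactly the "different values for different bands" point.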
 
First, I can understand - though it seems a sledgehammer solution compared to more appropriate ones, such as downstream isolation or a balanced interconnect.


Second, I'm not so convinced. Ferrite-loaded mains cables are about RF emissions from the device. They are generally used to get a piece of kit through the conducted emissions test requirement, which only specifies limits from 150 kHz up to 30 MHz, and I don't think they do much until you get up from the bottom of that range.
They actually are used in some cases to pass immunity/susceptibility tests. But in that case, the manufacturer is supposed to supply them. And there are countless examples where mitigation was required for compliance testing but omitted from the shipping product.
They do little to nothing at audio frequencies.
While true, it's also mostly irrelevant.
If it were the same frequency simply being passed through the device, then that would be a lack of isolation. The manifestations of induced effects during conducted susceptibility/immunity tests are rarely at the test frequency.

For analog circuits, the baseline signal being monitored tends to be rather simple: a few DC input/output values and a few frequencies of a sine wave.

In the case of an audio amplifier, the test plan would most likely omit the DC, as it's not relevant to its performance. A reference signal source, itself immune to the RF, would be used to provide a simple sine wave at, say, 1 kHz, and maybe at 3 different amplitudes. A quality benchtop meter would be used to monitor the amp's output: RMS Vac, Vdc, and frequency.
The test signal may be injected onto the power cable via a few methods: a CDN (coupling/decoupling network), a bulk-current injection probe, capacitive coupling, etc.

From experience, as long as the amplifier does not become unstable outright, there will be no appreciable shift in frequency. At various test frequencies the RMS Vac will increase or decrease, and the output's DC value will wander around.
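A toy version of that monitoring logic, with invented limits and readings purely to show the pass/fail bookkeeping (this is a sketch in the spirit of the description above, not an actual IEC 61000-4-6 procedure):

```python
# Hypothetical "performance criterion" check: monitor the amp's 1 kHz
# output while the RF sweep runs, and flag any test frequency where the
# output drifts past an allowed band. All numbers are made up.

NOMINAL_VRMS = 2.000   # expected output at the reference drive level (assumed)
LIMIT_PCT    = 1.0     # allowed RMS deviation in percent (assumed)
DC_LIMIT_V   = 0.010   # allowed DC wander at the output (assumed)

# (test_freq_MHz, measured_Vrms, measured_Vdc) from the bench meter
readings = [
    (0.15,  2.001, 0.001),   # fine
    (13.56, 1.968, 0.004),   # RMS dips too far under RF injection
    (27.12, 2.002, 0.015),   # DC offset wanders out of band
]

def failures(readings):
    bad = []
    for f_mhz, vrms, vdc in readings:
        rms_dev = abs(vrms - NOMINAL_VRMS) / NOMINAL_VRMS * 100
        if rms_dev > LIMIT_PCT or abs(vdc) > DC_LIMIT_V:
            bad.append(f_mhz)
    return bad

print(failures(readings))   # -> [13.56, 27.12]
```

Note the failures are flagged at the injection frequency, but the disturbance itself shows up as low-frequency RMS/DC wander, matching the point that induced effects are rarely at the test frequency.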
 
So what magnitude are these artefacts? Microvolts... picovolts... I love how the cable crowd grasp at these minute defects and claim that they cause audible effects! Once again, the average individual participating in the HiFi hobby who purchases so-called "high-end" gear is well over 40 years of age and has unavoidable age-related hearing loss. No matter how you cut it, these individuals are not capable of detecting 10 picovolts of mains artefacts on a +/- 30V DC power supply. Equally, the mains cable will have absolutely no influence on the existence or rejection of these artefacts! I KNOW you WANT there to be an improvement in your system when you spunk away $5000 on a mains cable, but sorry, the physical and electrical realities absolutely kill the possibility that it does.
 
I would say only when needed. Gear should be immune and not emit too much (conducted as well as radiated).
To my knowledge, there are no regulations stipulating that consumer audio electronics must be fully immune and unaffected by EMI/RFI.
Well, technically there are, but with a caveat as wide as the Grand Canyon!

IEC/EN 55035:2017, Section 8.2 "Performance Criterion A", pg 24:

"8.2 Performance criterion A
The equipment shall continue to operate as intended without operator intervention. No degradation of performance, loss of function or change of operating state is allowed below a performance level specified by the manufacturer when the equipment is used as intended. The performance level may be replaced by a permissible loss of performance..."

Effectively it just means CE-marked equipment won't be damaged by the tests.

However when it comes to emissions there are limits for both conducted and radiated.

There really is no need for heroic measures like extra filters, ferrites, capacitors, shielded mains cables, etc. in most cases. One can even make things worse in some cases.
Also, not all ferrites are equal, and looping a cable a few times through one ferrite greatly improves its effectiveness.
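A quick illustration of the looping trick: a cable-mount choke's impedance scales roughly with the square of the number of passes through the core. The 100-ohm single-pass figure below is a typical datasheet-style value at VHF, used here purely as an assumption:

```python
# Why extra turns through a ferrite help: choke impedance goes roughly
# as N^2, since both flux linkage and induced EMF scale with the number
# of passes through the core. (Single-pass impedance is an assumed value.)

def ferrite_z(z_single_pass_ohm, turns):
    """Approximate impedance of a cable looped 'turns' times through one core."""
    return z_single_pass_ohm * turns ** 2

for n in (1, 2, 3):
    print(f"{n} pass(es): ~{ferrite_z(100, n)} ohm")   # 100, 400, 900 ohm
```

In practice the gain is somewhat less than N-squared at the highest frequencies, because the extra turns add winding capacitance, which is another reason "not all ferrites are equal".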

EMI/RFI mitigation is most definitely a case of "a little knowledge is dangerous".
 