
Amplifier distortion testing using a modestly priced audio interface

I knew it was a risk to connect the V3 Mono directly to the 2i2 input, but as long as the V3's power is controlled it's safe enough; it just takes care. REW has the benefit of aborting a distortion test if the feedback signal exceeds a specified distortion level, 1% by default, which I left unchanged. It has stopped many times during tests (even loopbacks can trigger it), so I felt safe enough without a voltage-divider probe or safety caps. The Monitor1 is used solely to set the 2i2 feedback voltage to the "sweet spot", the region of lowest inherent 2i2 distortion. The Monitor1 also helps with safety, since it can be set to 0 at the start, preventing any excess voltage from reaching the 2i2 if something is done wrong. So far so good; nothing damaging has occurred. For higher-power amp tests a single voltage-divider probe can be placed in front of the Monitor1 to bring the voltage down while still leaving the Monitor1 to adjust for optimal input to the 2i2 (or whatever audio interface is used).
The DC voltage at the output of the amplifier always sits at half the supply voltage with respect to ground, even at 0 watts.

The outputs of the amplifiers are bridged.

That is to say, the speaker is placed between two amplifiers whose output voltages are in phase opposition.

So be careful: if you stay with a 32 volt power supply you will have 16V on the inputs. I don't think that poses a problem.

On the E1DA ADC, don't forget the safety capacitors.
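The arithmetic above can be sanity-checked in a couple of lines. This is a minimal sketch assuming an ideal Class D bridge on the 32 V single supply from the example:

```python
# Sanity check of the bridged (BTL) output DC levels described above.
# Assumes an ideal Class D bridge on a single 32 V supply.
V_SUPPLY = 32.0  # single-rail supply voltage, volts

# Each half-bridge idles at half the supply with respect to ground,
# even at 0 W output:
v_out_pos = V_SUPPLY / 2  # 16.0 V DC on the + output
v_out_neg = V_SUPPLY / 2  # 16.0 V DC on the - output

# The speaker sits between the two outputs, which swing in phase
# opposition, so the DC across the load itself is zero:
v_across_speaker = v_out_pos - v_out_neg

print(v_out_pos)         # 16.0 -> this common-mode DC is what a probe sees
print(v_across_speaker)  # 0.0  -> no net DC across the speaker
```

This is why a ground-referenced measurement input sees 16 V of DC even though the speaker sees none, and why blocking (safety) capacitors are being suggested.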
 
I initially re-tested the V3 Mono at the setting Amir reported, 0.421V input. For his 4 ohm test load this was 5W output from the V3; for my 8 ohm load it's 2.5W. This was done to have a reference. As he said, distortion should be lower for an 8 ohm load.
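The 4 ohm vs 8 ohm power figures follow directly from P = V²/R at a fixed output voltage; a quick check of the numbers quoted above:

```python
import math

# At a fixed amplifier output voltage, power scales inversely with the
# load: P = V^2 / R. The reported 5 W into 4 ohms implies the output
# voltage, and the same voltage into 8 ohms gives half the power.
P_4OHM = 5.0                      # W, reported output into 4 ohms at 0.421 V input
v_out = math.sqrt(P_4OHM * 4.0)   # amplifier output, ~4.47 Vrms
p_8ohm = v_out ** 2 / 8.0         # same drive into 8 ohms

print(round(v_out, 2))  # 4.47
print(p_8ohm)           # 2.5
```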

Part of this was to show how the distortion differs depending on the feedback voltage into the 2i2. The black trace is the optimal one, with the 2i2 input at -15.03dBFS. The red one is a slight change, a bit lower at -15.35dBFS. That doesn't seem like much, but the H2 measurement is about 6dB higher. The green one was more extreme, a much higher level yet still well within the 2i2's input range at -11.98dBFS. Its H2 is more than 25dB above the optimal "sweet spot" for the 2i2. The only change was the level set by the Monitor1; the 2i2 output remained constant. This, like all of my more recent measurements, was with the 2i2 output at maximum and the input gain at 10dB. The REW generator setting was constant.
V3M 1kTone into 8-ohms Monitor1 Various 2i2 Feedback Settings 192kHz Ouput Max Input 10dB Gain...jpg
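The dBFS levels quoted above differ by only fractions of a dB to a few dB, which is tiny as a voltage ratio; that is what makes the H2 sensitivity notable. A quick conversion, purely illustrative:

```python
def db_to_ratio(db):
    """Convert a level difference in dB to a linear voltage ratio."""
    return 10 ** (db / 20)

sweet = -15.03  # optimal 2i2 input level, dBFS (black trace)
red = -15.35    # slightly lower level (red trace)
green = -11.98  # higher level (green trace)

# Voltage ratios relative to the sweet spot:
print(round(db_to_ratio(red - sweet), 3))    # 0.964 -> under 4% lower voltage
print(round(db_to_ratio(green - sweet), 3))  # 1.421 -> about 42% higher voltage
```

A change of under 4% in input voltage moving H2 by ~6 dB is the point of the sweet-spot hunt.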


Ignore the glitch at about 65Hz. I placed a 12V fan underneath the V3 lying flat, partly as a test. I may report results in the V3 users thread, but suffice it to say that the temperature on the top of the V3 was dramatically reduced under constant 5W output.
 
The E1DA ADC has a 43Vrms input
Thanks for the reminder. It prompted me to go back to a web site review of it with a good set of tests. There is a distinction between the XLR and TRRS inputs with regard to voltage. The 0dBFS point for both is set by the switches on the underside, with a maximum ranging from 1.7 to 10V as labeled. Those same switches set the TRRS maximum. The review quotes feedback from the developer:
Ivan clarified that the actual range for the "Aux" jack correlates with the dip switches for the XLR. So at XLR 1.7V setting, the "Aux" port will hit 0dBFS at 34.4Vrms. At 10V XLR setting, 0dBFS is at 43Vrms
In my case I will connect the V3 to the Monitor1 with its output going to the ADCiso. This should provide enough protection as long as I'm careful not to crank the V3 up too high, or I connect a divider probe at some point. Of course I could use the TRRS input, but I don't plan to do very high power testing.
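Taking the quoted developer figures at face value, the "Aux" full-scale voltages translate into substantial amplifier power before the ADC clips. A rough check; the 8 ohm load is my own assumed example, not from the quote:

```python
# Rough headroom check using the 0 dBFS "Aux" voltages quoted above.
# These figures come from the quoted developer comment, not a datasheet,
# and the 8 ohm load is an assumed example.
AUX_FS_AT_1V7 = 34.4  # Vrms at 0 dBFS with XLR switches at 1.7 V
AUX_FS_AT_10V = 43.0  # Vrms at 0 dBFS with XLR switches at 10 V

def power_at_full_scale(v_rms, load_ohms):
    """Amplifier output power that just reaches 0 dBFS on this input."""
    return v_rms ** 2 / load_ohms

print(round(power_at_full_scale(AUX_FS_AT_1V7, 8), 1))  # 147.9 W into 8 ohms
print(round(power_at_full_scale(AUX_FS_AT_10V, 8), 1))  # 231.1 W into 8 ohms
```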
 
The DC voltage at the output of the amplifier always sits at half the supply voltage with respect to ground, even at 0 watts.

The outputs of the amplifiers are bridged.

That is to say, the speaker is placed between two amplifiers whose output voltages are in phase opposition.

So be careful: if you stay with a 32 volt power supply you will have 16V on the inputs. I don't think that poses a problem.

On the E1DA ADC, don't forget the safety capacitors.
This raises a question. Why would this admonition not also apply to all audio interfaces that may be used, if applicable?
 
This set of measurements was taken first without and then with a fan placed underneath the V3 Mono. It demonstrates two things. First, the results of using this measurement scheme at the 5W level into the 8 ohm load. I may at some point build a 4 ohm load to compare the difference Amir noted. For now this suffices for testing the scheme.

Second, though not directly related to the test scheme, the V3 Mono's distortion changes a fair amount with its temperature. While running a constant 5W over a length of time may not directly reflect normal use with music content, it does show that the distortion keeps changing well beyond a five-minute "warmup" prior to testing. It takes considerably more time to reach thermal equilibrium.

These were taken in succession, starting after an overnight cooling period. First up are those without a fan. I took temperature readings at the top center of the V3 laid flat. Without the fan it was on supports to provide clearance for heat radiation underneath. Times are approximate.

Five minutes:
V3M 1kTone 48kHz SR 256k FFT No Fan 5 Minutes.jpg

2+ hours:
V3M 1kTone 48kHz SR 256k FFT No Fan 2+ Hours.jpg

Overlay of both:
V3M 1kTone 48kHz SR 256k FFT No Fan 5 Minutes vs 2+ Hours.jpg

This shows a significant change in distortion with temperature. However, all of it is below the threshold of hearing. One thing I found curious was the decrease in all even-order distortion with heat versus the increase in H3 (the latter as expected). There is some variance in levels between measurements, so small differences should probably be ignored, but H2 and H3 each changed about 6dB, in opposite directions. That still makes me a bit suspicious of these results. Repeatability isn't as good as I would like, but that occurs with REW measurements in other situations too.

Edit: That's not to say that REW has an issue. Just an observation that could be due to the 2i2. Long averages settle down to similar results, short ones not always.
 
This raises a question. Why would this admonition not also apply to all audio interfaces that may be used, if applicable?
It is mainly the technique that guides what we do.

As a general rule a signal is referenced to ground (zero), but for power amplifiers, to avoid needing either a symmetrical (dual-rail) power supply or an output capacitor, this bridged technique is used with a single supply, which eliminates that capacitor.

It is always used in car radios, even with analog amplifiers.
 
It is mainly the technique that guides what we do.

As a general rule a signal is referenced to ground (zero), but for power amplifiers, to avoid needing either a symmetrical (dual-rail) power supply or an output capacitor, this bridged technique is used with a single supply, which eliminates that capacitor.

It is always used in car radios, even with analog amplifiers.
My question has more to do with this specific case, but it is general in nature. If the concern applies to the use of the ADCiso with the V3 Mono, does it not also apply to the use of the Scarlett 2i2 (in my case) or any other audio interface when used with the V3 Mono (or any Class D amp)? If not, why only the concern with the ADCiso?
 
Mind posting your results?

Keep in mind that you average audio interface may not be designed to handle the output of a Class D power amp which tends to be rich in ultrasonics that could wreak all kinds of havoc (both in the analog stages and on the ADC side - I would give 96 kHz with a 128k FFT a shot as well). Audio Precision specifically recommends using a steep lowpass filter for their gear. As an aside, HF distortion in your mic preamp is not exactly great either, so any wideband testing would show up its limits more easily.
Maybe DavidR could plug in a Class AB amp, test it, and compare? If it produces the same havoc, then that's the answer. Is there any havoc now?

Yes, I will be posting. The problem is, again, REW, as in my other thread. I restarted REW, and the measurements are now more what I expected, so I'll need to make new tests. I think it's tied to REW switching between sample rates. Then again it may not be REW; it could be the ASIO driver for the Scarlett, and I can't do a process of elimination to find out. I will probably post later today. Your comment about the filter is a good point. I've read that elsewhere, but wondered if the V3 Mono is less prone to that issue given its PFFB implementation.

I was also trying 192kHz and 96kHz. REW would not complete a long average or a successful S-THD, part of what prompted me to restart it.
First, good job with your measurements!:)

If you don't have one, buy an old Class AB amp from a well-known brand such as Pioneer, NAD or Yamaha, just to name a few, and carry out test measurements to compare against the Class D measurements you made. I'm not thinking directly of SINAD in itself (although it's fun to share the results) but more of the set-up of the test procedure itself. Learn, test, measure, compare, practice, that is.:)
If you buy a well-known brand/model, you can sell it on for what you paid. That is, if you want to test some old Class AB amp.

Why not buy an old NAD 3020? Just to test. :) That in itself will be something many will want to read about. You are testing a classic then.

The NAD 3020 is a stereo integrated amplifier by NAD Electronics, considered to be one of the most important components in the history of high fidelity audio.[1] Launched in 1978, this highly affordable product delivered a good quality sound, which acquired a reputation as an audiophile amplifier of exceptional value. By 1998, the NAD 3020 had become the most well known and best-selling audio amplifier in history.[2]

Even better, although maybe overkill and asking too much, buy two NAD 3020s and measure them. If they don't measure the same, we can have a discussion about why that is. And what does "the same" mean? How much difference, in percent, can you accept?

Edit:
Regarding this "the same". It was a bit OT but I thought I'd mention it anyway.:)
 
Maybe if DavidR plug in a class AB amp and test and compare? If it becomes the same havoc then that is? If there is any havoc now?
I do plan to do that. I have three multichannel Kenwood amps (KM-X1). I replaced the output relays and set the bias and offset (I have the service manual). Despite their age they still perform well. All had zero DC offset on all channels, and the bias adjustments were small except on one amp. These were THX compliant; I assume that has something to do with their quality. I bought them used on eBay when they were 20+ years old because they were in my budget at the time. One has seen nearly daily use by me for many years. My goal is to make distortion tests on all channels at some point, but that will have to wait.
First, good job with your measurements!:)
Thanks for the encouragement. I value all feedback.
 
One more measurement before I move to the E1DA. This is a comparison of the V3 Mono IMD at 5W and (accidental) 11.5W. The latter is rather good, but I'm very disappointed in the 5W results. Maybe that is impacted by the absence of a lowpass filter. I'll probably build one to see what difference it makes. Given that all of my tests were made without one, I'm curious whether it's needed for my usage, as I have no need for sample rates higher than 48kHz. However, I think I'll run some tests at 192kHz.
V3M MultiTone 48kHz SR 256k FFT 5W Fan 72 Deg vs 11.5W Fan 89 Deg.jpg
 
I do plan to do that. I have three multichannel Kenwood amps (KM-X1). I replaced the output relays and set the bias and offset (I have the service manual). Despite their age they still perform well. All had zero DC offset on all channels, and the bias adjustments were small except on one amp. These were THX compliant; I assume that has something to do with their quality. I bought them used on eBay when they were 20+ years old because they were in my budget at the time. One has seen nearly daily use by me for many years. My goal is to make distortion tests on all channels at some point, but that will have to wait.

Thanks for the encouragement. I value all feedback.
Do it at your own pace and most importantly: Have fun. :)
 
I have a question for the experts with regard to Class D amp testing. Is a lowpass filter recommended for all sample rates, or can one dispense with it for sample rates of 48kHz and below?
 
I have a question for the experts with regard to Class D amp testing. Is a lowpass filter recommended for all sample rates, or can one dispense with it for sample rates of 48kHz and below?
It's more to do with the high-frequency switching noise causing overload or slew-rate limiting in the ADC front end. It should always be used whether you are testing at 44.1, 48, or 96kHz, because without it you can get fake distortion and noise results in the audio band.

 
I have a question for the experts with regard to Class D amp testing. Is a lowpass filter recommended for all sample rates, or can one dispense with it for sample rates of 48kHz and below?
For measuring a Class D amplifier it is practically necessary to have an anti-aliasing filter, both to prevent the amplifier's HF noise from polluting the measurement and to prevent high harmonics from damaging the ADC input.

In all measurement cases, a filter is always placed before the sampler to remove any content above Fs/2.

For audio-band work, a 20kHz to 40kHz filter is all that's needed.

Attached is what AP offers in its filters.

But making a passive filter as steep as AP's is difficult.
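To get a feel for how hard it is to match the AP filters passively, here is a first-order RC sketch; the component values and the 450 kHz switching frequency are illustrative assumptions, not a vetted design:

```python
import math

# Single-pole RC lowpass as a minimal Class D measurement filter sketch.
# Values are illustrative only; one pole rolls off at just 6 dB/octave,
# far gentler than the steep AP-style filters discussed above.
R = 1_000.0  # ohms, series resistor (assumed value)
C = 4.7e-9   # farads, shunt capacitor (assumed value)

fc = 1 / (2 * math.pi * R * C)  # -3 dB corner frequency

def attenuation_db(f):
    """Magnitude response of the single pole at frequency f, in dB."""
    return -10 * math.log10(1 + (f / fc) ** 2)

print(round(fc / 1e3, 1))               # 33.9 -> corner near 34 kHz
print(round(attenuation_db(450e3), 1))  # -22.5 dB at an assumed 450 kHz switching rate
```

Only ~22 dB of suppression at the switching frequency from one pole, which is why steeper multi-pole LC designs come up for this job.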

 
I have a question for the experts with regard to Class D amp testing. Is a lowpass filter recommended for all sample rates, or can one dispense with it for sample rates of 48kHz and below?
I'm no expert but the following links might help.

It is not easy to measure correctly on your own, but if you keep doing it, your technical understanding will deepen.
Measurement results that are to be posted on the Internet or shown to third parties need to be more accurate and easy to understand, which takes more effort. Good luck.
 
Thanks for all the input. I've reviewed some of it. The elaborate designs are beyond my needs, but I'll make a version of a simple balanced lowpass to use. I never intended to test more than a couple of balanced amps, but evidently it takes a fair amount of effort to properly test even one. This has become more of an academic and learning exercise for me.

I have moved on to the E1DA after having some initial success in the 2i2 loopback calibration. The ADCiso was set for 4.5V default, but it was measuring better with a lower input voltage (Monitor1 set low), so I changed the switch settings to the lowest one, 0dBFS at 1.7V thinking it to be more accurate for low voltage input. However, the results were garbage. I next ran an REW calibration on it. Also garbage. This was puzzling so I connected my DVM and found the 2i2 output voltage (on the unused output in fact) was low. It went to the expected output when I unplugged the ADCiso. A direct 2i2-ADCiso calibration was absolutely perfect.

I reinserted the Monitor1 between 2i2 and ADCiso and tested at reduced Monitor1 settings until the cal was good, this being at 50 or below. I didn't realize how much the ADCiso input impedance changes with the switch settings. The 2i2 is fine with the Monitor1 alone in a feedback loop because the 2i2 input impedance is 60k ohms. But when the Monitor1 (10k ohms) is feeding the ADCiso this parallel combination is problematic depending on the ADCiso switches.

Any Monitor1 setting above 50 for that ADCiso switch setting causes the 2i2 output voltage to drop; it just can't drive an input much below 10k ohms. In most cases it isn't a problem.
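The loading effect is easy to see numerically. The ADCiso input impedance below is an assumed illustrative figure (it varies with the dip-switch setting, as noted above), not a measured or datasheet value:

```python
# Combined load seen by the 2i2 output when the Monitor1 and a
# low-impedance ADC range are both in the chain, per the parallel
# combination described above. The ADC figure is an assumed
# placeholder; the real value depends on the switch setting.
def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

R_MONITOR1 = 10_000.0  # ohms, Monitor1 impedance (as noted above)
R_ADC_LOW = 2_000.0    # ohms, assumed ADCiso impedance on a sensitive range

load = parallel(R_MONITOR1, R_ADC_LOW)
print(round(load, 1))  # 1666.7 -> well below the ~10k the 2i2 can drive
```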

This is the result of the 2i2-Monitor1-ADCiso loop calibration with the Monitor1 at 50 (or below). I set the vertical division to 0.1dBr for clarity. The ADCiso is impressive.
Scarlett 2i2-Behringer Monitor1-E1DA ADCiso Calibration REW -12dBFS Monitor1 50 E1DA in 48k 32...jpg


This is my measurement rig. I'm surprised at how well the PC handles REW. This is an old 4th gen laptop I bought in 2011. Dead (original) battery, replacements have been garbage, so it's running on the charger, yet no 60Hz impact.
Measurement Rig.jpg
 
I've been doing a lot of trial and error with the 2i2-Monitor1-ADCiso combination. It has confirmed that even with the ADCiso as the input, setting the input to a "sweet spot" produces the best results. I say "a" sweet spot because, at least in my testing, there can be more than one depending on the goal. I tested more than one ADCiso switch setting for 0dBFS. Some of my results are similar to other measured Stepped-THD vs Level results I've seen for the 2i2 as well as other audio interfaces. The HD components are low, but within their range they are somewhat erratic. Selecting an REW output dB at random produced somewhat "random" results, so finding the ADCiso "sweet spot" in this manner was not very reliable. Different output levels resulted in varied HD component values: H2 up with H3 down and vice versa, for example, or HD components low with high noise and vice versa. I found this totally unsatisfactory. It seemed like probing in the dark.

I tried another way. Stepped-THD vs Level provides a good way to view the relative distortion products as well as their absolute values for the supplied signal. The relative relationship was my focus. I ran the S-THD vs Level with a 1dB step. After examining it for relative H2/H3, I chose a point where they met at a fairly low level, without high THD+N, in the upper range of dBFS. Next I ran the single-tone THD with the REW output at that dBFS point. The single-tone HD components closely matched those at that level on the Stepped-THD graph: all of them, HD1-HD9 and noise. Then I did the same for a number of other dBFS points on the Stepped-THD, with the same outcome: single-tone results matching the HD components very closely. The problem was that there was often a large difference in results between level points. That is, the relative levels of the harmonics at one point were still very different from those at another. That's fine if one is only interested in learning about the audio interface itself. But for my use the interface is only a tool, and I want to use it at its optimum (sweet spot) if at all possible, to minimize its influence on subsequent device measurements, whether of amplifiers or raw speaker drivers. The problem I saw was that the 1dB step in the Stepped-THD was too choppy: the step was too big, and better levels might exist between the steps.

Next I took advice given by Rja4000 and made more tests, first the default, then with coherent averaging. The latter proved to be very useful.

So I ran a Stepped-THD of the 2i2-Monitor1-ADCiso with a 0.1dB step and coherent averaging (eight averages per step). On my old Gen 4 laptop this took quite a few hours, but REW did complete it. This provided much finer detail for choosing dBFS points for the single-tone test. It is in essence a balancing act. As will be seen in the graphs below, the relative HD components sometimes vary dramatically. The important thing is to choose a point that emphasizes (or minimizes) the distortion product(s) to be examined.
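The benefit of coherent averaging can be sketched with synthetic data: averaging N captures that are synchronized to the generator leaves the tone untouched while uncorrelated noise amplitude drops by sqrt(N), about 9 dB in power for N = 8. A minimal numpy illustration, not REW's actual implementation:

```python
import numpy as np

# Minimal sketch of why coherent averaging helps: averaging N captures
# that are synchronized to the generator leaves the tone untouched,
# while uncorrelated noise amplitude drops by sqrt(N).
rng = np.random.default_rng(0)
N_AVG = 8      # averages per step, matching the run described above
N_SAMP = 4096

t = np.arange(N_SAMP)
tone = np.sin(2 * np.pi * 64 * t / N_SAMP)  # bin-centered test tone

# Each "capture" is the same tone plus fresh uncorrelated noise:
captures = [tone + rng.normal(0.0, 0.1, N_SAMP) for _ in range(N_AVG)]
avg = np.mean(captures, axis=0)             # coherent (time-domain) average

resid_single = np.std(captures[0] - tone)   # noise left in one capture
resid_avg = np.std(avg - tone)              # noise left after averaging
print(round(float(resid_single / resid_avg), 1))  # close to sqrt(8) ~ 2.83
```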

Keep in mind that this is using the Scarlett 2i2 for output, certainly a limiting factor, yet with surprising results.

This is the 0.1dB Stepped-THD:
E1DA ADCiso  6.7V Setting Mono Mode Splitter Input Monitor 100 REW 48kHz 256kFFT 0.1 Stepped S...jpg


This is a very busy graph. I suspect the big steps are glitches in the long test run. To examine it I first disabled all but HD2-HD3, as those are most important to me; later I decided I should include up to HD9. After a couple of single-tone tests I expanded the graph range to make viewing easier, making dBFS points easier to choose within the range that tests had shown gave the best results. This is a partial screen capture so that the cursor position I chose is included.
E1DA  6.7V M100 0.1Step CA Subset.jpg


I had initially chosen -24.5dBFS due to the low value and the intersection of H2 and H3, but -21.0dBFS seemed best overall. Later I moved to -20.03dBFS. Those graphs are included further down. I may re-export the files with comments above the graph to make them easier to read.

This is -20.03dBFS (no coherent averaging):
E1DA ADCiso  6.7V Mono Mode Splitter Input Monitor 100 REW 48kHz 256kFFT Output -20.03 498mV.jpg

-20.03dBFS with coherent averaging:
E1DA ADCiso  6.7V Mono Mode Splitter Input Monitor 100 REW 48kHz 256kFFT Output -20.03 498mV C...jpg

Overlay of the two:
E1DA ADCiso 6.7V Mono Mode Splitter Input Monitor 100 2i2 48kHz Output Max REW 256kFFT Output ...jpg

It's not easy to see, but the HD components are nearly identical.

As a final note I ran some tests with the ADCiso switches set for 10V 0dBFS, but results were not quite as good. This was before I used the technique described above. I may test other switch settings, but it's very time consuming.

Next will be tests of the V3 Mono, again using the ADCiso in that loop. The focus of all the above was finding the "sweet spot". When testing an amplifier, the Monitor1 will let me set the amp probe feedback so that the input to the ADCiso lands precisely (or as close as possible) at the desired sweet spot.

Edit: I should point out that the dBFS values were all those of the 2i2 output, as it was easier to keep track of the tests that way. For tests of the V3 Mono I will need to determine the ADCiso input dBFS at those REW output values.

One more note. This is an ADCiso A grade.
 
I made a lot of Stepped-THD tests with the ADCiso at several ADC switch and Monitor1 settings. None were better than the 6.7V / Monitor1 100 results. What is a bit disappointing is that substituting the ADCiso for the 2i2 input provided little if any real improvement in the loopback distortion measurements. It may be due to the different clocks when using the ADCiso, whereas the 2i2-only loop has the benefit of a shared clock, allowing use of the rectangular window rather than Blackman-Harris 7. My other thought is that the limiting factor is the 2i2 output. My results for the 2i2 were better than Amir's for individual HD components (due to finding the sweet spot), but noise was far worse for the 2i2 loopback. But I'm not interested in testing for SINAD; HD components are my focus for my needs. At this point the 2i2 I/O with Monitor1 is providing equivalent or better response than the 2i2/ADCiso with Monitor1.
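The window point is worth illustrating: with a shared clock the tone can be generated to land exactly on an FFT bin, so a rectangular window shows essentially no leakage, while a small clock offset smears energy across nearby bins. A minimal numpy sketch; the 0.3-bin drift is an arbitrary example:

```python
import numpy as np

# Why a shared clock lets the rectangular window work: when DAC and ADC
# run from one clock, the test tone can sit exactly on an FFT bin and a
# rectangular window shows essentially no leakage. With independent
# clocks the tone drifts off-bin and a window such as Blackman-Harris 7
# is needed to contain the leakage skirt.
N = 65536

def leakage_db(cycles):
    """Leakage 10 bins away from a tone of the given cycle count, in dB."""
    x = np.sin(2 * np.pi * cycles * np.arange(N) / N)   # rectangular window
    spec = np.abs(np.fft.rfft(x)) / (N / 2)
    peak = int(round(cycles))
    skirt = max(spec[peak - 10], spec[peak + 10]) + 1e-30  # guard log(0)
    return 20 * np.log10(skirt / spec[peak])

print(round(float(leakage_db(1000))))    # far below -200: on-bin, no leakage
print(round(float(leakage_db(1000.3))))  # around -30: heavy leakage skirt
```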

With that out of the way I'll be making a lowpass filter for measuring the V3 Mono. But I am curious to know if it's really necessary as long as I'm only going to use it for 1W/5W testing. I've already run tests of the V3 with a direct probe, without any damage and with what appear to be good results.

E1DA Cosmos ADCiso vs Scarlett 2i2 Gen 4 ADCiso Mono Mode 6.7V 2i2 Input Gain 10dB.jpg
 