
How to accurately level match DACs before testing?

GabrielPhoto

Hi!
For fun I am comparing a WiiM Pro Plus optically connected to a Topping D50 III vs a WiiM Ultra using its integrated DAC.
One thing I noticed is that the Topping seems to play louder, so I immediately preferred that combination.
From my research, an SPL meter would not be very useful for this; instead I should use a multimeter and measure the RCA output.
The problem is, when I tried that by connecting my phone to the RCA inputs of both the Ultra and the Pro Plus (using a 100 Hz test-tone app on the phone), the Pro Plus measured 0.2100 V on my meter and the WiiM Ultra showed about 0.2950 V. First, I am not sure why I am not getting a louder output, and in any case I don't understand why the WiiM's output is showing higher.

Maybe using the RCA inputs is messing things up?
Tomorrow I am going to try a test tone from the USB drive that is shared between the two, to skip the RCA inputs completely.
Any suggestions besides that?
Thanks
 
Note that using the RCA input means you’re using the DAC of the iPhone dongle first. It will introduce an extra DA/AD step.

The voltage is probably low because it's not at full volume. That's fine. Just measure at the volume you'll be listening at.
 
Good point, although I use Android, not iPhones.
I will do a test tone directly tomorrow. I am expecting the Topping combo to have more output; otherwise, if that's not the case, then I just somehow like it more, but I swear I can hear that it's louder most of the time lol
 
Level matching (for each channel) should be within 0.1 dB, i.e. roughly a 1% difference in voltage.
Use a test tone between 100 Hz and 1 kHz and a voltmeter that has at least a 200 mV or 2 V AC range.
It is handy to use a test tone at -6 dBFS, as that results in about 1.0 V AC on the RCA output of most DACs.
 
That's really cool that you decided to do this! Very impressive.

But I'm not entirely sure what your question is.
What source do you typically use for listening to music?
 
If the same voltage goes into both, the devices' output should be the same, provided they are spec'd the same.
If not, search for possible settings that affect level, RC, EQ, etc.
 
Looks like I am good with mine:
AC Voltage (V): 750.0 V / 250.00 V / 25.000 V / 2.5000 V

AC Voltage (mV): 250.00 mV / 25.00 mV
 
Ok so confirmed.
The Topping RCA reads 2.5 V while the WiiM Ultra reads 2.0038 V.
I believe I read somewhere that the Topping has a 5 V output for balanced and 2.5 V for RCA, so the above seems to match that.
I found the option, moved it from 5 V to 4 V, and the RCA output then dropped to 2.000 V exactly.
I wanted to make them both exactly 2.0000 V, but that is as close as they can get for my test (although, according to ChatGPT, I won't be able to tell the difference between 2.000 V and 2.0038 V anyway since it's so small), so at least now I know why I was favoring the Topping... or I am guessing that was the reason.
We shall see when I do some blind testing tonight, but at least now I know it will be fairer.
 
As expected... not so easy to tell them apart now after making them match. I still feel the Topping is my favorite, but it may be reverse bias (I wanted to prefer the Ultra lol).
Time to set up a blind test to find out for sure, but that 2.5 V vs 2.0 V difference was clearly shifting the balance in favor of the Topping.
 
What caused you to dive into testing these devices particularly?
 
Since people most probably run a DAC at its max volume (unless there is clipping in the sound, etc.), when comparing two or more DACs at max volume in normal listening they might prefer the one with the "louder" sound, based on this practical reality.
 
I don't know if there is any specific reason for this, but it sure seems like a nice tactic to "win" comparisons if people do not level match the DACs before comparing lol
In my case my AVR doesn't need the extra output, but I suppose for some people it will be beneficial.
 
Well, for a group of DACs that have similar measurements, the one DAC that seems louder than the rest could "win" and be a no-brainer in terms of purchase decision.
 
Which is my point: a sneaky little tactic, just enough to get the advantage.
I guess the question is why others don't do the same, or even more, to get the upper hand during uncontrolled comparisons.
 
I think there are pros and cons to increasing the DAC voltage beyond the standard, i.e. 2 V for single-ended and 4 V for balanced. Obviously, with certain music sources that already have extremely high audio levels, a DAC that exceeds the standardized level can clip with these files. For files that have a large dynamic range, the higher DAC level will not affect them. I suppose with modern pop music and some remastered material that employs high audio levels, using a non-standardized high output level is not encouraged, but hey, the DAC has the option to use the standardized level, right?
 
I would have thought that the small differences in DAC output voltage make no difference at all when audiophiles do completely uncontrolled comparisons. This is where the master volume is set randomly during the listening, depending on how the person feels about the particular track at that particular time.

Sure, if a comparison was made where the master volume was left unchanged then output level differences would matter.

I would attribute most uncontrolled listener preferences to biases, not SPL, as the latter is never close to being controlled.

I would like to know how the OP carries out the blind test after level matching. A well-documented process will have value, and if it's simple enough we might be able to encourage others (for example the lunatics on the Wiim Streamer Fanatics Facebook group) to do their own tests.
 
The difference between 2.5 V and 2 V is not small at all. It corresponds roughly to a difference of 1.9 dB.

It is very well established that a difference of level as small as 0.2 dB can be reliably detected by many people in sighted listening as well as blind tests as a difference in the quality of the sound. I have experienced that for myself.

Adjusting the level "by ear" to make sure that a new device plays at the same level as a previous one is not reliable enough. Instrumentation should be used for that purpose. Fortunately, a suitable voltmeter and a test tone are all that is needed.
 
We agree.
The point I was making is that most subjective conclusions made by audiophiles are not the result of any type of A/B test, but come simply from listening to the new device over extended periods and building up a set of opinions, compared against what they believe the old device sounded like from memory.
 