To be clear, you'll need about 0.5% accuracy to get it within 0.1 dB.

> To be clear, you'll need about 0.5% accuracy to get it within 0.1 dB.

Yep.

I can sort of trust this particular one, because the results are normally consistently repeatable, but I wouldn't for a serious application.

> To be clear, you'll need about 0.5% accuracy to get it within 0.1 dB.

No. You need 0.5% precision, which again is trivial; even the $10 Harbor Freight DMMs will give you that.

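If anyone wants to sanity-check the arithmetic, here's a rough back-of-the-envelope sketch (plain Python, with made-up example voltages) of why roughly 0.5% repeatability is enough for a 0.1 dB match:

```python
import math

def db_difference(v1, v2):
    """Level difference in dB implied by two voltage readings."""
    return 20 * math.log10(v1 / v2)

# Worst case: both devices are truly at 1.000 V, but one meter reading is
# 0.5% high and the other 0.5% low.
err = 0.005
worst_case = db_difference(1.0 * (1 + err), 1.0 * (1 - err))
print(f"Worst-case apparent mismatch: {worst_case:.3f} dB")      # ~0.087 dB

# Conversely, a true 0.1 dB level difference corresponds to this voltage ratio:
ratio = 10 ** (0.1 / 20)
print(f"0.1 dB = {100 * (ratio - 1):.2f}% voltage difference")   # ~1.16%
```

In other words, even with both readings off by 0.5% in opposite directions, the apparent mismatch stays under 0.1 dB.
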
> No. You need 0.5% precision, which again is trivial; even the $10 Harbor Freight DMMs will give you that.

Yes. Where one might run into trouble is making allowances for different levels. Say one device has far higher output, but it is a DAC with variable output. You measure it at one voltage, measure the other device at a different voltage, and then adjust levels without checking again. A really cheap meter may vary enough between, say, 0.8 volts and 1.6 volts to throw you off. Not common, but cheap meters can be this bad. So you measure DAC 1 at 1.6 volts and reduce it 6 dB, expecting it to match the 0.8 volts of DAC 2. The easy solution is, after the fine tuning, to check one more time at the same voltage for both sources.

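A minimal sketch of that trap, assuming (purely for illustration) that the meter reads about 3% low on the higher voltage:

```python
import math

def level_diff_db(v_ref, v):
    """Level of v relative to v_ref, in dB."""
    return 20 * math.log10(v / v_ref)

# Goal: cut DAC 1 (nominally 1.6 V) to match DAC 2 (0.8 V).
true_cut = level_diff_db(0.8, 1.6)           # 6.02 dB if both readings are true
print(f"Intended cut: {true_cut:.2f} dB")

# If the meter reads ~3% low at the higher voltage, you compute the wrong cut:
reported_high = 1.6 * 0.97                   # meter shows 1.552 V instead of 1.6 V
wrong_cut = level_diff_db(0.8, reported_high)
print(f"Cut you would apply: {wrong_cut:.2f} dB")
print(f"Residual mismatch: {true_cut - wrong_cut:.2f} dB")   # ~0.26 dB
```

Which is exactly why re-checking both sources at the same voltage after the adjustment catches the error.
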
Remember that for a level match, you just don't care whether the actual level is (say) 0.800 V or 0.810 V; all you care about is that both read the same.

> No. You need 0.5% precision, which again is trivial; even the $10 Harbor Freight DMMs will give you that.

Yes, I agree, but precision isn't generally something that's in any spec sheet of a DMM, so accuracy as a proxy will do.

> Yes, I agree, but precision isn't generally something that's in any spec sheet of a DMM.

No, because it's basically built in. If you have a five-digit meter, the precision will inherently be better than 0.1%.

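To put a rough number on "basically built in" (assuming, just as an example, a meter that displays 0.80000 V; the exact count spec doesn't change the conclusion much):

```python
# Last-digit resolution of a hypothetical five-digit reading of 0.80000 V:
resolution = 0.00001 / 0.80000
print(f"{resolution:.5%}")   # 0.00125% of reading -- far finer than the 0.1% needed
```
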
> With a cheap meter I would also choose to use a fairly low frequency, below 400 Hz; 50 Hz or 60 Hz maybe? That's often what they are built to measure.

Normally, anything up to a kHz or two is fine, remembering that even if the meter rolls off and gives an inaccurate number, the repeatability will still be more than good enough.

> Anyone have other ideas for a stable, easy-to-use signal you just read with your phone?

Your phone would have to be super stable for this to work. A deviation of 2.5 cm from the source would already be the 0.1 dB difference in gain that is needed for level matching. And even then you'll need some long averaging, and you then have no way of knowing whether the differences come from some room effect or from a gain difference. It would need several runs at random times to get this done. Just use the f*cking DMM and save yourself all the trouble.

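For context, the 2.5 cm figure follows straight from the inverse-square law for a point source in free field; a rough sketch (the 2.2 m listening distance is just an assumed example, and a real room with reflections will behave even less predictably):

```python
# Free-field point source: level changes by 20*log10(d1/d2) when you move from d1 to d2,
# so a 0.1 dB change corresponds to a distance ratio of 10**(0.1/20) ~= 1.0116.
listening_distance = 2.2                          # metres (assumed example)
closer = listening_distance / 10 ** (0.1 / 20)    # position where the level is +0.1 dB
print(f"{(listening_distance - closer) * 100:.1f} cm")   # ~2.5 cm
```
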
> No, because it's basically built in. If you have a five-digit meter, the precision will inherently be better than 0.1%.

That's the whole point, isn't it?

> No. You need 0.5% precision, which again is trivial; even the $10 Harbor Freight DMMs will give you that.

I have a cheap one that has this deviation (and larger) measuring the same thing after 2 seconds.

> Won't that always be the case? Should hearing be any different than our other senses? Just like two people will experience a different flavor profile when drinking, for example, from the same bottle of fine wine. I'm definitely a "believer" when it comes to ASR and the need for objective measurement, but I'm not blind to the reality that all sound must be processed through our inherently subjective human brains. That's why even someone like Amir, with his wealth of experience and knowledge, still subjectively skews listening results without an objective measurement of same. If anything, we're too prone in ASR (guilty here) of eschewing subjectivity as irrelevant rather too quickly.

The issue here, though, is comparing two different bottles of wine, not different people drinking the same bottle.

> Your phone would have to be super stable for this to work. A deviation of 2.5 cm from the source would already be the 0.1 dB difference in gain that is needed for level matching. …

I wonder about the precision of measuring with my interface and REW or Multitone, which is what I do for convenience (I mean electrically, calibrated against the DMM, which is also calibrated).

This brings us to another point: equally, for the blind test itself to work, you cannot move more than 2.5 cm while performing it, otherwise the gains won't match either. Never mind different reflections and other room effects that can be highly localized. All of these things just show how silly it is to use speakers to try to perceive differences that are this tiny.

That's the whole point, isn't it?

> I wonder about the precision of measuring with my interface and REW or Multitone

Do you mean using your interface instead of the DMM? That should be perfectly fine as long as you don't touch any analog or digital settings.

> You engineers are spoiled by your nice gear!

I have some nice meters, but for stuff like this, I use a $10 special from Harbor Freight.

> I have a cheap one that has this deviation (and larger) measuring the same thing after 2 seconds.

While obviously we cannot know what someone will use, I'd throw such a meter in the trash bin.

It also deviates when applying different pressure to the leads. Now imagine that I have to measure 3–4 frequency points on one piece of dubious gear and the same on the other one. 0.5% precision looks like utopia to me with the likes of that one.

You engineers are spoiled by your nice gear!