
SSSheridan

Member
Joined
Mar 3, 2021
Messages
6
Likes
3
Hey all! I've just discovered this forum and am eager to dive into some measurements on my own equipment.

Only problem: I can't find any tutorials - I can't even find what software y'all use! I've searched around a bit but can't find a thing - I'd be grateful if someone could link me to something to get me started.

What I want to do is measure my DACs, pre-amps, power amps, ground loop isolators, etc., preferably through my sound card's line in. I don't need high precision - I'm only after a pleasant-quality, low-noise setup on a budget.

Any pointers would be greatly appreciated!
 

Helicopter

Major Contributor
Forum Donor
Joined
Aug 13, 2020
Messages
2,650
Likes
3,777
Location
Michigan
What's your budget?

I'd get an audio interface like a Focusrite or MOTU, or just use a sound card; maybe a measurement microphone, one of the calibrated ones for about $70; and then download some free software like Equalizer APO (with the Peace GUI) and Audacity. For dummy loads, you can use big power resistors like the ones Parts Express sells, or improvise with stove coils, irons, water heater elements, etc. Add a multimeter and that'll get you started. You'll probably want a soldering station to put everything together. I do everything with my Focusrite Scarlett 2i2 now. The iOS AudioTools app is pretty good for the $10 or so it costs.
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
2,720
Likes
6,817
Location
Suffolk UK
The best all-round software I've found is REW. It can be used both for measuring electronics and for making in-room and pseudo-anechoic loudspeaker measurements. Other useful software includes ARTA and Soundcard Scope. These are all free, and for home use lack nothing that's worth paying for.

For measuring electronics you'll need a dummy load, as mentioned above, plus an adjustable attenuator, which doesn't need to be anything fancier than a 10k log pot wired across the dummy load so you can adjust the level going into your sound-card or interface.

A multimeter is essential, but many are only accurate at 50/60Hz mains frequencies and don't go much above a few hundred Hz. Try to find one that's accurate up to 1kHz, as that makes calibrating your sound-card easier. If you can't get one, then you'll have to calibrate the sound card at 100Hz, which should be Good Enough for Jazz. Calibrating a sound card isn't very convenient, so you may want to buy a dedicated audio millivoltmeter in due course.

REW includes an oscilloscope function so you can monitor for clipping, but with all soundcard-based 'scopes you're limited by the maximum sample rate of the audio card/interface you're using, which limits the frequency range you can measure. If you're going to do repair or calibration work, a dedicated oscilloscope that goes up to, say, 10MHz (higher does no harm), preferably dual-trace, will be very useful.

If you're going to mess around with turntables or tape machines, then a Wow & Flutter meter will be useful, and you can get, again free, wfgui 8.6 which will do what's necessary without needing a physical meter.

Other things will suggest themselves as you get further into it, but for a few hundred £$€, you should be able to equip a basic audio workbench, less if you have a decent audio card or interface.

S.
 
OP

SSSheridan

Member
Joined
Mar 3, 2021
Messages
6
Likes
3
Thanks for both your replies! Turns out I've recently got a lot of the hardware you mentioned - log pots, multimeter, etc. - on the intuition that I'd need them soon. My soldering octopus just arrived today :cool:

My budget is "as low as possible, but not lower" - as a student, money is tight, but I've ended up wasting money (not to mention time) on junk. My target is that elbow in the price-performance curve where it turns the corner toward diminishing returns.

All that being said: could I get away with a $50 Behringer interface, or is that in the "might as well use your sound card" realm? Basically, my uncertainty here is whether a sound card or budget interface would just give me larger error bars on my measurements, or if I would be inviting fresh new headaches and frustrations into my life.

(Right now I'm rocking a USB Creative X-Fi HD that I'm going to flip on eBay as soon as I've replaced it - from what I've seen on this site, that Apple dongle is an upgrade.)
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
2,720
Likes
6,817
Location
Suffolk UK
I've tried the Behringer interface, and it's not really suitable for measurements. The DAC (output) side is fine, but the ADC (input) side lets it down badly. Distortion anywhere near 0dBFS is far, far higher than anything you would want to measure, and if you keep levels below -10dBFS, noise limits how low you can go. As a recording interface it's OK, given that one should leave some 10dB of headroom on live recordings anyway, but for measurements one needs as much dynamic range as possible, since whatever you're measuring these days is likely to be pretty good.

You would be a lot better off trying to find a used high quality interface. I use an old Lexicon I-Onix U22, which whilst not SOTA by any means, is good enough, and doesn't distort any more at 0dBFS than lower down. Downside is that I have to use it on a Windows 7 laptop as there are no W10 drivers available.

Try your X-Fi card, set up a loopback and run REW output to input. That should tell you if distortion and noise are low enough. If they are, then no need to buy anything else for the moment.

S.
 
OP

SSSheridan

Member
Joined
Mar 3, 2021
Messages
6
Likes
3
Huh - maybe I'm missing something, but if it's good enough for recording, shouldn't it be good enough to measure audible noise/distortion? As I'm only testing for personal listening use (for now, anyway), I'm okay with sub-audible flaws being lost in the noise.

one needs as much dynamic range as possible, given that what you're going to be measuring these days is likely to be pretty good.

Ah-ha, well, possibly not the 1.83€ "24W" power amp from Aliexpress. (Okay, I bought that one out of curiosity).

For the loopback, should I put a resistor/pot in the line? Should I run the line directly from out to in, or should I run it to the usual load, and connect the input in parallel? Do you know where I could find tutorials, so I don't have to bother you with these questions?
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
2,720
Likes
6,817
Location
Suffolk UK
Huh - maybe I'm missing something, but if it's good enough for recording, shouldn't it be good enough to measure audible noise/distortion? As I'm only testing for personal listening use (for now, anyway), I'm okay with sub-audible flaws being lost in the noise.



Ah-ha, well, possibly not the 1.83€ "24W" power amp from Aliexpress. (Okay, I bought that one out of curiosity).

For the loopback, should I put a resistor/pot in the line? Should I run the line directly from out to in, or should I run it to the usual load, and connect the input in parallel? Do you know where I could find tutorials, so I don't have to bother you with these questions?
By Good Enough for live recording I meant that it won't be audibly limited for home studio use. For measurements, there's a rule of thumb that the measuring tool should be 10x or 20dB better than what you're trying to measure. With the Behringer unit, I was never sure if, say, 0.3% distortion was the amp or the interface. With an interface that has, say, 0.005% or less distortion, then whether the limit is the amp or the interface, it doesn't matter, it's Good Enough for home use.
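The arithmetic behind that 10x/20dB rule of thumb can be sketched in a few lines of Python. This assumes the amplifier's and interface's distortion products are uncorrelated, so they combine roughly as a root-sum-of-squares; the percentages are illustrative, not measurements:

```python
import math

def combined_thd(thd_dut: float, thd_interface: float) -> float:
    """Approximate measured THD (%) when uncorrelated distortion
    contributions add as a root-sum-of-squares (RSS)."""
    return math.sqrt(thd_dut**2 + thd_interface**2)

# Amp at 0.1% THD measured through an interface 20dB (10x) better:
good = combined_thd(0.1, 0.01)   # ~0.1005%: error well under 1%
# Same amp measured through an interface that itself sits at 0.3%:
bad = combined_thd(0.1, 0.3)     # ~0.316%: the interface dominates
print(f"clean interface: {good:.4f}%  poor interface: {bad:.4f}%")
```

With the clean interface the reading is within half a percent of the amp's true figure; with the 0.3% interface you are mostly measuring the interface.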

As to doing a loopback test: if the output voltage at 0dBFS exceeds what the input can take, then you'll need the attenuator; otherwise just connect output to input. Don't worry about loading on a loopback test, as the input presents a high-impedance load that the output will have no problem driving. You should be able to get some sensible measurements from REW using your existing card. Whether they're good enough depends on what the numbers show and your needs.

As to tutorials, I don't know of any, but then I don't do YouTube; there may be some there. In any event, don't worry about asking here. Most of us are happy to help.

S
 
OP

SSSheridan

Member
Joined
Mar 3, 2021
Messages
6
Likes
3
Wonderful. I wish I'd found this forum a month ago.

For measurements, there's a rule of thumb that the measuring tool should be 10x or 20dB better than what you're trying to measure.

Well, that makes perfect sense. I've set up an alert on my ebay-kleinanzeigen (Germany's Craigslist equivalent) for an interface.

With the Behringer unit, I was never sure if, say, 0.3% distortion was the amp or the interface.

For crude usage, could one just "calibrate" for that? 0.3% = Okay Or Better, >0.3% = Worse Than Okay. In other words: would a low-quality ADC be at least consistent with its distortion? Or would that 0.3% vary depending on the device being tested? (I recognize that if my purpose is to find definitely-audible distortion, I could just do a listening test, but that's more exhausting and less fun :) not to mention harder to put in a spreadsheet.)

For the loopback, why would an attenuator need to be anything more than a good-quality resistor? Since the purpose is just to lower the voltage, I think? (FWIW: my background is biophysics, so round here I'm a naive sponge - ignorant but hopefully quick on the uptake.)

Edit: Wikipedia says that "Attenuators are usually passive devices made from simple voltage divider networks." So a pot or a pair of resistors would do?

Basic physics question: is the following correct? A resistor R1 in series with load R2 will lower the voltage experienced by R2 by a variable amount (R1/R2 = V2/V1?), whereas a voltage divider will remove a fixed fraction of the voltage?
 

solderdude

Grand Contributor
Joined
Jul 21, 2018
Messages
11,488
Likes
25,481
Location
The Neitherlands
You can use a volume pot at the input of the ADC. This way you can increase the maximum input voltage you can accept.
Or use resistor networks (with calibrated steps), but you need to make the resistance high enough that you don't load the device you want to test too much. For this you need to take the input resistance of the ADC into consideration. You also can't make it too high, or you may get roll-off in the highs.
This is essential when you want to measure headphone and power amps.
For power amps you will also need dummy loads that can handle the power.
You'll also need the multimeter to know what the actual level is.
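That trade-off (a divider high enough in resistance not to load the device under test, while still accounting for the ADC's own input resistance) can be put into numbers. The resistor and input-impedance values below are assumptions for illustration only:

```python
def divider_ratio(r_top: float, r_bottom: float, r_adc_in: float) -> float:
    """Attenuation ratio of a two-resistor divider whose bottom leg
    is loaded by the ADC's input resistance (parallel combination)."""
    r_eff = (r_bottom * r_adc_in) / (r_bottom + r_adc_in)  # R_bottom || R_in
    return r_eff / (r_top + r_eff)

# Nominal 10:1 divider (9k over 1k) feeding an assumed 10k ADC input:
nominal = 1_000 / (9_000 + 1_000)             # 0.100 ignoring the load
loaded = divider_ratio(9_000, 1_000, 10_000)  # ~0.092 once the ADC loads it
print(f"nominal: {nominal:.3f}  loaded: {loaded:.3f}")
```

A nominally 10:1 divider comes out nearer 11:1 once a 10k input loads it, which is one reason to measure the actual attenuation with a multimeter rather than trusting resistor values.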
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
2,720
Likes
6,817
Location
Suffolk UK
For crude usage, could one just "calibrate" for that? 0.3% = Okay Or Better, >0.3% = Worse Than Okay. In other words: would a low-quality ADC be at least consistent with its distortion? Or would that 0.3% vary depending on the device being tested? (I recognize that if my purpose is to find definitely-audible distortion, I could just do a listening test, but that's more exhausting and less fun :) not to mention harder to put in a spreadsheet.)

For the loopback, why would an attenuator need to be anything more than a good-quality resistor? Since the purpose is just to lower the voltage, I think? (FWIW: my background is biophysics, so round here I'm a naive sponge - ignorant but hopefully quick on the uptake.)

Edit: Wikipedia says that "Attenuators are usually passive devices made from simple voltage divider networks." So a pot or a pair of resistors would do?

Basic physics question: is the following correct? A resistor R1 in series with load R2 will lower the voltage experienced by R2 by a variable amount (R1/R2 = V2/V1?), whereas a voltage divider will remove a fixed fraction of the voltage?

With distortion as high as the Behringer ADC gave, it's not useful to calibrate around it. If it were 0.01%, then fine: that's still a bit high, but anything measuring worse than that would be the amplifier. If 0.01% is the measurement limit, as it is with most analogue distortion factor meters, then unless one's trying to characterise the amplifier fully, it's Good Enough to tell whether the amplifier is working correctly and whether its distortion is audible or not.

As to the loopback, you're correct. A pot is a pair of resistors where the junction between the two is variable. If your interface or sound card can put out more volts at 0dBFS than your input can take, then you'll need to attenuate the output.

In any event, you'll need one for testing power amps, as very few interfaces or sound cards can take more than perhaps 10V at most, which would limit you to testing power amps of no more than about 12 watts into 8 ohms.
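The voltage-to-power arithmetic behind that limit is just P = V²/R; a quick sanity check in Python:

```python
def amp_power_watts(v_rms: float, load_ohms: float) -> float:
    """Continuous sine power into a resistive load: P = Vrms^2 / R."""
    return v_rms**2 / load_ohms

def max_voltage(p_watts: float, load_ohms: float) -> float:
    """Vrms an amp produces at a given power: V = sqrt(P * R)."""
    return (p_watts * load_ohms) ** 0.5

print(amp_power_watts(10, 8))   # 12.5W: roughly the 12W/8-ohm ceiling above
print(max_voltage(100, 8))      # ~28.3Vrms: why a 100W amp needs an attenuator
```

10V RMS into 8 ohms is 12.5W, matching the ceiling above; a 100W/8-ohm amp swings about 28V RMS at full power, hence the attenuator.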

A normal pot is uncalibrated, so you'll need a way of measuring voltage. A multimeter might just be good enough, but generally I'd rather use a dedicated audio millivoltmeter.

S
 
OP

SSSheridan

Member
Joined
Mar 3, 2021
Messages
6
Likes
3
Great. My ideas are coming together. And I might buy a used Scarlett this evening :)

One thing, though: I'd like to have Bluetooth input as well, and the Scarlett only takes analog in (until you're up to the 4x4s or so). I'm undecided between separate BT receiver -> DAC -> interface, or just getting an interface that can take digital (or Bluetooth) in.

I know I can get a receiver with aptX HD -> S/PDIF for $20, and I suppose that a DAC for aptX-quality would be affordable nowadays, but then I'm buying two DACs instead of just one. Do you have any recommendations for a good-quality, affordable (Scarlett price range) interface with S/PDIF or Bluetooth input?

Taking a step back: am I using the wrong device if I'm planning to use an audio interface as my main audio hub? That is, for daily music-listening.
My setup is:
(1) inputs: sound card, Bluetooth receiver, aux in
(2) "hub": cable splitters to collect the inputs into a 1-in-4-out headphone amp (Behringer HA400)
(3) outputs: two pairs of powered speakers (different locations), subwoofer, tactile transducer, headphones
I know it's a bit of a jury-rigged version of a proper audio receiver/interface, but it works, and seems to sound good when I can fend off the ground loops. My idea is that I'll stick the interface in position 2, as the hub, with input from USB and Bluetooth/S/PDIF/aux, and output to be split and distributed to the various speakers. But I keep having a crisis of confidence where I think "I don't think an interface is the thing I'm supposed to be putting here!" Any chance you could either reassure me or else let me know of any red flags?

Edit: clarity.
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
2,720
Likes
6,817
Location
Suffolk UK
Great. My ideas are coming together. And I might buy a used Scarlett this evening :)

One thing, though: I'd like to have Bluetooth input as well, and the Scarlett only takes analog in (until you're up to the 4x4s or so). I'm undecided between separate BT receiver -> DAC -> interface, or just getting an interface that can take digital (or Bluetooth) in.

I know I can get a receiver with aptX HD -> S/PDIF for $20, and I suppose that a DAC for aptX-quality would be affordable nowadays, but then I'm buying two DACs instead of just one. Do you have any recommendations for a good-quality, affordable (Scarlett price range) interface with S/PDIF or Bluetooth input?

Taking a step back: am I using the wrong device if I'm planning to use an audio interface as my main audio hub? That is, for daily music-listening: collecting various inputs (computer/USB, Bluetooth, aux-in) and sending it out to my (powered) speakers and headphones? As a newbie, I honestly can't tell if that's a good idea, or if I should get a DAC/receiver/whatnot for the listening side of things, and a dedicated ADC for the other half of the measurement. (I'm currently using a 1-in-4-out headphone amp (Behringer HA400), along with cable splitters, to distribute my signal to speakers, subwoofer, and headphones.)

Can't help with the Bluetooth or S/PDIF inputs, as most ADC/DAC interfaces these days don't have them. I have an old Digigram card that has S/PDIF and balanced analogue, but it's so old now I can only use it with a similarly old Windows XP laptop!

There's nothing wrong at all with using the ADC/DAC interface for normal listening, as the DAC part tends to be quite good. It's the ADC part that lets many of these down. The interface should also be good for headphones unless you need very high levels. Dedicated ADCs are quite rare outside of high-end Pro units like Prism Sound or Apogee (both of these also make some very nice ADC/DAC interfaces, but at high prices).

S
 

Thalis

Senior Member
Joined
Sep 2, 2020
Messages
354
Likes
214
sorry to barge in........... just to ask a quick question....... to measure output level with a MM...... say 2V....... I need to poke the RCA out + and -......... but need a MM that can handle up to 1kHz?
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
2,720
Likes
6,817
Location
Suffolk UK
sorry to barge in........... just to ask a quick question....... to measure output level with a MM...... say 2V....... I need to poke the RCA out + and -......... but need a MM that can handle up to 1kHz?
Most MMs are designed for 50/60Hz mains, not the full range of audio frequencies, so their accuracy at higher frequencies is suspect. Most MMs will specify their accurate range: some only a few hundred Hz, a few go higher. That's not too much of a problem if you're making comparative measurements, i.e. A is equal to B, as the error will most likely be the same for A and B, but if you need to know how many volts, then you need something that's specified for the full audio range, ideally up to 100kHz or so.

Also, MMs aren't designed for measuring down to a few millivolts, so for measuring audio noise you need something that will go down to at the very least 1mV full scale over the full audio range. Ideally, something that goes down to 15uV (-100dBu) over a 100kHz bandwidth would be best. I have a now 40-year-old Levell TM3A which is still working well down to its -100dBu (15uV) fsd range.

MMs have their uses, and indeed are essential for working on electronics, but they aren't the right tool for making audio measurements.

S.
 

Thalis

Senior Member
Joined
Sep 2, 2020
Messages
354
Likes
214
Most MMs are designed for 50/60Hz mains, not the full range of audio frequencies, so their accuracy at higher frequencies is suspect. Most MMs will specify their accurate range: some only a few hundred Hz, a few go higher. That's not too much of a problem if you're making comparative measurements, i.e. A is equal to B, as the error will most likely be the same for A and B, but if you need to know how many volts, then you need something that's specified for the full audio range, ideally up to 100kHz or so.


S.

Thanks for your reply.

Ok, so I assume both my cheap MMs only do 50/60Hz. I only want to measure the output of my USB mixer to set it to 2V (the output is variable) so I can feed its output to an amp. Judging output by ear is just not cutting it. So, in that case, what signal should I get Foobar to generate? Would it be accurate enough if I get Foobar to output, say, 50Hz at 24/96?

Maybe next time I will invest in a better MM.
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
2,720
Likes
6,817
Location
Suffolk UK
Thanks for your reply.

Ok, so I assume both my cheap MMs only do 50/60Hz. I only want to measure the output of my USB mixer to set it to 2V (the output is variable) so I can feed its output to an amp. Judging output by ear is just not cutting it. So, in that case, what signal should I get Foobar to generate? Would it be accurate enough if I get Foobar to output, say, 50Hz at 24/96?

Maybe next time I will invest in a better MM.
If you generate 50 or 100Hz at 0dBFS and use your MM to measure 2V, that'll be fine. You don't need 24/96; pretty much any bit depth and sample rate will do for this purpose.
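If you'd rather not rely on Foobar, a calibration tone is easy to make yourself. This sketch writes ten seconds of a 100Hz sine at 0dBFS to a 16-bit mono WAV using only the Python standard library (the filename and rate are arbitrary choices):

```python
import math
import struct
import wave

RATE = 44_100     # sample rate; any common rate will do for this purpose
FREQ = 100        # 100Hz sits inside most multimeters' accurate range
SECONDS = 10

# Full-scale (0dBFS) sine, scaled to the 16-bit signed integer range
samples = (
    int(32767 * math.sin(2 * math.pi * FREQ * n / RATE))
    for n in range(RATE * SECONDS)
)
frames = b"".join(struct.pack("<h", s) for s in samples)

with wave.open("cal_100hz_0dbfs.wav", "wb") as w:
    w.setnchannels(1)   # mono
    w.setsampwidth(2)   # 16-bit
    w.setframerate(RATE)
    w.writeframes(frames)
```

Play the file with all volume controls at maximum and any EQ/DSP bypassed, then trim the mixer's output control until the multimeter reads 2V.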

S.
 
OP

SSSheridan

Member
Joined
Mar 3, 2021
Messages
6
Likes
3
Can't help with the Bluetooth or S-PDIF inputs, as most ADC/DAC interfaces these days don't have them.
...
There's nothing wrong at all with using the ADC/DAC interface for normal listening, as the DAC part tends to be quite good.

This honestly saved me probably hours of doubting myself and looking around for a "doing it all" box. I'm more than happy to continue jury-rigging :)

Regarding the millivoltmeter, I'm going to try to use an Arduino module for that (just as soon as I figure out how to use this Arduino....). No idea if the accuracy and precision will be sufficient (for 2€, I'd be impressed), but I love the idea of seeing the readout plotted in realtime on the computer. I'll learn something in any case ¯\_(ツ)_/¯

Is there a convenient way to generate millivoltages of known strength to calibrate the thing? I'm thinking wall wart/battery + voltage divider (and measuring the resistors' actual values, as opposed to nominal). Then at least the millivoltage will be as accurate as my multimeter is.
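For what it's worth, the wall-wart/battery-plus-divider arithmetic looks like this; the source voltage and resistor values below are hypothetical "measured" figures, purely for illustration:

```python
def divider_output_mv(v_in: float, r_top: float, r_bottom: float) -> float:
    """Millivolts across the bottom resistor of an unloaded divider."""
    return 1000 * v_in * r_bottom / (r_top + r_bottom)

# Example: a 1.5V battery into nominal 100k over 100-ohm resistors,
# using multimeter-measured values rather than the nominal ones:
v_battery = 1.498          # measured, not the nominal 1.5V
r1, r2 = 99_700.0, 99.6    # measured values of the "100k" and "100R" parts
print(f"{divider_output_mv(v_battery, r1, r2):.3f} mV")
```

As you say, the result is only as accurate as the multimeter readings that went into it, but a roughly 1.5mV source from a 1000:1 divider is plenty for a first calibration point.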
 

Thalis

Senior Member
Joined
Sep 2, 2020
Messages
354
Likes
214
If you generate 50 or 100Hz at 0dBFS and use your MM to measure 2V, that'll be fine. You don't need 24/96; pretty much any bit depth and sample rate will do for this purpose.

S.


alrighty..... will try that. So I poke red to + and black to - on each RCA to check voltage as I adjust the level?

Edit... I remove the RCA cables first, yes?
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
2,720
Likes
6,817
Location
Suffolk UK
alrighty..... will try that. So I poke red to + and black to - on each RCA to check voltage as I adjust the level?

Edit... I remove the RCA cables first, yes?
Shouldn't make any difference, but you may find it easier to use a cable from the output and connect your multimeter to the RCA plug rather than grovel inside the socket. Oh, and as the signal is AC, it doesn't matter which pin is + or -; either will give the same reading.

S.
 