
Reference levels on an audio interface - what am I mixing up?

madr

New Member
I am currently trying to understand the reference levels of my audio interface. I use a Focusrite Scarlett 4i4 3rd gen, which has the following reference levels:

                 Output (balanced)   Line Input   Inst. Input
Max level        +15.5 dBu           +22 dBu      +12.5 dBu
Impedance        430 Ohm             60 kOhm      1.5 MOhm

If I have understood correctly, these figures always refer to the RMS value. I now want to check whether they can be confirmed with a few simple measurements. As a test signal, because it is easy to work with, I use a 1 kHz sine wave with an amplitude of 1 (i.e. 0 dBFS for the amplitude) and thus an RMS level of -3 dBFS.

If I now want to calculate which voltage value is present at the output of my sound card for such a signal and also the input level for a loopback setup, I would proceed as follows:

If I connect a balanced cable to my output and measure the signal with an oscilloscope, it should have a level of 15.5 dBu - 3 dB = 12.5 dBu (= 3.27 Vrms). If I now loop the signal back into the INST input, the reference voltage is 12.5 dBu. That means a 12.5 dBu input voltage translates into 0 dBFS in the digital domain. But when I do the testing I get -9 dBFS at my input.

  • So the first question is: what am I missing? Where does my math not fit? Can someone please give me a detailed example of the "way" from the digital to the analog domain and from analog back to digital?
Another thing that I don't understand is balanced vs. unbalanced. I mean, I get the idea behind it, but I also struggle with the levels here. If I do the same loopback setup as above and use the same output signal at -3 dBFS, I receive a -12 dBFS signal at my input. But that's not what I would expect. I thought unbalanced is the same signal as balanced but divided by two, which would (at least in my logic) result in an RMS level 6 dB lower than balanced. That means at my input I'd expect a level that is 6 dB lower than the balanced one.
  • So the second question is: what am I missing again? :D Why don't the levels differ by 6 dB?
 
Instrument inputs are not balanced; I think that is part of your issue. You are grounding one side of the balanced output, leaving only one leg against ground. That would reduce your expectation by 6 dB.

You should be able to use a lower-frequency tone, say 100 Hz, which any decent multimeter can measure. Then compare what you measure against the dBFS level you end up with to see what your device actually does.
 
If I have understood correctly, these details always refer to the RMS value. I now want to check whether these details can be confirmed using a few simple measurements. For this, because it is easy to use, I have a sine wave at 1 kHz with an amplitude of 1 (i.e. 0 dBFS for the amplitude) and thus an RMS of -3 dBFS as a test signal.
Nonono. Here a full-scale sine = 0 dBFS, and a full-scale rect would be at +3 dBFS. It's not like the Audacity peak level meter of yore.
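If numbers help, here's that convention in a few lines of Python (a minimal sketch; sine-referenced RMS dBFS, and `dbfs_rms` is just a helper name of mine):

```python
import math

def dbfs_rms(rms, ref_rms=1 / math.sqrt(2)):
    """RMS level in dBFS, referenced to a full-scale sine (peak 1.0)."""
    return 20 * math.log10(rms / ref_rms)

sine_rms = 1 / math.sqrt(2)  # full-scale sine: peak 1.0, RMS 1/sqrt(2)
square_rms = 1.0             # full-scale square wave: RMS equals the peak

print(dbfs_rms(sine_rms))    # 0.0 -> full-scale sine sits at 0 dBFS
print(dbfs_rms(square_rms))  # ~3.01 -> full-scale square lands at +3 dBFS
```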

If you were to connect a multimeter to hot and cold at -3 dBFS, you would in fact see about 3.27 Vrms, assuming either a TrueRMS unit or a decent mV AC range and a ca. 50-60 Hz generator frequency (accuracy tends to be decent into the low 100s of Hz, but these more basic units are optimized for mains frequencies). You would see 1.63 Vrms (+6.5 dBu) between either hot or cold and shield due to the type of output stage.
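For anyone who wants to check that arithmetic, a quick Python sketch (the helper names are mine; 0 dBu = sqrt(0.6) V, and a symmetric output driver is assumed):

```python
import math

DBU_REF = math.sqrt(0.6)  # 0 dBu = 0.7746 Vrms (1 mW into 600 ohms)

def dbu_to_vrms(dbu):
    return DBU_REF * 10 ** (dbu / 20)

def vrms_to_dbu(vrms):
    return 20 * math.log10(vrms / DBU_REF)

# A -3 dBFS sine from the +15.5 dBu balanced output:
diff = dbu_to_vrms(15.5 - 3)  # hot-to-cold (differential)
leg = diff / 2                # hot (or cold) to shield, symmetric driver
print(round(diff, 2))              # 3.27 Vrms
print(round(leg, 2))               # 1.63 Vrms
print(round(vrms_to_dbu(leg), 1))  # 6.5 dBu
```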
If I connect a balanced cable to my output and measure the signal with an oscilloscope,
Which is arguably easier said than done unless you happen to have a differential probe. You would have to split the signal between two inputs (hot #1, cold #2), invert #2 and sum them, and make sure you turn off the 50 ohm termination.

You cannot generally measure a non-floating balanced signal using just one input of a typical (mains earthed) oscilloscope. One half of the output stage will effectively be shorted to ground through the computer's mains earth. The multimeter, being battery-operated, well-insulated and floating, poses no such problems.
If I now loop the signal back into the INST input, the reference voltage is 12.5 dBu. That means a 12.5 dBu input voltage translates into 0 dBFS in the digital domain. But when I do the testing I get -9 dBFS at my input.
1. You're making it unusually hard on yourself by going for the INST input, which I'm pretty sure is unbalanced. At that point you should normally be seeing about (+12.5 dBu - 6 dB) - 12.5 dBu = -6 dBFS at max output / min input.
2. Assuming you were in LINE input mode instead, you'd be looking at +12.5 dBu - 22 dBu = -9.5 dBFS... I'd put my bets on that being the case here.
3. On the (rear) fixed line-ins, you'd be seeing +12.5 dBu - 18 dBu = -5.5 dBFS.
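All three cases in one quick sketch (the reference levels are the values assumed above, so treat them as assumptions, not gospel):

```python
signal_dbu = 15.5 - 3  # +12.5 dBu: the -3 dBFS sine from the +15.5 dBu output

# (assumed full-scale reference in dBu, extra loss in dB when unbalanced)
inputs = {
    "INST (unbalanced)":  (12.5, -6),  # one output leg lost to ground
    "LINE (combo jack)":  (22.0, 0),
    "fixed rear line-in": (18.0, 0),
}

expected = {name: signal_dbu + loss - ref
            for name, (ref, loss) in inputs.items()}
for name, dbfs in expected.items():
    print(f"{name}: {dbfs:+.1f} dBFS")
# INST: -6.0, LINE: -9.5, fixed rear line-in: -5.5
```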
Another thing that I don't understand is balanced vs. unbalanced. I mean, I get the idea behind it, but I also struggle with the levels here. If I do the same loopback setup as above and use the same output signal at -3 dBFS, I receive a -12 dBFS signal at my input.
So what exactly did you do differently: use an instrument cable instead of a balanced patch cable?
 
Thank you @rcstevensonaz, @AnalogSteph and @Blumlein 88 for your very useful replies.
A really well written and very enjoyable read that covers a lot of details about sound levels: All About Decibels, Part I: What’s Your dB IQ?
Really a good article. Also helped to understand why on earth one would prefer 0.775 V as a reference over 1 V.
Nonono. Here a full-scale sine = 0 dBFS, and a full-scale rect would be at +3 dBFS. It's not like the Audacity peak level meter of yore.
This.
You cannot generally measure a non-floating balanced signal using just one input of a typical (mains earthed) oscilloscope. One half of the output stage will effectively be shorted to ground through the computer's mains earth. The multimeter, being battery-operated, well-insulated and floating, poses no such problems.
I did this and received way more reasonable results - thanks for the advice.
So what exactly did you do differently: use an instrument cable instead of a balanced patch cable?
Yes - I have one stereo (balanced) cable and one mono (unbalanced) cable with 6.35 mm jacks. I then switch them between two loopback setups, with Line Out 1 connected to Line (or Inst) In 1.

I have now followed all of your advice and think I know what my mistake was: I confused amplitude and RMS. I assumed the 0 dBFS figure referred to the RMS value of the output signal, although it actually refers to the instantaneous (peak) value of a signal without clipping. If I now measure everything again with this knowledge, it makes sense:

Outputs​

A 0 dBFS sine at 100 Hz (i.e. v(t) = 1 * sin(2pi * 100 * t)) leads to the following voltage levels:
Tip2Sleeve -> Vrms = 2.38 V | 9.8 dBu
Ring2Sleeve -> Vrms = 2.38 V | 9.8 dBu
Tip2Ring -> Vrms = 4.76 V | 15.8 dBu
Theoretically, I could also read the difference between balanced and unbalanced from this, but for my own peace of mind I confirmed the measurement again with an unbalanced cable, and here too I got the expected 9.8 dBu.
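A quick sanity check of those numbers in Python (same 0.775 V dBu reference as discussed earlier in the thread; `vrms_to_dbu` is just my helper name):

```python
import math

def vrms_to_dbu(vrms):
    return 20 * math.log10(vrms / math.sqrt(0.6))  # 0 dBu = 0.7746 Vrms

print(round(vrms_to_dbu(2.38), 1))  # 9.8  (tip-to-sleeve / ring-to-sleeve)
print(round(vrms_to_dbu(4.76), 1))  # 15.8 (tip-to-ring, differential)
```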

Inputs​

When I loop back the signal described above I should get a level of 15.5 dBu - 22 dBu = -6.5 dBFS, and I could verify this after some troubleshooting: For simple measurement setups like this I often use Audacity because I like its simplicity. Since I only wanted to look at one channel, I did mono recordings. Well, it turns out Audacity divides the captured signal by two when recording mono. So the whole time I had an additional -6 dB that I just couldn't explain.
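That -6 dB is exactly what a simple two-channel average does to a signal present on only one channel, assuming that's how the mono downmix works:

```python
import math

left = 1.0   # channel carrying the loopback signal
right = 0.0  # the other channel is silent
mono = (left + right) / 2  # assumed simple-average mono downmix

print(round(20 * math.log10(mono / left), 2))  # -6.02
```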

Remaining Question​

The only thing that doesn't quite add up for me is the following: I now know that for the line input, the level calculation can be done like this

Line Input Level [dBFS] = Line Output Level [dBu] - Input Reference [dBu] (- 6 dB if the cable is unbalanced) = Line Output Level [dBu] - 22 dBu (- 6 dB if the cable is unbalanced)
-> Example: 0 dBFS Line Output Ch1 to Line Input Ch1 -> 15.5 dBu - 22 dBu = -6.5 dBFS via balanced | -12.5 dBFS via unbalanced
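That formula as a small Python helper (the -6 dB unbalanced penalty and the default reference values are the assumptions from this thread, not anything official):

```python
def line_input_dbfs(output_dbfs, output_ref_dbu=15.5,
                    input_ref_dbu=22.0, balanced=True):
    """Expected loopback reading per the formula above; the -6 dB
    unbalanced penalty assumes one output leg gets grounded."""
    level_dbu = output_ref_dbu + output_dbfs
    if not balanced:
        level_dbu -= 6
    return level_dbu - input_ref_dbu

print(line_input_dbfs(0))                  # -6.5
print(line_input_dbfs(0, balanced=False))  # -12.5
```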

I thought, okay, for the Inst. Input I just have to change the input reference value and take into account that INST doesn't accept TRS but only TS jacks; at least that's how I would interpret the manual:
Select INST from Focusrite Control (‘INST’ illuminates red) if you are connecting musical instrument, e.g., a guitar in the example, via a TS guitar jack. Deselect INST if you are connecting a line level source such as a keyboard, synthesiser or the balanced output of an external audio mixer via a TRS jack. The Combo connectors accept both TRS and TS types of jack plug for line level sources.
Since INST only accepts TS, I would've expected that, independent of balanced or unbalanced cable, I'd receive the following level for a -6 dBFS output signal:

-> -6 dBFS Line Output Ch1 to INST Input Ch1 -> 15.5 dBu - 6 dB - 12.5 dBu = -3 dBFS

But when I measure this I receive -9 dBFS with an unbalanced cable and -6.5 dBFS with a balanced cable... Wrong calculation on my side?
 
Well, it turns out Audacity divides the captured signal by two when recording mono.
This is what happens if you average two channels, one with signal and the second with none.

I have a habit of recording in stereo, splitting the tracks and deleting the empty one, maybe for this particular reason.

Note, make sure you're using WASAPI if you want >16-bit I/O on a Windows machine, otherwise it takes a custom build with ASIO support compiled in.

But when I measure this I receive -9 dBFS with an unbalanced cable and -6.5 dBFS with a balanced cable... Wrong calculation on my side?
I cannot quite explain that either, but suspect that
a) some of our assumptions re: the inputs may not be correct (e.g. ring on the input jack may not be hard-grounded in INST mode but rather via a few hundred ohms)
and
b) input clipping may be reached a good bit below 0 dBFS.

It is pretty well-known that these units (as well as many other similar interfaces) run on +/-5 V, so the very most you could hope for with an unbalanced JFET buffer is a bit short of 10 Vpp. +12.5 dBu translates to 9.24 Vpp, bingo. In this case you may never see more than -9.5 dBFS at minimum input gain. Which, mind you, doesn't jibe with any of your results. Some reverse-engineering may be required to solve this particular mystery.
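For the record, the Vpp arithmetic (sine assumed; the helper name is mine):

```python
import math

def dbu_to_vpp(dbu):
    vrms = math.sqrt(0.6) * 10 ** (dbu / 20)  # 0 dBu = 0.7746 Vrms
    return vrms * 2 * math.sqrt(2)            # sine: Vpp = 2*sqrt(2)*Vrms

print(round(dbu_to_vpp(12.5), 2))  # ~9.24 Vpp, just under a ~10 Vpp rail limit
```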
 
I have a habit of recording in stereo, splitting the tracks and deleting the empty one, maybe for this particular reason.
Yes, that's what I also ended up doing. It's a little annoying, but as long as it works...
Note, make sure you're using WASAPI if you want >16-bit I/O on a Windows machine, otherwise it takes a custom build with ASIO support compiled in.
Didn't know that; I always used the standard MME host and never really questioned it.
I cannot quite explain that either, but suspect that
a) some of our assumptions re: the inputs may not be correct (e.g. ring on the input jack may not be hard-grounded in INST mode but rather via a few hundred ohms)
and
b) input clipping may be reached a good bit below 0 dBFS.
Okay, I think for my future measurements I'll just use the Line Input then. But on the other hand, isn't a very high input impedance, like the INST input provides, desirable?
It is pretty well-known that these units (as well as many other similar interfaces) run on +/-5 V, so the very most you could hope for with an unbalanced JFET buffer is a bit short of 10 Vpp. +12.5 dBu translates to 9.24 Vpp, bingo. In this case you may never see more than -9.5 dBFS at minimum input gain. Which, mind you, doesn't jibe with any of your results. Some reverse-engineering may be required to solve this particular mystery.
Would be a first for me, but definitely something I'd consider doing. First, though, I have to calibrate my whole measurement chain...

Thank you very much - your answers helped me a lot. Do you perhaps know of good literature that covers the calibration of interfaces, amplifiers and sensors?
 
Not sure this is relevant, and I have not had time to wade through all of the detailed postings... but keep in mind:

1) Not all balanced output circuits put symmetric voltages on hot and cold. Using typical 2 V and 4 V reference levels, with the input to the balanced device at 2 V, the nominal balanced output could be any of the following: (+2 V, -2 V), (+2 V, 0 V), (+4 V, 0 V).

2) Not all balanced output circuits provide gain structure for full "professional audio" levels. Professional studio devices will typically run at the full output level, but "prosumer" devices will sit at a lower nominal gain.
 