
+4dbu or -10dbV

Zaki Ghul

Member
Joined
Sep 29, 2019
Messages
43
Likes
79
My system uses a dbx DriveRack PA2 with a +4 dBu/-10 dBV switch. I upgraded to a pre-amp that has a balanced XLR output rated at 2 V / 600 ohm. Do I stick with consumer-level -10 dBV, or do I switch? Appreciate the help.
 

LTig

Master Contributor
Forum Donor
Joined
Feb 27, 2019
Messages
5,760
Likes
9,442
Location
Europe
My system uses a dbx DriveRack PA2 with a +4 dBu/-10 dBV switch. I upgraded to a pre-amp that has a balanced XLR output rated at 2 V / 600 ohm. Do I stick with consumer-level -10 dBV, or do I switch? Appreciate the help.
Switch to +4 dBu.
 

RayDunzl

Grand Contributor
Central Scrutinizer
Joined
Mar 9, 2016
Messages
13,204
Likes
16,985
Location
Riverview FL
[attached image]


http://www.sengpielaudio.com/calculator-db-volt.htm
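
For reference, the conversions behind that calculator are simple enough to script; a minimal Python sketch, assuming the usual references of 0 dBu = 0.7746 V rms and 0 dBV = 1.000 V rms:

import math

DBU_REF = 0.7746  # V rms for 0 dBu (sqrt of 1 mW into 600 ohm)
DBV_REF = 1.000   # V rms for 0 dBV

def dbu_to_volts(dbu):
    return DBU_REF * 10 ** (dbu / 20)

def dbv_to_volts(dbv):
    return DBV_REF * 10 ** (dbv / 20)

def volts_to_dbu(volts):
    return 20 * math.log10(volts / DBU_REF)

print(f"+4 dBu  = {dbu_to_volts(4):.3f} V rms")    # ~1.228 V
print(f"-10 dBV = {dbv_to_volts(-10):.3f} V rms")  # ~0.316 V
print(f"2 V (the preamp's rated output) = {volts_to_dbu(2.0):+.1f} dBu")  # ~+8.2 dBu
# nominal gap between the two reference levels: ~11.8 dB
print(f"gap = {volts_to_dbu(dbu_to_volts(4)) - volts_to_dbu(dbv_to_volts(-10)):.1f} dB")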
 

twsecrest

Addicted to Fun and Learning
Joined
Nov 27, 2018
Messages
894
Likes
291
Location
California
+4 dBu for a balanced (XLR or TRS) connection and -10 dBV for an unbalanced (RCA or TS) connection.
 

RayDunzl

Grand Contributor
Central Scrutinizer
Joined
Mar 9, 2016
Messages
13,204
Likes
16,985
Location
Riverview FL
OP
Z

Zaki Ghul

Member
Joined
Sep 29, 2019
Messages
43
Likes
79
The unit might be quieter at -10. But if normal use has you clipping the meters, then you definitely want to go with +4.

Regards,
Wayne A. Pflughaupt

I'm a relative newbie, so bear with me. I have not noticed any clipping. I have observed two things since moving from RCA to XLR.
1. With RCA, at lower volumes (especially when watching TV) one of the horns would sometimes stop working; with the XLRs it is no longer a problem. I'm guessing the issue was in the dbx: since it's a pro unit and I'm listening at low volume on consumer equipment, it simply wasn't receiving enough signal to drive the horn amp adequately.
2. I have not noticed any clipping, but the horns pop whenever I power on the tuner, even though the preamp is turned all the way down. With -10 dBV it popped when I powered on the preamp and any source; with +4 dBu, only with the tuner.
 

AnalogSteph

Major Contributor
Joined
Nov 6, 2018
Messages
3,338
Likes
3,278
Location
.de
I would second the -10 dBV input setting, which accepts levels up to +7.7 dBu. If your preamp is a bit on the noisy side with plenty of low-distortion output, however (and your volume control never ever goes beyond the 10-11 o'clock position), the +4 dBu setting accepting up to +19.9 dBu may be more appropriate, which reduces effective gain in the DriveRack by 12 dB. (A lot of traditional hi-fi preamps are like that. -3 dBV sounds more like a Behringer 202HD though.) I am assuming that ADC input stage gain is being switched here, so this allows you to shift the 112 dB(A) of input ADC dynamic range up and down as needed.
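
To put numbers on that, here is a rough sketch; the +7.7 dBu and +19.9 dBu clip points are taken from the paragraph above, and 0 dBu = 0.7746 V is assumed:

import math

def volts_to_dbu(volts):
    return 20 * math.log10(volts / 0.7746)

preamp_max = volts_to_dbu(2.0)   # rated preamp output, ~ +8.2 dBu
clip_minus10 = 7.7               # dBu, clip point of the -10 dBV input setting
clip_plus4 = 19.9                # dBu, clip point of the +4 dBu input setting

print(f"preamp rated output: {preamp_max:+.1f} dBu")
print(f"headroom on -10 dBV setting: {clip_minus10 - preamp_max:+.1f} dB")  # ~ -0.5 dB
print(f"headroom on +4 dBu setting:  {clip_plus4 - preamp_max:+.1f} dB")    # ~ +11.7 dB
print(f"difference between settings: {clip_plus4 - clip_minus10:.1f} dB")   # ~12 dB

In other words, the preamp's full rated 2 V output would just about clip the -10 dBV setting, while everyday signal level depends on where the volume control actually sits; that is why the recommendation hinges on usage rather than the rating alone.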

I'm a relative newbie, so bear with me. I have not noticed any clipping. I have observed two things since moving from RCA to XLR.
1. With RCA, at lower volumes (especially when watching TV) one of the horns would sometimes stop working; with the XLRs it is no longer a problem. I'm guessing the issue was in the dbx: since it's a pro unit and I'm listening at low volume on consumer equipment, it simply wasn't receiving enough signal to drive the horn amp adequately.
That's not how that works. Sounds dodgy to me.
2. I have not noticed any clipping, but the horns pop whenever I power on the tuner, even though the preamp is turned all the way down. With -10 dBV it popped when I powered on the preamp and any source; with +4 dBu, only with the tuner.
This may be worth investigating further. The tuner sounds like something connected to an external antenna, usually grounded, so I suspect there may be a hard-to-avoid ground loop involving part of the preamp.

Could you give us a full rundown of the system? Of particular interest would be:
* speaker sensitivities and types
* types of power amps (and input level settings or external attenuators if any)
* type of preamp and sources

When I hear "horn", this gets my alarm bells ringing - there's plenty of opportunities for less than ideal setup. The DriveRack has one fixed output level of +20 dBu for 0 dBFS and 112 dB(A) worth of dynamic range below that. Unless you have extended output level requirements, you want to be setting up your treble power amp with just enough overall gain to be hitting ~110 dB SPL for +20 dBu. So if your SPL meter says you're playing a 90 dB (unweighted) signal, the "20" output level LED should be flashing or just about active. (Hearing protection may be advised. Also, AFS off.)

This is potentially going one step further than the gain staging discussion found on pp. 19-20 in the manual. They are aiming for clipping level on the amplifier, but something like a horn is likely to deliver >100 dB @ 1 W / 1 m, so a measly 10 W would be good for 110 dB @ 1 m already.

In fact, you may even want to be going so far as to be padding down a horn tweeter with a big ol' resistor in series. Those approaching or exceeding 110 dB sensitivity in particular may make it very hard to find a power amplifier with sufficiently low noise levels otherwise. It would generally tend to reduce amplifier loading and driver distortion levels to boot. I'd start with 10-22 ohms, 10-20 W or thereabouts. Not generally any more than needed to bring sensitivity in line with the bass though.
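
If it helps, the padding numbers above can be sanity-checked with a few lines of Python. This is a sketch only, assuming a flat 8 ohm resistive driver load, which a real compression driver is not - hence the advice to measure and correct afterwards:

import math

def pad_db(r_series, r_driver=8.0):
    # voltage attenuation of a series resistor into an (assumed) resistive driver
    return 20 * math.log10(r_driver / (r_driver + r_series))

for r in (10, 15, 22):
    print(f"{r:>2} ohm in series with 8 ohm: {pad_db(r):+.1f} dB")
# 10 ohm -> -7.0 dB, 15 ohm -> -9.2 dB, 22 ohm -> -11.5 dB

# Power rating check: with 10 W delivered into the 18 ohm (10 + 8) string,
# the resistor dissipates 10 * 10/18 ~ 5.6 W, hence the 10-20 W suggestion.
print(f"resistor dissipation at 10 W total: {10 * 10 / 18:.1f} W")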
 
OP
Z

Zaki Ghul

Member
Joined
Sep 29, 2019
Messages
43
Likes
79
I would second the -10 dBV input setting, which accepts levels up to +7.7 dBu. If your preamp is a bit on the noisy side with plenty of low-distortion output, however (and your volume control never ever goes beyond the 10-11 o'clock position), the +4 dBu setting accepting up to +19.9 dBu may be more appropriate, which reduces effective gain in the DriveRack by 12 dB. (A lot of traditional hi-fi preamps are like that. -3 dBV sounds more like a Behringer 202HD though.) I am assuming that ADC input stage gain is being switched here, so this allows you to shift the 112 dB(A) of input ADC dynamic range up and down as needed.


That's not how that works. Sounds dodgy to me.

This may be worth investigating further. Tuner sounds like something connected to an external antenna, usually grounded, so I suspect there may be a hardly-avoidable ground loop involving part of the preamp.

Could you give us a full rundown of the system? Of particular interest would be:
* speaker sensitivities and types
* types of power amps (and input level settings or external attenuators if any)
* type of preamp and sources

When I hear "horn", this gets my alarm bells ringing - there's plenty of opportunities for less than ideal setup. The DriveRack has one fixed output level of +20 dBu for 0 dBFS and 112 dB(A) worth of dynamic range below that. Unless you have extended output level requirements, you want to be setting up your treble power amp with just enough overall gain to be hitting ~110 dB SPL for +20 dBu. So if your SPL meter says you're playing a 90 dB (unweighted) signal, the "20" output level LED should be flashing or just about active. (Hearing protection may be advised. Also, AFS off.)

This is potentially going one step further than the gain staging discussion found on pp. 19-20 in the manual. They are aiming for clipping level on the amplifier, but something like a horn is likely to deliver >100 dB @ 1 W / 1 m, so a measly 10 W would be good for 110 dB @ 1 m already.

In fact, you may even want to be going so far as to be padding down a horn tweeter with a big ol' resistor in series. Those approaching or exceeding 110 dB sensitivity in particular may make it very hard to find a power amplifier with sufficiently low noise levels otherwise. It would generally tend to reduce amplifier loading and driver distortion levels to boot. I'd start with 10-22 ohms, 10-20 W or thereabouts. Not generally any more than needed to bring sensitivity in line with the bass though.
Thank you for this. I will start by saying that reading your response makes me realize that perhaps I'm trying to do much more than my actual knowledge and understanding would allow.

Here is the system breakdown:
Speakers:
1. Altec Lansing Model 14, modified to bypass the internal crossover and with 47 µF capacitors soldered in to protect the horns.
I know the unmodified speaker has a sensitivity of 95 dB, but I can't find the individual driver specs, although I seem to remember the horn might be 121 dB.
2. Mirage subwoofer.


Pre:
Denon PRA-1500 (the tuner is powered from the preamp). I never listen above 11 o'clock, definitely never above 11:30, and most of the time I'm around 9-10.

Amplifiers:
Horns: Sony TA-N110 (attenuator set just a dash above 22 dB); previously a Rotel RB951.
Mids: Parasound 2125 (set to max or THX), although soon to be replaced with a Sony TA-N77ES.

Sources:
CD, cassette, DAT, R2R, TT, tuner, a Chromecast Audio into an external DAC, and a computer screen with a Fire TV.
 
OP
Z

Zaki Ghul

Member
Joined
Sep 29, 2019
Messages
43
Likes
79
I would second the -10 dBV input setting, which accepts levels up to +7.7 dBu. If your preamp is a bit on the noisy side with plenty of low-distortion output, however (and your volume control never ever goes beyond the 10-11 o'clock position), the +4 dBu setting accepting up to +19.9 dBu may be more appropriate, which reduces effective gain in the DriveRack by 12 dB. (A lot of traditional hi-fi preamps are like that. -3 dBV sounds more like a Behringer 202HD though.) I am assuming that ADC input stage gain is being switched here, so this allows you to shift the 112 dB(A) of input ADC dynamic range up and down as needed.


That's not how that works. Sounds dodgy to me.

This may be worth investigating further. Tuner sounds like something connected to an external antenna, usually grounded, so I suspect there may be a hardly-avoidable ground loop involving part of the preamp.

Could you give us a full rundown of the system? Of particular interest would be:
* speaker sensitivities and types
* types of power amps (and input level settings or external attenuators if any)
* type of preamp and sources

When I hear "horn", this gets my alarm bells ringing - there's plenty of opportunities for less than ideal setup. The DriveRack has one fixed output level of +20 dBu for 0 dBFS and 112 dB(A) worth of dynamic range below that. Unless you have extended output level requirements, you want to be setting up your treble power amp with just enough overall gain to be hitting ~110 dB SPL for +20 dBu. So if your SPL meter says you're playing a 90 dB (unweighted) signal, the "20" output level LED should be flashing or just about active. (Hearing protection may be advised. Also, AFS off.)

This is potentially going one step further than the gain staging discussion found on pp. 19-20 in the manual. They are aiming for clipping level on the amplifier, but something like a horn is likely to deliver >100 dB @ 1 W / 1 m, so a measly 10 W would be good for 110 dB @ 1 m already.

In fact, you may even want to be going so far as to be padding down a horn tweeter with a big ol' resistor in series. Those approaching or exceeding 110 dB sensitivity in particular may make it very hard to find a power amplifier with sufficiently low noise levels otherwise. It would generally tend to reduce amplifier loading and driver distortion levels to boot. I'd start with 10-22 ohms, 10-20 W or thereabouts. Not generally any more than needed to bring sensitivity in line with the bass though.

Solved the horn pop: I had the XLR ground on the dbx lifted; I re-engaged it and there's no more horn pop.
 

watchnerd

Grand Contributor
Joined
Dec 8, 2016
Messages
12,449
Likes
10,409
Location
Seattle Area, USA
The unit might be quieter at -10. But if normal use has you clipping the meters, then you definitely want to go with +4.

Regards,
Wayne A. Pflughaupt

I use +4 dBu across the board.

My reel-to-reel XLR line out is +4 dBu.

I can trim to the line-out and VU meters on the deck, then set my AD/DA (RME ADI-2 Pro) to +4 dBu analog input and +4 dBu analog output to my monitor speakers (Dynaudio LYD 5).
 

AnalogSteph

Major Contributor
Joined
Nov 6, 2018
Messages
3,338
Likes
3,278
Location
.de
Here is the system breakdown:
Speakers:
1. Altec Lansing Model 14, modified to bypass the internal crossover and with 47 µF capacitors soldered in to protect the horns.
I know the unmodified speaker has a sensitivity of 95 dB, but I can't find the individual driver specs, although I seem to remember the horn might be 121 dB.
Not much of a protection there, except from DC.

Someone went to the trouble of reverse-engineering their Model 14 crossover in this thread.

I'd say about 100-105 dB on the horn with 908-8B driver. (Spec for this one seems to have been 105 dB with 511B horn.) The major peak that the original XO EQs in seems to mostly go away in-room, but I can't say I'm majorly impressed by FR flatness by modern standards... quite mid-centric. Going active probably was a good idea.

The XOs would have some 10 ohm resistors in them that you could pinch, though I'd imagine something equivalent new wouldn't be overly expensive either... if you go out and buy new ones, something not too terribly inductive would be good (wirewounds can be, by their very nature of being coiled-up wire).

Pre:
Denon PRA-1500 (tuner is powered from the preamp) never listen above 11, definitely never above 11:30 and most of the time I'm around 9-10.
That's a typical setting for hi-fi, but really you have way more headroom than you need. Even 11 o'clock should be something like at least 20 dB, probably more like 30 dB, down from max gain, i.e. your overall preamp gain is thoroughly below unity all the time (since preamp gain = ~+16.5 dB). I bet the input level LEDs on the DriveRack barely make it to 15 (or even just 20) when you're cranking it, right?

I would say stick with the -10 dBV input for now, but you could stand to shed some overall gain so you can turn up the preamp some more. What you want is the input level LEDs rarely flashing 3, more commonly displaying 10 and very commonly showing 20 when you're cranking it with dynamic material (e.g. '80s synthpop or classical symphonies). I would probably briefly stop playback and turn off the power amps once you've established that playback levels are where you want them, so you can fine-tune levels without blowing out your ears. ;)

(Should it already be like this as-is, although I doubt it, switch to +4 dBu input.)
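
A minimal sketch of the "below unity" point above, assuming the ~+16.5 dB preamp line gain quoted and a typical audio-taper pot sitting roughly 20-30 dB below maximum at the 11 o'clock position (an assumption, not a measurement of this particular Denon):

preamp_gain_db = 16.5  # PRA-1500 line stage gain, per the post above

for pot_attenuation_db in (20, 25, 30):
    net = preamp_gain_db - pot_attenuation_db
    side = "above" if net > 0 else "below"
    print(f"pot at -{pot_attenuation_db} dB -> net preamp gain {net:+.1f} dB ({side} unity)")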

Amplifiers:
Horns: Sony TA-N110 (attenuator set to just a dash above 22db), prior Rotel RB951
Mids: Parasound 2125 (set to max or THX), although to be replaced soon with a Sony TA-N77ES.
The 2125 looks like IEC Class I - have you seen ground loop issues with this one? If the Sony doesn't work out (which I doubt), you could still try one of Yamaha's '80s to early '90s P-series PA amplifiers; they had some that were really quite well-specced, and they come with input attenuators and XLR as standard.

I would suggest:
* Either get a 10-12 dB line-level attenuator for the Parasound/Sony, or turn down that output in the DriveRack correspondingly (if noise levels from the woofer are not audible / problematic).
* Add a series resistor to the horn. You may still need ~3-5 dB more on the attenuation dial to match with a 10 ohm resistor; 15-22 ohm should be about right as-is. Measure the effect on frequency response and correct if needed.
That should, in theory, get you closer to a typical 11 o'clock on the preamp.

Sources:
CD, Cassette, DAT, R2R, TT, Tuner, audio chromecast into external DAC, and a computer screen with fire tv.
Holy moly, that's basically a Techmoan level setup there. :) (Not quite restorer_john level perhaps.)
 
Last edited:
OP
Z

Zaki Ghul

Member
Joined
Sep 29, 2019
Messages
43
Likes
79
Not much of a protection there, except from DC.

Someone went to the trouble of reverse-engineering their Model 14 crossover in this thread.

I'd say about 100-105 dB on the horn with 908-8B driver. (Spec for this one seems to have been 105 dB with 511B horn.) The major peak that the original XO EQs in seems to mostly go away in-room, but I can't say I'm majorly impressed by FR flatness by modern standards... quite mid-centric. Going active probably was a good idea.

The XOs would have some 10 ohms that you could pinch, though I'd imagine something equivalent new wouldn't be overly expensive either... if you go out and buy some new ones, something not too terribly inductive would be good (wirewounds can be due to their very nature of being coiled-up wire).


That's a typical setting for hi-fi but really you have way more headroom than you need. Even 11 o'clock should be something like at least 20, probably more like 30 dB down from max gain, i.e. your overall preamp gain is thoroughly below unity all the time (since preamp gain = ~+16.5 dB). I bet the input level LEDs on the DriveRack barely make it to 15 (or even just 20) when you're cranking it, right?

I would say stick with the -10 dBV input for now, but you could stand to shed some overall gain so you can turn up the preamp some more. What you want is the input level LEDs rarely flashing 3, more commonly displaying 10 and very commonly showing 20 when you're cranking it with dynamic material (e.g. '80s synthpop or classical symphonies). I would probably briefly stop playback and turn off the power amps once you've established that playback levels are where you want them, so you can fine-tune levels without blowing out your ears. ;)

(Should it already be like this as-is, although I doubt it, switch to +4 dBu input.)


The 2125 looks like IEC Class I - have you seen ground loop issues with this one? If the Sony doesn't work out (which I doubt), you could still try one of Yamaha's '80s to early '90s P series PA amplifiers, they had some that were really quite well-specced and they come with input attenuators and XLR standard.

I would suggest:
Either get a 10-12 dB line-level attenuator for the Parasound/Sony, or turn down that output in the DriveRack correspondingly (if noise levels from the woofer are not audible / problematic).
Add series resistor to horn. May still need ~3-5 dB more on the attenuation dial to match with a 10 ohm, 15-22 ohm should be about just right as-is. Measure effect on frequency response and correct if needed.
That should, in theory, get you closer to a typical 11 o'clock on the preamp.


Holy moly, that's basically a Techmoan level setup there. :) (Not quite restorer_john level perhaps.)
Thank you for such a thorough response.
I never set up the gain on the dbx (pp. 19-20), so I don't know if that plays a role, but I don't even hit the 20 dB mark; just the signal lights come on and off.
I used to get some ground hum with the Parasound, but switching the wiring around took care of that, except when using the TT (any TT), especially on needle drop; it is not audible once the music starts.
As for the setup :) I also have a MiniDisc deck that I forgot to mention, and in my other setup there's an Elcaset and a PCM adapter that I'm about to pair with a Betamax player. So yes, I do have a problem. Love Techmoan, by the way.
 
OP
Z

Zaki Ghul

Member
Joined
Sep 29, 2019
Messages
43
Likes
79
Thank you for such a thorough response.
I never set up the gain on the dbx (pp. 19-20), so I don't know if that plays a role, but I don't even hit the 20 dB mark; just the signal lights come on and off.
I used to get some ground hum with the Parasound, but switching the wiring around took care of that, except when using the TT (any TT), especially on needle drop; it is not audible once the music starts.
As for the setup :) I also have a MiniDisc deck that I forgot to mention, and in my other setup there's an Elcaset and a PCM adapter that I'm about to pair with a Betamax player. So yes, I do have a problem. Love Techmoan, by the way.
Not much of a protection there, except from DC.

Someone went to the trouble of reverse-engineering their Model 14 crossover in this thread.

I'd say about 100-105 dB on the horn with 908-8B driver. (Spec for this one seems to have been 105 dB with 511B horn.) The major peak that the original XO EQs in seems to mostly go away in-room, but I can't say I'm majorly impressed by FR flatness by modern standards... quite mid-centric. Going active probably was a good idea.

The XOs would have some 10 ohms that you could pinch, though I'd imagine something equivalent new wouldn't be overly expensive either... if you go out and buy some new ones, something not too terribly inductive would be good (wirewounds can be due to their very nature of being coiled-up wire).


That's a typical setting for hi-fi but really you have way more headroom than you need. Even 11 o'clock should be something like at least 20, probably more like 30 dB down from max gain, i.e. your overall preamp gain is thoroughly below unity all the time (since preamp gain = ~+16.5 dB). I bet the input level LEDs on the DriveRack barely make it to 15 (or even just 20) when you're cranking it, right?

I would say stick with the -10 dBV input for now, but you could stand to shed some overall gain so you can turn up the preamp some more. What you want is the input level LEDs rarely flashing 3, more commonly displaying 10 and very commonly showing 20 when you're cranking it with dynamic material (e.g. '80s synthpop or classical symphonies). I would probably briefly stop playback and turn off the power amps once you've established that playback levels are where you want them, so you can fine-tune levels without blowing out your ears. ;)

(Should it already be like this as-is, although I doubt it, switch to +4 dBu input.)


The 2125 looks like IEC Class I - have you seen ground loop issues with this one? If the Sony doesn't work out (which I doubt), you could still try one of Yamaha's '80s to early '90s P series PA amplifiers, they had some that were really quite well-specced and they come with input attenuators and XLR standard.

I would suggest:
Either get a 10-12 dB line-level attenuator for the Parasound/Sony, or turn down that output in the DriveRack correspondingly (if noise levels from the woofer are not audible / problematic).
Add series resistor to horn. May still need ~3-5 dB more on the attenuation dial to match with a 10 ohm, 15-22 ohm should be about just right as-is. Measure effect on frequency response and correct if needed.
That should, in theory, get you closer to a typical 11 o'clock on the preamp.


Holy moly, that's basically a Techmoan level setup there. :) (Not quite restorer_john level perhaps.)
Sorry, yes you're right; when cranking it up I think it will occasionally hit 15, but at normal levels it's just the signal light.
 

AnalogSteph

Major Contributor
Joined
Nov 6, 2018
Messages
3,338
Likes
3,278
Location
.de
Sorry, yes you're right; when cranking it up I think it will occasionally hit 15, but at normal levels it's just the signal light.
Then you know how to fix that now. :)

Your total gain after the volume pot in the treble section right now would seem to be:
+16.5 dB for the preamp
+12 dB for -10 dBV in --> +4 dBu out @ DBX
+/- extra gain in DBX, unknown
-22 dB for the TA-N110 input attenuator
+40 dB for TA-N110 voltage gain (net +18 dB for the TA-N110 after its attenuator)
---------------------------------------
= 46.5 dB +/- x
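
The same tally, written out as a sketch so the individual stages are easy to tweak (the unknown extra DriveRack gain is simply left at 0 dB):

stages_db = {
    "preamp gain":                    +16.5,
    "-10 dBV in -> +4 dBu out (DBX)": +12.0,
    "extra gain in DBX (unknown)":      0.0,
    "TA-N110 input attenuator":       -22.0,
    "TA-N110 voltage gain":           +40.0,
}

for name, gain in stages_db.items():
    print(f"{name:<32} {gain:+6.1f} dB")
print(f"{'total after volume pot':<32} {sum(stages_db.values()):+6.1f} dB")  # +46.5 dB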

Hmm. Do you have some extra negative input or output gain dialed in on the DriveRack? Given that your volume settings are fairly typical, I would have expected about 10 dB less since your speakers ought to be 10 dB more sensitive than average. 46.5 dB would be typical 100 wpc integrated amplifier terrain and should generally result in quite plainly audible hiss from the horn. Do you hear any?

BTW, the TA-N110 appears to have been more or less a datasheet application for the STK4182II. While the datasheet is not clear on the matter at all, I would expect that you should be able to modify the amplifier to reduce gain to the 30 dB vicinity with no ill effects (R109/R159 would be the resistors to change; stock = 560R, you could try e.g. 1.2k for 33.5 dB or 1k for 35 dB; ordinary 1/4 W metal film). This should reduce output noise and distortion accordingly.
Even as-is, just the extra series resistor should help distortion performance (and obviously noise as well) - the datasheet indicates noticeably increased distortion levels at 20 kHz even for an 8 ohm load, indicating crossover distortion creeping in. (Things get more severe at 4 ohms, so the opposite should be happening for e.g. an 18 ohm = 10 + 8 ohm load.) You need about 4 W max, and 20 W / 8 ohm level looks like a comfortable level for the amplifier, so you want about 7 dB worth of attenuation... 10 ohms in series would be bang-on yet again.
That being said, compression drivers are notorious for high 2nd-order distortion, which is very likely to dwarf the amplifier's <0.1% dominant 2nd either way. Amplifiers for these things can be quite low-tech as long as noise is kept in check.
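
Two of the numbers above, worked as a quick sketch. The gain-vs-R109/R159 relation here is only inferred from the figures quoted (closed-loop gain scaling as 560/R), not taken from the STK4182II datasheet; the ~7 dB figure is simply the 20 W to 4 W power ratio:

import math

stock_gain_db = 40.0
stock_r = 560.0  # ohms, R109/R159 per the post

def modded_gain_db(r_new):
    # assumed: closed-loop gain inversely proportional to R109/R159
    return 20 * math.log10((10 ** (stock_gain_db / 20)) * stock_r / r_new)

for r in (1000, 1200):
    print(f"R109/R159 = {r} ohm -> ~{modded_gain_db(r):.1f} dB")  # ~35.0 dB, ~33.4 dB

print(f"attenuation to keep 4 W peaks at a 20 W / 8 ohm operating point: "
      f"{10 * math.log10(20 / 4):.1f} dB")  # ~7.0 dB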
 
OP
Z

Zaki Ghul

Member
Joined
Sep 29, 2019
Messages
43
Likes
79
Hiss is audible if you're standing next to the horn with no audio playing. As I mentioned before, this is beyond my knowledge level, so while I will be able to follow your suggestions, it doesn't mean I actually understand a lot of it. And from that shallow well comes this question: why wouldn't just dialing back the attenuators on both amps be enough?
 

AnalogSteph

Major Contributor
Joined
Nov 6, 2018
Messages
3,338
Likes
3,278
Location
.de
And from that shallow well comes this question: why wouldn't just dialing back the attenuators on both amps be enough?
That only helps for noise coming in from before the attenuator, not inherent amplifier noise generated after it. Beyond a certain point, you will not be able to reduce noise even further, and it should still be quite audible then.

TA-N110 SNR is given as 105 dB, input shorted (or maximum attenuation). This indicates your best-case noise floor with this amp, 105 dB below 50 W / 8 ohm - that translates to 88 dB below 1 W / 8 ohm or 92 dB below 5 W / 4 ohm, or ~112 µV. That would be a good value for an integrated amp, for a power amplifier it's not particularly great though. (Amir has measured several amps scoring 100-105 dB SINAD at 5 W / 4 ohm level.) Clearly with a 105 dB horn you'd have to expect somewhere around 17 dB SPL @ 1 m. Math gets super easy in this case:
P_out [dB SPL / 1 m] = (+17 dBW - 105 dB) +105 dB SPL / W / m
That's the absolute best you can do in this combination. This amplifier normally is best used with speakers of about 93 dB SPL sensitivity max.

For DriveRack output DAC noise (+20 dBu - 110 dB = -90 dBu) to equal TA-N110 noise, the amplifier's attenuation would have to be set to about 26.8 dB. At this point, total noise output = 20 dB SPL @ 1 m (as two random noise sources of equal power add 3 dB). Not at all spectacular, I have been able to clearly detect about 7 dB SPL and would aim for 0 dB SPL at listening position if I can help it.
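
For anyone wanting to retrace the arithmetic, here is the same calculation as a Python sketch (same assumptions as above: 105 dB SNR re. 50 W / 8 ohm, a 105 dB/W/m horn, -90 dBu DriveRack output noise, 40 dB amplifier voltage gain, 0 dBu = 0.7746 V):

import math

def db(ratio):
    return 20 * math.log10(ratio)

v_50w_8ohm = math.sqrt(50 * 8)                 # 20 V rms at the rated 50 W / 8 ohm
amp_noise = v_50w_8ohm * 10 ** (-105 / 20)     # ~112 uV output noise, input shorted
print(f"TA-N110 output noise: {amp_noise * 1e6:.0f} uV")

v_1w_8ohm = math.sqrt(8)                       # 2.83 V rms = 1 W into 8 ohm
noise_spl = 105 + db(amp_noise / v_1w_8ohm)    # sensitivity + level re. 1 W
print(f"noise at a 105 dB/W/m horn: {noise_spl:.0f} dB SPL @ 1 m")   # ~17 dB SPL

dac_noise = 0.7746 * 10 ** (-90 / 20)          # -90 dBu ~ 24.5 uV from the DriveRack
# attenuation at which DAC noise (through 40 dB of amp gain minus the pad)
# equals the amplifier's own output noise:
atten = 40 - db(amp_noise / dac_noise)
print(f"attenuation for equal noise contributions: {atten:.1f} dB")  # ~26.8 dB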

With your current settings, you should be about +5.5 dB above 17 dB SPL, so ~22.5 dB SPL @ 1 m. That's with preamplifier and ADC noise not even considered yet.

Adding a series resistor drops amplifier noise and signal equally, while adding negligible noise in itself (approx. -153 dB below 2.83 V = 1 W @ 8 ohm for a 10 ohm resistor, so it's safe to say that would be far below 0 dB SPL for any driver in existence). A value between 10 and 24 ohms would give noise levels in the 10 dB SPL to 5 dB SPL @ 1 m vicinity.

So I would suggest:
* increase TA-N110 attenuation to 28 dB
* 10 ohm resistor (you could go higher but it doesn't seem like you're particularly bothered by existing noise levels)
* reduce bass channel gain to match relative levels again
* increase preamp volume to ~11 o'clock typ
* watch input and output levels on the DriveRack
 
Last edited:
OP
Z

Zaki Ghul

Member
Joined
Sep 29, 2019
Messages
43
Likes
79
That only helps for noise coming in from before the attenuator, not inherent amplifier noise generated after it. Beyond a certain point, you will not be able to reduce noise even further, and it should still be quite audible then.

TA-N110 SNR is given as 105 dB, input shorted (or maximum attenuation). This indicates your best-case noise floor with this amp, 105 dB below 50 W / 8 ohm - that translates to 88 dB below 1 W / 8 ohm or 92 dB below 5 W / 4 ohm, or ~112 µV. That would be a good value for an integrated amp, for a power amplifier it's not particularly great though. (Amir has measured several amps scoring 100-105 dB SINAD at 5 W / 4 ohm level.) Clearly with a 105 dB horn you'd have to expect somewhere around 17 dB SPL @ 1 m. Math gets super easy in this case:
P_out [dB SPL / 1 m] = (+17 dBW - 105 dB) +105 dB SPL / W / m
That's the absolute best you can do in this combination. This amplifier normally is best used with speakers of about 93 dB SPL sensitivity max.

For DriveRack output DAC noise (+20 dBu - 110 dB = -90 dBu) to equal TA-N110 noise, the amplifier's attenuation would have to be set to about 26.8 dB. At this point, total noise output = 20 dB SPL @ 1 m (as two random noise sources of equal power add 3 dB). Not at all spectacular, I have been able to clearly detect about 7 dB SPL and would aim for 0 dB SPL at listening position if I can help it.

With your current settings, you should be about +5.5 dB above 17 dB SPL, so ~22.5 dB SPL @ 1 m. That's with preamplifier and ADC noise not even considered yet.

Adding a series resistor drops amplifier noise and signal equally, while adding negligible noise in itself (approx. -153 dB below 2.83 V = 1 W @ 8 ohm for a 10 ohm resistor, so it's safe to say that would be far below 0 dB SPL for any driver in existence). A value between 10 and 24 ohms would give noise levels in the 10 dB SPL to 5 dB SPL @ 1 m vicinity.

So I would suggest:
* increase TA-N110 attenuation to 28 dB
* 10 ohm resistor (you could go higher but it doesn't seem like you're particularly bothered by existing noise levels)
* reduce bass channel gain to match relative levels again
* increase preamp volume to ~11 o'clock typ
* watch input and output levels on the DriveRack
I plan on replacing the TA-N110 with a TA-N55ES in the near future. Would you still recommend adding the resistors to the horns?
Also, is unity gain for the Denon around 11 o'clock?
Thanks again!
 