Switch to +4dBu?

My system uses a dbx DriveRack PA2 with a +4dBu/-10dBV switch. I upgraded to a preamp that has XLR balanced outputs rated at 2 V / 600 ohm. Do I stick with consumer-level -10dBV, or do I switch? Appreciate the help.
So I'm neither here nor there. I'm basically -0.9 dBu and -3 dBV. Do I go with the +4dBu to protect my speakers?
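For reference, dBu and dBV are both plain voltage ratios (about 0.775 V and exactly 1 V references, respectively), so the preamp's rated 2 V output can be placed on either scale directly. A quick sketch, using the maximum input levels for the two DriveRack settings quoted later in the thread (note that a full 2 V RMS works out somewhat higher than the figures in the post above):

```python
import math

DBU_REF = 0.7745966692  # volts RMS at 0 dBu (sqrt(0.6) V)

def v_to_dbu(v):
    """Convert an RMS voltage to dBu."""
    return 20 * math.log10(v / DBU_REF)

def v_to_dbv(v):
    """Convert an RMS voltage to dBV (1 V reference)."""
    return 20 * math.log10(v)

preamp_max = 2.0  # V RMS, the preamp's rated balanced output
print(f"2 V = {v_to_dbu(preamp_max):+.1f} dBu = {v_to_dbv(preamp_max):+.1f} dBV")

# Headroom against each input setting's maximum level
# (the +7.7 dBu / +19.9 dBu figures are quoted later in the thread)
for name, max_dbu in (("-10 dBV setting", 7.7), ("+4 dBu setting", 19.9)):
    print(f"{name}: {max_dbu - v_to_dbu(preamp_max):+.1f} dB headroom at full preamp output")
```

2 V comes out to about +8.2 dBu / +6.0 dBV, i.e. a full-scale preamp output would just exceed a +7.7 dBu input maximum while leaving generous headroom on the +4 dBu setting.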
The unit might be quieter at -10. But if normal use has you clipping the meters, then you definitely want to go with +4.
Regards,
Wayne A. Pflughaupt
thank you

https://dbxpro.com/en/product_documents/driverack_pa2_manual_5044138-apdf
Setup description page 21.
You have attenuators in addition to the input level switches.
Adjust to your own needs.
I'm a relative newbie, so bear with me. I have not noticed any clipping. I have observed two things since moving from RCA to XLR:
1. With RCA, at lower volumes, especially when watching TV, sometimes one of the horns would stop working; with the XLRs it is no longer a problem. I'm guessing the issue was in the dbx: since it's a pro unit and I'm listening at low volume on consumer equipment, it simply wasn't receiving enough signal to push to the horn amp adequately.
2. I have not noticed any clipping, but the horns pop whenever I power on the tuner, even though the preamp is turned all the way down. With -10dBV it popped when I powered on the preamp and any source; on +4dBu, only with the tuner.
I would second the -10 dBV input setting, which accepts levels up to +7.7 dBu. If your preamp is a bit on the noisy side with plenty of low-distortion output, however (and your volume control never ever goes beyond the 10-11 o'clock position), the +4 dBu setting accepting up to +19.9 dBu may be more appropriate, as it reduces effective gain in the DriveRack by 12 dB. (A lot of traditional hi-fi preamps are like that. -3 dBV sounds more like a Behringer 202HD, though.) I am assuming that ADC input stage gain is being switched here, so this lets you shift the 112 dB(A) of input ADC dynamic range up and down as needed.

That's not how that works. Sounds dodgy to me.

This may be worth investigating further. A tuner sounds like something connected to an external antenna, usually grounded, so I suspect there may be a hardly-avoidable ground loop involving part of the preamp.

Could you give us a full rundown of the system? Of particular interest would be:

* speaker sensitivities and types
* types of power amps (and input level settings or external attenuators, if any)
* type of preamp and sources

When I hear "horn", this gets my alarm bells ringing - there are plenty of opportunities for a less than ideal setup. The DriveRack has one fixed output level of +20 dBu for 0 dBFS and 112 dB(A) worth of dynamic range below that. Unless you have extended output level requirements, you want to be setting up your treble power amp with just enough overall gain to be hitting ~110 dB SPL for +20 dBu. So if your SPL meter says you're playing a 90 dB (unweighted) signal, the "20" output level LED should be flashing or just about active. (Hearing protection may be advised. Also, AFS off.)

This is potentially going one step further than the gain-staging discussion found on pp. 19-20 of the manual. They are aiming for clipping level on the amplifier, but something like a horn is likely to deliver >100 dB @ 1 W / 1 m, so a measly 10 W would already be good for 110 dB @ 1 m.

In fact, you may even want to go so far as to pad down a horn tweeter with a big ol' resistor in series. Those approaching or exceeding 110 dB sensitivity in particular may make it very hard to find a power amplifier with sufficiently low noise levels otherwise. It would generally tend to reduce amplifier loading and driver distortion levels to boot. I'd start with 10-22 ohms, 10-20 W or thereabouts - though generally no more than needed to bring sensitivity in line with the bass.

Thank you for this. I will start by saying that reading your response makes me realize that perhaps I'm trying to do much more than my actual knowledge and understanding would allow.
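The gain-staging target described above (~110 dB SPL at the DriveRack's +20 dBu full-scale output) can be turned into a required overall amplifier gain. A minimal sketch, assuming a 105 dB / 1 W / 1 m horn sensitivity into a flat 8 ohm load (both placeholder values, not measured specs of this system):

```python
import math

DBU_REF = 0.7745966692  # volts RMS at 0 dBu

sensitivity = 105.0   # dB SPL @ 1 W / 1 m (assumed horn sensitivity)
z_load = 8.0          # ohms (assumed nominal driver impedance)
target_spl = 110.0    # dB SPL @ 1 m at DriveRack full scale
fs_out_dbu = 20.0     # DriveRack output level at 0 dBFS

# Power needed to hit the target SPL, then the voltage across the driver
p_needed = 10 ** ((target_spl - sensitivity) / 10)   # watts
v_needed = math.sqrt(p_needed * z_load)              # volts RMS
v_in = DBU_REF * 10 ** (fs_out_dbu / 20)             # +20 dBu as volts RMS

gain_db = 20 * math.log10(v_needed / v_in)
print(f"Need {p_needed:.2f} W -> {v_needed:.2f} V out from {v_in:.2f} V in")
print(f"Required overall amp gain: {gain_db:+.1f} dB")
```

Under these assumptions the required gain even comes out slightly below unity, which is the point of the "measly 10 W" remark: a sensitive horn needs very little amplifier gain.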
Here is the system breakdown:
Speakers:
1. Altec Lansing Model 14, modified to bypass the internal crossover and with 47 µF capacitors soldered in to protect the horns.
I know the unmodified speaker has a sensitivity of 95 dB, but I can't find the individual driver specs, although I seem to remember that the horn might be 121 dB.
Pre:
Denon PRA-1500 (the tuner is powered from the preamp). I never listen above 11 o'clock, definitely never above 11:30, and most of the time I'm around 9-10.
Amplifiers:
Horns: Sony TA-N110 (attenuator set to just a dash above 22 dB); previously a Rotel RB-951.
Mids: Parasound 2125 (set to max or THX), although it is to be replaced soon with a Sony TA-N77ES.
Sources:
CD, cassette, DAT, R2R, TT, tuner, Chromecast Audio into an external DAC, and a computer screen with a Fire TV.
Not much of a protection there, except from DC.
Someone went to the trouble of reverse-engineering their Model 14 crossover in this thread.
I'd say about 100-105 dB on the horn with 908-8B driver. (Spec for this one seems to have been 105 dB with 511B horn.) The major peak that the original XO EQs in seems to mostly go away in-room, but I can't say I'm majorly impressed by FR flatness by modern standards... quite mid-centric. Going active probably was a good idea.
The XOs would have some 10 ohm resistors that you could pinch, though I'd imagine something equivalent new wouldn't be overly expensive either... if you go out and buy new ones, something not too terribly inductive would be good (wirewounds can be, by their very nature of being coiled-up wire).
That's a typical setting for hi-fi but really you have way more headroom than you need. Even 11 o'clock should be something like at least 20, probably more like 30 dB down from max gain, i.e. your overall preamp gain is thoroughly below unity all the time (since preamp gain = ~+16.5 dB). I bet the input level LEDs on the DriveRack barely make it to 15 (or even just 20) when you're cranking it, right?
I would say stick with the -10 dBV input for now, but you could stand to shed some overall gain so you can turn up the preamp some more. What you want is the input level LEDs rarely flashing 3, more commonly displaying 10, and very commonly showing 20 when you're cranking it with dynamic material (e.g. '80s synthpop or classical symphonies). I would probably briefly stop playback and turn off the power amps once you've established that playback levels are where you want them, so you can fine-tune levels without blowing out your ears.
(Should it already be like this as-is, though I doubt it, switch to the +4 dBu input.)
The 2125 looks like IEC Class I - have you seen ground loop issues with this one? If the Sony doesn't work out (which I doubt), you could still try one of Yamaha's '80s to early '90s P series PA amplifiers, they had some that were really quite well-specced and they come with input attenuators and XLR standard.
I would suggest:
* Either get a 10-12 dB line-level attenuator for the Parasound/Sony, or turn down that output in the DriveRack correspondingly (if noise levels from the woofer are not audible/problematic).
* Add a series resistor to the horn. You may still need ~3-5 dB more on the attenuation dial to match with a 10 ohm; 15-22 ohm should be about right as-is. Measure the effect on frequency response and correct if needed.
That should, in theory, get you closer to a typical 11 o'clock on the preamp.
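For what it's worth, the padding effect of a series resistor can be estimated by simple voltage division. A sketch assuming a flat 8 ohm driver impedance (real compression drivers deviate from this with frequency, which is exactly why measuring afterwards, as suggested, is the right call):

```python
import math

def series_pad_db(r_series, z_driver):
    """Voltage attenuation (dB) at the driver from a series resistor,
    treating the driver as a flat resistive load (a simplification)."""
    return 20 * math.log10(z_driver / (z_driver + r_series))

z = 8.0  # ohms, assumed nominal driver impedance
for r in (10, 15, 22):
    print(f"{r:>2} ohm series: {series_pad_db(r, z):+.1f} dB")
```

With these assumptions, 10 ohms gives roughly 7 dB of padding and 22 ohms roughly 11.5 dB, which is in the same ballpark as the "3-5 dB more on the dial" comment above.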
Holy moly, that's basically a Techmoan level setup there. (Not quite restorer_john level perhaps.)
Thank you for such a thorough response.
I never set up the gain on the dbx (pp. 19-20), so I don't know if that plays a role, but I don't even hit 20 dB - just the signal lights come on and off.
I used to get some ground hum with the Parasound, but switching the wiring around took care of that - except when using the TT (any TT), especially on needle drop, though it is not audible once the music starts.
As for the setup, I also have a MiniDisc deck that I forgot to mention, and in my other setup there's an Elcaset and a PCM adapter that I'm about to pair with a Betamax player. So yes, I do have a problem. Love Techmoan, by the way.
Sorry, yes, you're right: when cranking it up I think it will occasionally hit 15, but at normal levels it's just the signal light.
Then you know how to fix that now.
And from that shallow well comes this question: why wouldn't just dialing back the attenuators on both amps be enough?
That only helps for noise coming in from before the attenuator, not inherent amplifier noise generated after it. Beyond a certain point, you will not be able to reduce noise any further, and it should still be quite audible then.

I plan on switching the TA-N110 to a TA-N55ES in the near future. Would you still recommend adding the resistors to the horns?
TA-N110 SNR is given as 105 dB, input shorted (or maximum attenuation). This indicates your best-case noise floor with this amp: 105 dB below 50 W / 8 ohm, which translates to 88 dB below 1 W / 8 ohm, or 92 dB below 5 W / 4 ohm, or ~112 µV. That would be a good value for an integrated amp; for a power amplifier it's not particularly great, though. (Amir has measured several amps scoring 100-105 dB SINAD at the 5 W / 4 ohm level.) Clearly, with a 105 dB horn you'd have to expect somewhere around 17 dB SPL @ 1 m of noise. The math gets super easy in this case:
Noise [dB SPL @ 1 m] = (+17 dBW - 105 dB) + 105 dB SPL/W/m = +17 dB SPL
That's the absolute best you can do with this combination. This amplifier is normally best used with speakers of about 93 dB sensitivity max.
For DriveRack output DAC noise (+20 dBu - 110 dB = -90 dBu) to equal TA-N110 noise, the amplifier's attenuation would have to be set to about 26.8 dB. At this point, total noise output = 20 dB SPL @ 1 m (as two random noise sources of equal power add 3 dB). Not at all spectacular - I have been able to clearly detect about 7 dB SPL, and would aim for 0 dB SPL at the listening position if I can help it.
With your current settings, you should be about +5.5 dB above 17 dB SPL, so ~22.5 dB SPL @ 1 m. That's with preamplifier and ADC noise not even considered yet.
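The "3 dB" rule used in the estimate above is just a power sum of uncorrelated noise sources, and it also shows why attenuating an upstream source only helps until the amplifier's own post-attenuator noise dominates. A sketch, reusing the ~17 dB SPL amp-noise figure from above (the upstream levels are made up for illustration):

```python
import math

def power_sum_db(*levels_db):
    """Combine uncorrelated noise sources given in dB (power sum)."""
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels_db))

amp_self_noise = 17.0  # dB SPL @ 1 m, fixed: generated after the attenuator
for upstream in (20, 17, 10, 0, -20):  # incoming noise after attenuation, dB SPL
    total = power_sum_db(amp_self_noise, upstream)
    print(f"upstream {upstream:+4.0f} dB -> total {total:5.1f} dB SPL")
```

Two equal 17 dB sources sum to ~20 dB SPL, while pushing the upstream source 30+ dB down leaves the total pinned at the amplifier's own 17 dB floor.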
Adding a series resistor drops amplifier noise and signal equally, while adding negligible noise itself (approx. 153 dB below 2.83 V = 1 W @ 8 ohm for a 10 ohm resistor, so it's safe to say that would be far below 0 dB SPL for any driver in existence). A value between 10 and 24 ohms would put noise levels in the 10 dB SPL to 5 dB SPL @ 1 m vicinity.
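As a sanity check, the figures above (~112 µV, ~17 dB SPL, resistor noise ~153 dB below 2.83 V) can be reproduced from the stated specs. A sketch, assuming a 20 kHz noise bandwidth at room temperature for the resistor's Johnson noise:

```python
import math

# TA-N110: SNR spec 105 dB below rated 50 W / 8 ohm
v_rated = math.sqrt(50 * 8)              # 20 V RMS at rated power
v_noise = v_rated / 10 ** (105 / 20)     # amp output noise voltage
print(f"Amp output noise: {v_noise * 1e6:.0f} uV")

# With a 105 dB / 1 W / 1 m horn (1 W into 8 ohm = 2.83 V):
noise_spl = 105 + 20 * math.log10(v_noise / 2.83)
print(f"Noise floor at the horn: {noise_spl:.1f} dB SPL @ 1 m")

# Johnson noise of a 10 ohm series resistor over an assumed 20 kHz bandwidth
k_B, T, R, BW = 1.380649e-23, 300.0, 10.0, 20_000.0
v_r = math.sqrt(4 * k_B * T * R * BW)
print(f"Resistor noise: {20 * math.log10(v_r / 2.83):.0f} dB re 2.83 V")
```

The arithmetic lands within a fraction of a dB of the post's numbers, so the 17 dB SPL best-case floor follows directly from the published SNR spec and the assumed horn sensitivity.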
So I would suggest:
* increase TA-N110 attenuation to 28 dB
* 10 ohm resistor (you could go higher but it doesn't seem like you're particularly bothered by existing noise levels)
* reduce bass channel gain to match relative levels again
* increase preamp volume to ~11 o'clock typ
* watch input and output levels on the DriveRack