
Volume control

Frank Dernie

Master Contributor
Forum Donor
Joined
Mar 24, 2016
Messages
6,445
Likes
15,780
Location
Oxfordshire
I have noticed several members writing about “full volume” and complaining about distortion there.
Back in the day it was normal for an analogue amplifier to yield maximum power at rated input voltage with the volume control around 2 o’clock. This allowed for sources with low output to still be used up to full power without running out of volume control travel.
It does mean that amplifiers of this generation would be in clipping well before full rotation of the volume pot on normal sources, and earlier still on CD - so “full volume” is reached not at full rotation of the volume control but a fair bit before it.
It may be different nowadays but I rather doubt it.
On my Devialet amp 0dB on the volume setting gives full power for a full-scale digital signal, but the control continues up to +30dB, which gives a massive margin for the analogue inputs. Using >0dB will mean the amp is clipping on a full-modulation digital input, so 0dB is “full volume” even though the control goes further.
This is normal I believe but does anybody know different for some products nowadays?
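
To put a number on that margin, a minimal Python sketch of the dB arithmetic (the 2V full-scale analogue reference is an assumption for illustration, not a Devialet spec):

```python
import math

def db_to_ratio(db):
    """Voltage ratio for a gain in dB (dB = 20*log10(V1/V2))."""
    return 10 ** (db / 20)

# 0dB on the dial = full power from a full-scale digital signal.
# The dial continues to +30dB, i.e. a further voltage factor of:
print(round(db_to_ratio(30), 1))        # 31.6x

# So an analogue source with only ~1/31.6 of the (assumed) 2V
# full-scale level can still be wound up to full power:
print(round(2 / db_to_ratio(30), 3))    # ~0.063 V, i.e. ~63mV
```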
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
3,440
Likes
9,100
Location
Suffolk UK
My understanding, going by reviews from the late 1960s onwards, was that input sensitivity was measured with the volume control at full. Typically, that meant a sensitivity of around 150mV for line sources. Phono inputs were similarly measured at full volume, giving a typical sensitivity of 2-3mV.

Clearly, with modern digital sources typically putting out 2V at 0dBFS, an amplifier with that sort of sensitivity needs the volume control way down to avoid clipping, and it would be difficult to get a comfortable listening volume without the control being very coarse and the channel balance being off. It's also possible that with 150mV sensitivity, a 2V signal could overload the front end, so no volume setting would give good quality.

Amplifiers with 'CD' inputs seem to be around 500mV sensitivity, which still means that with a 2V source, the volume control can't be advanced to 'full' without the amp clipping. I accept that it's sensible to have some headroom for CDs that are cut quietly, but I suspect that a lot of the excess gain in amplifiers comes from trying to impress those who equate volume-control position with power... "This amp's so powerful, I can blow out the windows with the volume control at 9 o'clock!"

It seems to me that the sensible way to approach this is first to measure the input sensitivity and distortion with the volume control at full, then make sure there's no clipping or rise in distortion when a 1 or 2V signal is used and the volume control turned down accordingly. Only if the amplifier gives unusual readings for frequency response and noise with the 1-2V input would it be worth looking into whether full and part volume differ in anything other than gain. Assuming no difference, the measurements can then be done at part volume, which is more convenient.
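
For concreteness, a minimal Python sketch of the arithmetic behind those figures (using the 150mV, 500mV and 2V values quoted above):

```python
import math

def attenuation_needed_db(source_v, sensitivity_v):
    """How far down the volume control must sit so that a given source
    just reaches rated output: 20*log10(sensitivity/source)."""
    return 20 * math.log10(sensitivity_v / source_v)

# Classic line input, 150mV for full output, fed from a 2V digital source:
print(round(attenuation_needed_db(2.0, 0.150), 1))   # -22.5 dB
# 'CD' input at 500mV sensitivity, same 2V source:
print(round(attenuation_needed_db(2.0, 0.500), 1))   # -12.0 dB
```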

S.
 
OP
Frank Dernie

Master Contributor
Forum Donor
Joined
Mar 24, 2016
Messages
6,445
Likes
15,780
Location
Oxfordshire
sergeauckland said:
"My understanding, going by reviews from the late 1960s onwards, was that input sensitivity was measured with the volume control at full. [...] Assuming no difference, the measurements can then be done at part volume, which is more convenient."
I have often suspected that early CD players' reputation for harshness may have been down to the inability of some popular preamps to deal with a 2V input when most line-level sources at the time were 150-200mV max.
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
3,440
Likes
9,100
Location
Suffolk UK
Frank Dernie said: "I have often suspected that early CD players' reputation for harshness may have been down to the inability of some popular preamps to deal with a 2V input when most line-level sources at the time were 150-200mV max."
That's quite likely. Looking at the circuits of some late-'70s amps, some had the tone controls before the volume control, and that stage could easily saturate with an input 20-25dB over the rated sensitivity, especially if any bass boost was applied. Even those that had the volume control before the tone controls could still have inadequate overload margin.

S.
 

Panelhead

Senior Member
Forum Donor
Joined
Apr 3, 2018
Messages
348
Likes
137
The sensitivity of the loudspeakers must be factored in also. The total system gain defines the range of attenuation.
I like having sources that do not distort “wide open”.
Back in the pre-digital era many preamps became noisy past 2-3 o'clock of rotation. I think the excess gain was there to impress with how loud it was at 9-10 o'clock.
A proper volume control should be as good wide open as it is at mid rotation.
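
To illustrate the total-system-gain point, a minimal sketch with assumed, illustrative numbers (2V source, 29dB amp gain, 88dB/2.83V/1m speakers; none of these are any particular product's specs):

```python
import math

source_v = 2.0          # source output at full scale, volts
amp_gain_db = 29.0      # typical integrated-amp voltage gain
spkr_sens_db = 88.0     # SPL at 1m for a 2.83V input

# Voltage the amp would be asked for at full volume (ignoring its actual
# clipping point; a 100W/8-ohm amp clips near 28V):
amp_out_v = source_v * 10 ** (amp_gain_db / 20)
spl_at_1m = spkr_sens_db + 20 * math.log10(amp_out_v / 2.83)
print(round(spl_at_1m, 1))        # ~114dB SPL: hence all the attenuation

# Attenuation needed for an ~85dB listening level at 1m:
print(round(85 - spl_at_1m, 1))   # ~ -29dB on the volume control
```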
 

DonH56

Master Contributor
Technical Expert
Forum Donor
Joined
Mar 15, 2016
Messages
7,835
Likes
16,497
Location
Monument, CO
Should probably also consider saturation characteristics... Tubes, roughly three-halves-power-law devices, were mostly used in single-ended designs with low feedback and exhibited "soft" saturation; they could be pushed a fair way into clipping before it became really objectionable. Bipolar transistors have an exponential distortion characteristic and so sound bad "faster" when pushed into saturation (clipping). They also tended to be used with more feedback, exacerbating their rapid rise in distortion at clipping.

A little later, differential circuits came along and suppressed the even harmonics, so apparent clipping was even worse when they were overdriven: no "soft" second-harmonic distortion; it was now the third that dominated, with its associated IMD components near the main tone rather than down at DC or an octave higher, where they were usually less audible.

Tubes also had (have) much greater voltage headroom for the most part, and so were less likely to clip (at least preamps and input stages) than relatively low-voltage solid-state circuits. FETs are ideally square-law devices with lower intrinsic distortion than tubes or bipolar devices, but in practice other device physics come into play, so real devices tend to have a little of everything. MOSFETs and GaAs FETs also have high LF noise, so special steps must be taken in the circuit to compensate (or use JFETs, pretty much the FET of choice for audio, AFAIK).
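
The soft-versus-hard distinction is easy to see numerically. A minimal sketch, just waveform math rather than a model of any real circuit:

```python
import numpy as np

# Compare the harmonic content of "soft" saturation (tanh, tube-like)
# and hard clipping on a sine driven 2x past unity.
n = 8192
t = np.arange(n) / n
x = 2.0 * np.sin(2 * np.pi * 16 * t)        # 16 cycles in the record

def harmonics_db(y, fund_bin=16, count=5):
    """Levels of harmonics 2..6 relative to the fundamental, in dB."""
    spect = np.abs(np.fft.rfft(y))
    ref = spect[fund_bin]
    return [round(20 * np.log10(spect[fund_bin * k] / ref + 1e-12), 1)
            for k in range(2, 2 + count)]

# Both nonlinearities are symmetric, so even harmonics vanish; the
# difference is how quickly the odd harmonics fall off.
print(harmonics_db(np.tanh(x)))          # soft: odd terms drop quickly
print(harmonics_db(np.clip(x, -1, 1)))   # hard: strong, slowly decaying odds
```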

I suspect the problem with volume controls was the circuitry around them and not the potentiometer itself...
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
3,440
Likes
9,100
Location
Suffolk UK
A potentiometer is never the problem, but overload characteristics are. Amplifiers should never be used into clipping, but often are. An amplifier, when clipped, should recover cleanly, and if it does, one can audibly tolerate quite a bit of clipping, as evidenced by the clipping that is done on CDs all the time these days. However, some amplifiers don't clip cleanly; they can latch up or have bursts of instability, which then affect what one hears.

I suspect, but have no proof, that the differences between amplifiers reported by subjectivists without any rigorous testing are down to running the amp into clipping, and what they're hearing is the different overload behaviour, which wouldn't be there at all if the amplifiers were evaluated blind, level-matched and with levels properly controlled.

S
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,579
Likes
38,280
Location
Gold Coast, Queensland, Australia
I have discussed this many times in the past on other forums, particularly where the typical (270-degree) volume pot ends up, around 10 o'clock, when an amp of the past rated at 150mV for full output is fed from a 2V source. The S/N is often quite poor at those positions and, worse, the frequency response may have significant anomalies.

Yes, input stage overload was an issue for some amplifiers in the 1980s when CD came along, and for many years after, we sold in-line attenuators to people who needed them. Sometimes it was just to regain the full use of the volume control. CD players with adjustable line outs were useful, although the source impedance became quite high when you used them, resulting in HF roll-off.
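
That roll-off is just the first-order RC low-pass formed by the attenuator's source resistance and the cable capacitance; a minimal sketch with assumed, typical values:

```python
import math

# f(-3dB) = 1 / (2*pi*R*C). The values below are illustrative guesses.
def f3db_hz(r_source_ohms, c_cable_farads):
    return 1 / (2 * math.pi * r_source_ohms * c_cable_farads)

CABLE_C = 300e-12   # ~300pF: a few metres of typical interconnect
print(round(f3db_hz(1_000, CABLE_C)))    # ~530kHz: low-Z output, no issue
print(round(f3db_hz(50_000, CABLE_C)))   # ~10.6kHz: audible HF roll-off
```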

Yamaha was one of the first to use a digital output level control that didn't cause massive amounts of THD at high attenuation, but by that stage, amplifiers and preamplifiers being sold could handle enormous input voltages without issue. Many of my preamps lying around here can take several volts easily without overload into the line inputs, regardless of pot position.

The digital attenuators on gear these days can be all over the shop with respect to 0dB: some go above it, others don't.
 

GoMrPickles

Active Member
Forum Donor
Joined
Nov 6, 2018
Messages
170
Likes
182
I have a similar question. My desktop listening chain consists of:
Some player with a volume control
An OS with a volume control (Ubuntu Linux with Spotify or Audacious)
A DAC with a volume control (SMSL SU-8)
Speakers with a built-in amp and volume control

As I understand it, it's best to set the software volume (whatever app I'm using and the OS volume) to max, but what about the rest of the chain? Is the actual noise introduced something like:
DAC noise * amp noise * speaker noise?
e.g., 0.9999 * 0.999 * 0.99 = 0.9889, or 1.11% THD?
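
If the stages' contributions are uncorrelated, the usual approximation is to combine them root-sum-of-squares rather than as a product, in which case the worst stage dominates; a minimal Python sketch:

```python
import math

# RSS combination of per-stage THD, assuming the distortion products
# are uncorrelated (the usual approximation):
stage_thd = [0.0001, 0.001, 0.01]        # DAC 0.01%, amp 0.1%, speaker 1%
combined = math.sqrt(sum(d * d for d in stage_thd))
print(f"{combined * 100:.3f}%")          # ~1.005%: the worst stage dominates
```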

As I understand it, most amps and active speakers will have lower distortion at higher volumes, but I'd also be going deaf at that point. My goal is low distortion at low volume. (or, "magic.")

Thanks. :)
 

Panelhead

Senior Member
Forum Donor
Joined
Apr 3, 2018
Messages
348
Likes
137
I prefer using a software volume control. Implementation is the key: many available today are 64-bit, and dithered.
From building preamps with potentiometers made of conductive plastic and carbon comp, as well as resistor ladders and optoisolators, I prefer to max out the analog attenuators and control output digitally. Mid-rotation tracking is terrible with some pots. Plus the taper is a fake log made of two linear sections.
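
As a minimal sketch of what a dithered digital volume control does (TPDF dither before requantising; illustrative only, not any particular player's implementation):

```python
import numpy as np

rng = np.random.default_rng()

def digital_volume(samples_int16, gain_db):
    """Scale in float, add +/-1 LSB TPDF dither, requantise to 16 bits."""
    x = samples_int16.astype(np.float64) * 10 ** (gain_db / 20)
    tpdf = rng.uniform(-0.5, 0.5, x.shape) + rng.uniform(-0.5, 0.5, x.shape)
    return np.clip(np.round(x + tpdf), -32768, 32767).astype(np.int16)

# Example: a full-scale 1kHz sine at 48kHz, attenuated by 40dB.
t = np.arange(48000) / 48000
sine = (32767 * np.sin(2 * np.pi * 1000 * t)).astype(np.int16)
quiet = digital_volume(sine, -40.0)
```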
 

Shadrach

Addicted to Fun and Learning
Joined
Feb 24, 2019
Messages
662
Likes
947
restorer-john said: "I have discussed this many times in the past on other forums, particularly where the typical (270-degree) volume pot ends up, around 10 o'clock, when an amp of the past rated at 150mV for full output is fed from a 2V source. The S/N is often quite poor at those positions and, worse, the frequency response may have significant anomalies."
Could you explain why this is, please? What is it about the pot at this position that makes it more prone to anomalies than any other?
The problems with low input voltages I can understand. In my early days of stereo, audiophiles I knew went for high-efficiency speakers. In the '80s, apart from CD becoming mainstream, there were some major changes in speaker design, often favouring metal or plastic cones and less efficient drive units. Many of these speakers had multiple-drive-unit arrays and much more complex crossovers. Back then the view was that better power supplies in the amp were needed, rather than outright wattage.
 

SIY

Grand Contributor
Technical Expert
Joined
Apr 6, 2018
Messages
10,383
Likes
24,749
Location
Alfred, NY
At -6dB, the source resistance of the pot is at a maximum. In a well-engineered preamp, this shouldn't matter, but audiophiles often like expensive and badly engineered gear.
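
The underlying arithmetic, assuming a linear pot fed from a low source impedance and unloaded at the wiper:

```python
# Output (source) resistance of a pot: the two sections in parallel.
# For a linear pot with the wiper a fraction a from the bottom:
#   unloaded attenuation = a, source resistance = a*(1-a)*R_total
R_TOTAL = 10_000   # a 10k pot, an illustrative value

for a in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(f"a={a:.2f}  Rsource={a * (1 - a) * R_TOTAL:.0f} ohms")
# Peaks at a=0.5 (R_total/4 = 2.5k here), which for a linear unloaded
# pot is exactly the -6dB point mentioned above.
```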
 

Shadrach

Addicted to Fun and Learning
Joined
Feb 24, 2019
Messages
662
Likes
947
SIY said: "At -6dB, the source resistance of the pot is at a maximum. In a well-engineered preamp, this shouldn't matter, but audiophiles often like expensive and badly engineered gear."
Thank you. I wasn't sure.
 

anmpr1

Major Contributor
Forum Donor
Joined
Oct 11, 2018
Messages
3,722
Likes
6,406
Frank Dernie said: "I have often suspected that early CD players' reputation for harshness may have been down to the inability of some popular preamps to deal with a 2V input when most line-level sources at the time were 150-200mV max."
Maybe. I can't deny that. But I think there was more to it. I recently listened to several Miles Davis records (LP): In Europe; Four and More; Miles Ahead; Cookin' at the Plugged Nickel.

The last two were 'digitally remastered from the original analog tapes', and they sounded distinctly different in balance from the first two, which are original analog recordings. The HF on Cookin' made me want to turn down the treble, but I don't have tone controls. I got used to it after a while, but there was no doubt that the digital remaster was adding a lot of hot high-frequency content not on the comparable analog-only records, which had a more balanced sound.

Why? Who can say? As a consumer it's impossible to know for sure, and all is necessarily conjecture. Maybe the analog master tapes actually sounded closer to the digital copy, and this was faithfully pressed into the LP as an act of artistic love and historical accuracy. Possibly the engineers decided to make something 'new and improved' because they could. Maybe the execs at CBS told them that if they wanted to get paid, it had to sound 'different and alive' in order for them to sell an old record in a new package, using the then (1986) pretty new digital technology. Who knows how it went down? Unless you were in the control room, how could anyone know for sure?

FWIW, other than what was known as 'digital highs', the bass of the remasters did appear very lifelike and solid. Much better than that on the analog records. Again, whether this was a digital artifact or just the care taken in making the new record, I don't know. I don't have the CDs to compare--again, this was using LP transfers.

Final note: my analog system is not a particularly 'hot on top' system. Technics 1200-5 with the KAB damper; Shure V-15x body with M97xE stylus (so the arm is damped at both the pivot and cartridge); Bellari VP-129 opamp-into-tube phono stage (using the subsonic filter), all into Benchmark electronics. Using something like the Audio Technica 440ML makes every record sound hot and aggressive, whereas the Shure has a slightly rolled-off HF response above 10kHz or so.
 

Julf

Major Contributor
Forum Donor
Joined
Mar 1, 2016
Messages
3,004
Likes
3,998
Location
Amsterdam, The Netherlands
anmpr1 said: "The last two were 'digitally remastered from the original analog tapes', and they sounded distinctly different in balance from the first two, which are original analog recordings. [...] there was no doubt that the digital remaster was adding a lot of hot high-frequency content not on the comparable analog-only records, which had a more balanced sound."

The key word there is "remastered", not "digitally". Of course the remastering changes the sound.
 

DonH56

Master Contributor
Technical Expert
Forum Donor
Joined
Mar 15, 2016
Messages
7,835
Likes
16,497
Location
Monument, CO
There are many articles and threads about the CD remastering fiasco. Most consumers do not know and blame the bad sound on the CD and digital in general when it is really the result of bad mastering driven by executives rather than the sound engineers.

I was just looking for an older CD to replace one of mine that has played out, and there are many comments on Amazon about which remasters sound good; e.g. one CD in particular has a lot of comments saying to get the 2016 remaster and not the previous two or three. And of course that one is no longer available except at an exorbitant price.
 

anmpr1

Major Contributor
Forum Donor
Joined
Oct 11, 2018
Messages
3,722
Likes
6,406
Julf said: "The key word there is "remastered", not "digitally". Of course the remastering changes the sound."
If I had to put money down, my guess is that the execs at CBS told them to make it 'hot' on top so Miles' trumpet would stand out. I don't think anyone at CBS was concerned with 'natural' sonics. Ever. The old joke was, "CBS stands for Cost Before Sound." I'm sure management just saw digital as a marketing thing. A way to sell old product in a new can.

On the other hand, there was an anecdote from the time... a record company exec (I don't remember who) who saw digital as the end of the road for company profits. He said something to the effect of "Why would we want to sell CDs, which is essentially giving away our master tapes?" The solution: "remaster" everything using new mixes. And now it's 'remastering' the old stuff in hi-def 24/96 etc.

In this particular case the original 'producer' Teo Macero was involved. It is difficult to know who did what. Credits list a) digital remix producer; b) digital remix engineer; c) mastering engineer.
 

DonH56

Master Contributor
Technical Expert
Forum Donor
Joined
Mar 15, 2016
Messages
7,835
Likes
16,497
Location
Monument, CO
There are those who feel the 1980s-era CBS records were what drove companies like Mobile Fidelity and Telarc to prominence. Another fellow trumpet player, Herb Alpert, started his own company, A&M (but well before 1980, of course).
 
