I think some maintenance work and testing I performed this weekend on an old Holman preamplifier is appropriate for this thread. The preamp was provided to me by a forum member who had decided to go another direction and therefore didn't want to spend the several hundred dollars to have the unit restored by the company that did the work on the unit Amir tested.
I just completed replacing the capacitors in the unit and gave it a thorough test. I thought the results of that testing, and the way I performed it, might be interesting.
The Holman service manual suggests that the typical nulling distortion analyzer isn't sufficient, but I think that was written before the Hewlett-Packard 339A and well before the digital 8903 analyzers. These analyzers work by nulling the frequency of interest and measuring the true RMS voltage of what remains. The problem Holman had with the distortion analyzers of 1978 was that they did not have true RMS voltmeters that properly integrated the area under the waveform to get RMS; instead, they were average-responding meters that estimated RMS by assuming the waveform was a sine. Of course, if one is measuring only the distortion products and noise, the waveform is far too messy for that simplifying assumption. This was resolved by the introduction of the 339A, which sported a true analog RMS voltmeter. The 8903 performs a precision calculation of RMS voltage from a digitized stream--it was the first digital distortion analyzer Hewlett-Packard made.
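To make the meter problem concrete, here's a minimal Python sketch with an invented distortion residual, showing how an average-responding meter (rectified mean times the 1.11 sine form factor) misreads a messy waveform that a true RMS computation handles correctly:

```python
import numpy as np

fs = 96_000                      # sample rate, Hz
t = np.arange(fs) / fs           # one second of samples

# An invented distortion residual: a little 2nd and 3rd harmonic of a
# nulled 1 kHz tone, plus broadband noise -- the messy leftover waveform.
rng = np.random.default_rng(0)
residual = (0.010 * np.sin(2 * np.pi * 2000 * t)
            + 0.006 * np.sin(2 * np.pi * 3000 * t)
            + 0.004 * rng.standard_normal(fs))

# True RMS: integrate the squared waveform, as the 339A and 8903 do.
true_rms = np.sqrt(np.mean(residual ** 2))

# Average-responding meter: rectified mean scaled by the sine-wave form
# factor, pi / (2 * sqrt(2)) ~= 1.11. Correct for sines, wrong for this.
avg_reading = np.mean(np.abs(residual)) * np.pi / (2 * np.sqrt(2))

print(f"true RMS:           {1000 * true_rms:.2f} mV")
print(f"average-responding: {1000 * avg_reading:.2f} mV")
```

The two readings disagree by a dB or so on a residual like this, which is exactly the error Holman was warning about.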
But these bench analyzers were the standard of the industry in the days when the Holman preamp was built, and for probably 30 years thereafter, until Audio Precision came along. Both of the HP units have a specified measurement floor of about -89 dB but easily measure down to a distortion level of -100 dB. The oscillator source in the 339A is probably the cleanest pure analog oscillator ever available to mortals, with a specified distortion of -100 dB or better. Holman would have preferred a spectrum analyzer, such as the HP 3580, to evaluate the level of the second harmonic in particular. Spectrum analyzers have not found their way to my bench, because even old ones that don't work are expensive. Software-based spectrum analysis is easy these days, and frankly, for less than I spent on old HP equipment, one could buy, say, a QuantAsylum QA403 and get about -110 dB measurement capability for about 600 bucks. I spent about that on the two analyzers I'm using. But for bench work, dedicated analyzers are amazingly quick and easy to use, and they don't require software maintenance.
In these days of amps with distortion and noise levels down below -110 or -120 dB, the old bench test equipment isn't good enough. But it's good enough for 1.) vintage equipment, and 2.) most people's hearing, including mine.
The Holman Preamplifier was already reviewed by Amir, resulting in a long and very informative thread that attracted the participation of Tomlinson Holman his own self. But because of the methods, it seems to me, this needed a thread of its own.
There are several questions I have when evaluating old preamps. One is, will it maintain its good performance even with CD players, which are standardized on 2 volts RMS output for a "full" signal? Will it drive a modern low-gain amp? What issues are there with using turntables and cartridges? So, I tested frequency response and distortion for a "standard" input signal of 2 volts RMS. I also measured input signals up to 3 volts RMS to make sure the input front end had the headroom needed. And I measured the phono section to determine its gain and range.
Here's the test setup:
The oscilloscope on the left lets me estimate voltage and see clipping, and the two distortion analyzers on the right allow me to look at both channels at once. I was using the oscillator source in the upper unit, an HP 339A. The source is extremely clean and accurate. This one has distortion better than -97 dB, which makes it at least 10 dB deeper than the rated distortion of the Holman, so it should be good enough. But I was making primary measurements with the lower unit, an HP 8903B. In the photo above, the scope is showing fully formed sine waves, with the 8903 reading 7.64 V RMS at 1 kHz coming out of the preamp. The 339A at upper right is confirming a 2-V input to the Holman preamp. I was testing how far the preamp could be cranked up before it clipped. The answer was 8.2 volts of output--beyond that, it flat-topped the waveforms and distortion jumped.
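As an aside, the distortion jump at clipping is easy to demonstrate in software. This is a rough Python simulation with an idealized hard clip and a crude FFT notch--not my bench procedure, and the 8.2-V rail is just taken from the measurement above:

```python
import numpy as np

fs = 96_000
t = np.arange(fs) / fs           # one second of signal
f0 = 1000.0                      # test tone, Hz
rail = 8.2 * np.sqrt(2)          # hard-clip level: peak of an 8.2-V RMS sine

def thd_n_db(signal):
    """Crude THD+N: notch the fundamental out of the spectrum and compare
    residual RMS to total RMS. Window leakage sets the floor here, so
    clean signals read at that floor rather than at their true value."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    total = np.sqrt(np.sum(spectrum ** 2))
    residual = np.sqrt(np.sum(spectrum[np.abs(freqs - f0) > 50] ** 2))
    return 20 * np.log10(residual / total)

for rms_request in (7.0, 8.0, 8.5):              # requested output, V RMS
    drive = rms_request * np.sqrt(2) * np.sin(2 * np.pi * f0 * t)
    out = np.clip(drive, -rail, rail)             # flat-top above the rail
    print(f"{rms_request:.1f} V RMS requested -> THD+N {thd_n_db(out):6.1f} dB")
```

Below the rail the reading sits at the notch's floor; ask for more than the rail allows and the flat-topped harmonics push the number up by tens of dB, just as the 8903 showed on the bench.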
I used other test equipment, too. I used the HP 3456A voltmeter, which is extremely accurate for AC RMS voltage, to validate the voltage readings from the 8903.
On to the preamp. This preamp tested well except for the usual bevy of scratchy potentiometers and noisy switches. Those I can clean. So, I didn't think I'd gain much from replacing the capacitors except longevity. But I had the pile of caps, so I went ahead and replaced them. Here's the Holman with the covers off:
The top boards come loose and fold back; removing them is not necessary. That let me power up and check operation every few minutes so that if I made a mistake replacing a cap it would be easier to backtrack.
All those gray caps came out. All tested fine, but then they always do until they don't.
There were more than these.
The caps that went back in often have higher voltage and temperature ratings. These 85-degree caps were replaced with 105-degree caps, for example, and 16-volt caps were replaced with 25-volt caps. A couple of the 1-microfarad electrolytics I replaced with polyester film caps.
Then, I cleaned all the switches and pots with contact cleaner, followed by DeoxIT D5, which provides lubrication to the sliders in the pots.
Holman's specifications claim the second harmonic will be 87 dB down from a 2-V incoming signal, and that the phono inputs impose less than -75 dB distortion and noise. Let's see how it did.
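For readers who think in percent rather than dB, here's a quick conversion sketch of those two spec numbers (nothing more):

```python
import math

def db_to_percent(db):
    """Convert a distortion level in dB (relative to the signal) to percent."""
    return 100 * 10 ** (db / 20)

print(f"-87 dB = {db_to_percent(-87):.5f}%")  # about 0.0045%
print(f"-75 dB = {db_to_percent(-75):.4f}%")  # about 0.018%
```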
I set up the preamp to provide unity gain, or at least the detent on the volume knob closest to it. As you can see, total harmonic distortion and noise hovered right around the -87 to -90 dB range over the whole frequency spectrum until the signal dropped below 20 Hz. I'm also showing the voltage output at each frequency. The oscillator output of the 339A stayed right at 2 volts RMS--that needle never moved a whit as I changed the frequency. This is HP stuff, after all. At each frequency, I recorded the RMS voltage in dBu (dBu references 0 dB to 0.775 volts RMS, so 2 volts works out to about +8.2 dBu). No automated test suites for this old equipment, at least not until I get the GPIB interfaces integrated into a computer: I wrote the readings down on a Big Chief tablet with a Number 2 pencil, just like in first grade. Once integrated with GPIB, I can use scripts made for running test suites. Because the voltage was already in dB units, I just subtracted the nominal 8.2 dBu to show the difference at each frequency. The vertical scale is exaggerated; the preamp is flat within a half-dB window, and tighter than that between 20 Hz and about 18 kHz. This agrees with Amir's measurements.
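Here's that normalization arithmetic in sketch form, with made-up readings standing in for the penciled data (the dBu reference and the subtraction are the real method; the specific numbers are not):

```python
import math

DBU_REF = 0.775  # 0 dBu is defined as 0.775 volts RMS

def volts_to_dbu(volts):
    return 20 * math.log10(volts / DBU_REF)

nominal = volts_to_dbu(2.0)  # about +8.2 dBu for the 2-V reference

# Hypothetical readings (Hz -> dBu) standing in for the penciled data
readings = {20: 8.1, 100: 8.2, 1000: 8.2, 10000: 8.2, 18000: 8.1}

for freq, dbu in readings.items():
    print(f"{freq:>6} Hz: {dbu - nominal:+.2f} dB relative to nominal")
```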
I conducted a quick test to see whether it was easy to saturate the line-input front ends with a signal stronger than the usual 2-V reference maximum, which was defined when CDs came out; before that, sources put out 1 V or less. That standard postdates this preamp, so it was worth checking. The HP 339A oscillator tops out at 3 volts, and within that limit I couldn't find an input voltage that would overload the front end.
Finally, I checked the phono input. Amir found that the input saturated at something like 70 mV, but I did not find this to be the case at all. On my unit, I was able to drive the phono input to 125 mV before the distortion rose to a point where it wasn't clearly better than the vinyl recordings themselves. My measurements confirmed a phono-stage gain of 36.5 dB--30 mV drives a 2-V output--which is about right for moving-magnet cartridges. The minimum test-signal output of the HP equipment was still too high to emulate a moving-coil cartridge.
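That gain figure is just the voltage ratio expressed in dB; here's the one-line check using the levels from the measurement:

```python
import math

v_in, v_out = 0.030, 2.0  # 30 mV phono input drives 2 V output
gain_db = 20 * math.log10(v_out / v_in)
print(f"phono gain: {gain_db:.1f} dB")  # 36.5 dB
```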
Output increased linearly with the input signal until the preamp clipped at its maximum of 8.4 volts.
There were lots of questions in the Holman preamp review thread wondering whether the guy who restored it had replaced the op-amps and other components. It isn't necessary--this preamp meets its specifications with the factory TL072 op-amps all over the inside of it, and those are very good specifications indeed.
Forgive typos--I'm falling asleep where I sit and I have to drive downtown tomorrow. I'll proof it Wednesday and fix the obvious goofs. I've been dozing off for the last half hour, so who knows what I've been typing. But the data speaks with its own voice.
Rick "respectfully submitted" Denney