
Testing a Vintage Holman Preamp using Vintage Test Equipment

rdenney
Major Contributor · Forum Donor · Joined Dec 30, 2020 · Messages 2,659 · Likes 5,039
I think appropriate for this thread is some maintenance work and testing I performed this weekend on an old Holman preamplifier. The preamp was provided to me by a forum member who had decided to go in another direction and therefore didn't want to spend the several hundred dollars to have the unit restored by the company that did the work on the unit Amir tested.

I just completed replacing the capacitors in the unit and gave it a thorough test. I thought the results of that testing, and the way I performed it, might be interesting.

The Holman service manual suggests that the typical nulling distortion analyzer isn't sufficient, but I think that was written before the Hewlett Packard 339A and well before the digital 8903 analyzers. These analyzers work by nulling the frequency of interest and measuring the true RMS voltage of what remains. The problem Holman had with the distortion analyzers of 1978 was that they did not have true RMS voltmeters that properly integrated the area under the waveforms to get RMS and instead made assumptions of the average based on the shape of the waveform. Of course, if one is only measuring the distortion products and noise, the waveform is too messy for the usual simplifying assumptions. This was resolved by the introduction of the 339A, which sported a true analog RMS voltmeter. The 8903 distortion analyzer performs a precision calculation of RMS voltage by evaluating the digital stream--it was the first digital distortion analyzer Hewlett Packard made.
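The gap between true RMS and average-responding metering is easy to show numerically. A minimal Python sketch (the "distortion residual" waveform here is an invented example, not the Holman's actual residual): a 1978-style meter rectifies, averages, and scales by the sine-wave form factor, which is exact for a pure sine but wrong for a messy residual.

```python
import math

def true_rms(samples):
    """Integrate the area under the squared waveform: sqrt(mean(x^2))."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

def average_responding(samples):
    """Rectify and average, then scale by the sine-wave form factor
    pi / (2*sqrt(2)) ~ 1.1107 -- the shortcut pre-339A meters took."""
    return (math.pi / (2 * math.sqrt(2))) * sum(abs(x) for x in samples) / len(samples)

N = 10_000
sine = [math.sin(2 * math.pi * i / N) for i in range(N)]
# A "messy" residual: a third harmonic riding on a square-ish component
residual = [0.5 * math.sin(6 * math.pi * i / N)
            + 0.5 * math.copysign(1.0, math.sin(2 * math.pi * i / N))
            for i in range(N)]

for name, sig in [("pure sine", sine), ("distortion residual", residual)]:
    print(f"{name}: true RMS {true_rms(sig):.4f}, "
          f"average-responding reading {average_responding(sig):.4f}")
```

For the pure sine the two readings agree at 0.7071; for the residual they diverge by a couple of percent, which is exactly the error Holman was worried about.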

But these bench analyzers were the industry standard in the days when the Holman preamp was built and for probably 30 years thereafter, until Audio Precision came along. Both of the HP units have a specified dynamic range of 89 dB, but easily measure down to a distortion level of -100 dB. The oscillator source in the 339A is probably the cleanest pure analog oscillator ever available to mortals, with a specified distortion of less than -100 dB. Holman would have preferred a spectrum analyzer, such as the HP 3580, to evaluate the level of the second harmonic in particular. Spectrum analyzers have not found their way to my bench, because even old ones that don't work are expensive. Software-based spectrum analysis is easy these days, and frankly, for less than I spent on old HP equipment one could buy, say, a QuantAsylum QA403 and get about -110 dB measurement capability for about 600 bucks. I spent about that on the two analyzers I'm using. But for bench work, dedicated analyzers are amazingly quick and easy to use and don't require software maintenance.

In these days of amps with distortion and noise levels down below -110 or -120 dB, the old bench test equipment isn't good enough. But it's good enough for 1) vintage equipment, and 2) most people's hearing, including mine.

The Holman Preamplifier was already reviewed by Amir, resulting in a long and very informative thread that attracted the participation of Tomlinson Holman his own self. But because of the methods, it seems to me, this needed a thread of its own.

There are several questions I have when evaluating old preamps. One is, will it maintain its good performance even with CD players that are standardized on 2 volts RMS output for a "full" signal? Will it drive a modern low-gain amp? What issues are there with using turntables and cartridges? So, I tested frequency response and distortion for a "standard" input signal of 2 volts RMS. I also measured input signals up to 3 volts RMS to make sure the input front end had the headroom needed. And I measured the phono section to determine its gain and range.

Here's the test setup:
IMG_1285-dsqz.JPEG


The oscilloscope on the left lets me estimate voltage and see clipping, and the two distortion analyzers on the right let me look at both channels at once. I was using the oscillator source in the upper unit, an HP 339A. The source is extremely clean and accurate. This one has distortion at least better than -97 dB, at least 10 dB deeper than the rated distortion of the Holman, so it should be good enough. But I was making primary measurements with the lower unit, an HP 8903B. In the photo above, the scope is showing fully formed sine waves at (reading the 8903) 7.64 V RMS coming out of the preamp at 1 kHz. The 339A at upper right is confirming a 2-V input to the Holman preamp. I was testing how much the preamp could be cranked up before it clipped. The answer was 8.2 volts of output--beyond that it flat-topped the waveforms and distortion jumped.
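That 8.2-volt clipping point translates into a healthy margin above the 2-V nominal level. A quick back-of-envelope check, using the voltages reported above:

```python
import math

# Maximum unclipped output (8.2 V RMS) relative to the 2 V nominal level
headroom_db = 20 * math.log10(8.2 / 2.0)
print(f"output headroom above 2 V: {headroom_db:.1f} dB")
```

About 12 dB of output headroom, which is plenty for line-level duty into a modern amp.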

I used other test equipment, too. I used the HP 3456A voltmeter, which is extremely accurate for AC RMS voltage, to validate the voltage readings from the 8903.

IMG_1291-dsqz.JPEG


On to the preamp. This preamp tested well except for the usual bevy of scratchy potentiometers and noisy switches. Those I can clean. So, I didn't think I'd gain much from replacing the capacitors except longevity. And I had the pile of caps, so I went ahead and replaced them. There's the Holman with the covers off:

IMG_1289-dsqz.JPEG


The top boards come loose and fold back; removing them is not necessary. That let me power up and check operation every few minutes so that if I made a mistake replacing a cap it would be easier to backtrack.

IMG_1290-dsqz.JPEG


All those gray caps came out. All tested fine, but then they always do until they don't.

IMG_1294-dsqz.JPEG


There were more than these :)

The caps that went back in often have higher voltage ratings and higher temperature ratings. The 85-degree caps were replaced with 105-degree caps, for example, and 16-volt caps were replaced with 25-volt caps. A couple of the 1-microfarad electrolytic caps I replaced with polyester film caps.

Then, I cleaned all the switches and pots with contact cleaner, followed by DeoxIT D5, which provides lubrication for the sliders in the pots.

Holman specifications claim the second harmonic will be 87 dB down from a 2-V incoming signal. They claim that the phono inputs impose less than -75 dB distortion and noise. Let's see how it did.

Holman_FRDistortion.PNG


I set up the preamp to provide unity gain, or at least the detent on the volume knob closest to it. As you can see, total harmonic distortion and noise hovered right around the -87 to -90 dB range over the whole frequency spectrum, until the signal was below 20 Hz. I am also showing the voltage output at each frequency. The oscillator output of the 339A stayed right at 2 volts RMS--that needle never moved a whit as I changed the frequency. This is HP stuff, after all. At each frequency, I recorded the RMS voltage in dBu (dBu aligns the dB scale to 0.775 volts for 0 dB)--no automated test suites for this old equipment, at least not until I get the GPIB interfaces integrated into a computer. I wrote it down on a Big Chief tablet with a Number 2 pencil, just like in first grade. Once integrated with GPIB, I can use scripts made for running test suites. Because the voltage was in dB units, I just subtracted the nominal 8.2 dBu (aka 2 volts) to show the difference at each frequency. The vertical scale is exaggerated; the preamp is flat within a half-dB window, and less than that between 20 Hz and about 18 kHz. This agrees with Amir's measurements.
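The dBu bookkeeping above is just a logarithmic ratio against the 0.775-volt reference; a short sketch of the conversion (standard formula, with a made-up example reading, not my actual bench data):

```python
import math

DBU_REF = 0.7746  # volts RMS at 0 dBu (1 mW into 600 ohms)

def volts_to_dbu(v_rms: float) -> float:
    """Convert an RMS voltage to dBu."""
    return 20.0 * math.log10(v_rms / DBU_REF)

nominal = volts_to_dbu(2.0)   # the 2-V reference level, ~8.2 dBu
reading = volts_to_dbu(1.95)  # hypothetical meter reading at some frequency
print(f"nominal: {nominal:.2f} dBu, deviation: {reading - nominal:+.2f} dB")
```

Subtracting the nominal dBu figure from each reading gives the frequency-response deviation directly, which is exactly the pencil-and-tablet arithmetic described above.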

I conducted a quick test to see if it was easy to saturate the line-input front ends with a signal stronger than the usual 2-V reference maximum that was defined when CDs came out. Prior to that, it was 1 V or less. That postdates this preamp, so it was worth checking. The HP 339A oscillator has a maximum output of 3 volts, and within that range I couldn't find an input voltage that would overload the front end.

Holman_InputSat.PNG


Finally, I checked the phono input. Amir found that the input saturated at something like 70 mV, but I did not find this to be the case at all. On my unit, I was able to drive the phono input to 125 mV before the distortion rose to a point where it wasn't clearly better than the vinyl recordings themselves. My measurements confirmed a phono preamp gain of 36.5 dB--30 mV drives a 2-V output--which is about right for moving-magnet cartridges. The minimum test signal output on the HP equipment was still too high to emulate a moving coil cartridge.

Output increased linearly with input signal until the preamp clipped at its maximum 8.4 volts.

Holman_Phono.PNG


There were lots of questions in the Holman Preamp review thread wondering if the guy who restored it replaced the op-amps and other components. It isn't necessary--this preamp is meeting its specifications with the factory TL072 op-amps all over the inside of it, and those are very good specifications indeed.

Forgive typos--I'm falling asleep where I sit and I have to drive downtown tomorrow. I'll proof it Wednesday and fix the obvious goofs. I've been dozing off for the last half hour, so who knows what I've been typing. But the data speaks with its own voice.

Rick "respectfully submitted" Denney
 
Very nice post all round, rdenney, and a very nice preamp too.
 
Fascinating - thank you. Always nice to see the classic test gear.
 
Very cool. Did you also measure it before the cap replacement? Was there any performance difference?
 
Very cool. Did you also measure it before the cap replacement? Was there any performance difference?
I did, but not nearly as comprehensively. I measured no change in performance.

Rick “replaced the caps for longevity” Denney
 
Call for any measurements anybody wants while it's on the bench. I may have some time this evening to do something.

Also, it interested me to note what good distortion measurement has cost over time. AP doesn't say how much the APx555 costs ("Call for a Quote"), but Amir reported that it was $28,000 back in 2018 when he bought his. Plus, I gather (indirectly) that the analysis software requires an ongoing subscription.

We think that's expensive, because it is. But what did technicians do before computer-based distortion testing?

To measure distortion, one needs a very clean oscillator source as the input and a way to measure the distortion products in the output. And to characterize that distortion the way the AP does, one needs a spectrum analyzer. In 1990, one would need two bits of equipment from someone like Hewlett Packard. HP had cheaper competition, but so does AP--let's compare the quality standard from each period. Actually, to measure distortion in both channels at once, one would need two distortion analyzers.

It's actually not dumb to own both a 339A analog analyzer and an 8903 digital analyzer, and both were in the catalog at the same time. They work a bit differently. Arguably, the analog Wien oscillator in the 339A is cleaner by several dB than the digitally synthesized oscillator in the 8903. But the 8903 has true differential inputs for evaluating devices with balanced interfaces (or for one side of bridged amps while both are operating). Both measure distortion of the same source within a dB of each other, even at this remove of decades.

Another option would be to have an 8903B, which provides the source and analyzer in one unit, plus an 8903E, which provides the analyzer without the source.

In the 1990 catalog, HP published these prices:

339A Distortion Measuring Set: $4225
8903A Audio Analyzer: $6250, plus $210 for the 400Hz high-pass filter (handy for quickly eliminating power-supply hum from the measurement)
8903E Distortion Analyzer: $4235, plus $210 for the filter
3580A Spectrum Analyzer: $9370, plus $270 for balanced inputs

So, whether the second-channel measurement comes from a 339A or an 8903E, the cost adds up to about the same 20.5 kilodollars. In 2018 dollars, that's just shy of $40K.
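The tally above can be checked in a few lines. A sketch summing the catalog prices quoted (taking the 8903E as the second channel; the inflation factor is my rough assumption for 1990 to 2018, not an official CPI figure):

```python
# 1990 HP catalog prices from the list above, options included
prices_1990 = {
    "8903A analyzer + 400 Hz filter": 6250 + 210,
    "8903E second channel + filter": 4235 + 210,
    "3580A spectrum analyzer + balanced inputs": 9370 + 270,
}
total_1990 = sum(prices_1990.values())

CPI_FACTOR = 1.95  # approximate 1990 -> 2018 inflation multiplier (assumption)
print(f"1990 total: ${total_1990:,} "
      f"(~${total_1990 * CPI_FACTOR:,.0f} in 2018 dollars)")
```

That lands right at the 20.5 kilodollars quoted, or roughly $40K in 2018 money, which puts the APx555's price in perspective.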

The good news is that old HP stuff is cheap on the used market (except for that spectrum analyzer), and those with the understanding of how to use them can probably correct most of the age-related issues that an eBay source imposes.

But for those wanting a measurement capability that is more modern than old HP slabs-o-wonderfulness, but much more affordable, the QuantAsylum QA403, at $600 and with a measurement capability down to about -110 dB, is mighty tempting.

Rick "test equipment playing catchup with products capabilities" Denney
 
It seems that you can get various quotes by registering your email address on AP's page.
 

Attachments

  • apx555b01.png