I've been digging through ADCs old and new lately, which had me wondering:
What's the real SOTA in audio ADCs?
Nowadays, if you want to buy a fancy ADC, you might choose e.g. an RME ADI-2 Pro FS R Black Edition (quite the mouthful), which sports these ADC specs:
Input sensitivity switchable +24 dBu, +19 dBu, +13 dBu, +4 dBu @ 0 dBFS
Signal to Noise ratio (SNR) @ +13/19/24 dBu: 120 dB RMS unweighted, 124 dBA
Signal to Noise ratio (SNR) @ +4 dBu: 119 dB RMS unweighted, 123 dBA
THD @ -1 dBFS: -116 dB, 0.00016 %
THD @ -10 dBFS: -125 dB, 0.000056 %
Which is absolutely first-rate, of course - I think I've seen one other ADC with a -117 dB THD spec, but that's about it. Some ADCs have a rated dynamic range of up to 130 dB(A) these days, but the RME is anything but inflexible: its switchable input sensitivity lets it cover a total range of 139 dB / 143 dB(A). Trying to get 120 dB or more through a studio seems like a bit of a challenge anyway.
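For what it's worth, here's how I arrive at that 139 dB / 143 dB(A) figure from the specs above - my own back-of-the-envelope reading, not an official RME breakdown: take the clip point of the least sensitive range and the noise floor of the most sensitive one.

```python
# Total range covered by switching input sensitivity
# (my interpretation of the RME specs quoted above):
top_dbu = 24                    # clip point of the +24 dBu @ 0 dBFS range
noise_unweighted = 4 - 119      # noise floor of the +4 dBu range, dBu unweighted
noise_a_weighted = 4 - 123      # same, A-weighted

print(top_dbu - noise_unweighted)   # 139 (dB)
print(top_dbu - noise_a_weighted)   # 143 (dB(A))
```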
Now what if I told you that there is an ADC released in 1998 with the following specs, among others:
Dynamic range or signal-to-noise ratio (input sensitivity: +28 dBu = 0 dBFS, 1 kHz @ -60 dBFS):
typical: 131.5 dB (unweighted RMS)
worst case:
>129.0 dB (CCIR-RMS)
>131.0 dB (A-weighted RMS)
>129.0 dB (unweighted RMS)
Dynamic range or signal-to-noise ratio (input sensitivity: +18 dBu = 0 dBFS, 1 kHz @ -60 dBFS):
typical: 128.0 dB (unweighted RMS)
worst case:
>127.0 dB (CCIR-RMS)
>129.0 dB (A-weighted RMS)
>127.0 dB (unweighted RMS)
Total harmonic distortion and noise (1 kHz @ -1 dBFS):
typical: <-108.0 dBFS (0.00045%) (unweighted RMS)
worst case: <-105.0 dBFS (0.00063%) (unweighted RMS)
Intermodulation distortion: <-90 dB
Note: The intermodulation test from AES17 uses 18 kHz and 20 kHz signals at -6.03 dBFS each. The result is the ratio of the total output RMS signal level to the RMS sum of the 2nd and 3rd order modulation products at 2 kHz and 16 kHz.
Spurious aharmonic levels: <-130 dBFS (1 kHz @ -1 dBFS)
Note: Highest level of any aharmonic spurious component
Any harmonic levels:
<-112 dBFS, signals to -1 dBFS
<-130 dBFS, signals below -20 dBFS
<-140 dBFS, signals below -60 dBFS
Note: Highest level of any harmonic distortion component

Clearly the distortion department has still made some progress since then, although even those numbers would still be very good, and at -60 dBFS a good modern converter would probably show basically no harmonics at all. It's also 24/96 "only" (the latest and greatest back then). But an instantaneous dynamic range of over 130 dB unweighted is nothing short of remarkable. That's DC-to-daylight level. Clearly no typical monolithic CMOS ADCs in this one (about the best one you could get back then was the Crystal CS5396, 117 dB unweighted with a following wind).
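To put that 131.5 dB in perspective, here's a quick calculation of the input-referred noise it implies at the +28 dBu setting - my own arithmetic, assuming the usual 0 dBu = 0.7746 V rms, not a Prism Sound figure:

```python
# Input-referred noise implied by 131.5 dB unweighted below a +28 dBu full scale:
fs_dbu = 28.0
dr_db = 131.5
noise_dbu = fs_dbu - dr_db                       # -103.5 dBu
noise_uv = 0.7746 * 10 ** (noise_dbu / 20) * 1e6
print(noise_dbu, round(noise_uv, 1))             # -103.5 dBu, ~5.2 uV rms over the audio band
```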
What is it? Meet the Prism Sound Dream AD-2. I'm not sure what these cost back then, but you can buy one for something like 9 grand now. I would certainly love to interview the designers; they clearly were very proud of their product (and deservedly so).
So can you do better these days? After all, IC-based ADCs have caught up, and several are now sporting up to 130 dB as well. Possibly. The (slim) specs for a Weiss ADC2 indicate the following at +26 dBu in:
THD+N at 1 kHz:
Less than −103 dBFS at −3 dBFS output level, unweighted
SNR at −40 dBFS input:
Higher than 110 dB unweighted

So that might indicate a dynamic range of about 150 dB, unless I am misinterpreting things. (Which would be super high but definitely not impossible at +26 dBu, though I suppose internal levels may be lower than that.) Makes you wonder whether you can actually get converters this good these days or whether it's some sort of composite setup.
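In case the inference isn't obvious, here's the arithmetic behind my "about 150 dB" estimate, assuming the noise floor doesn't move with signal level:

```python
# Dynamic range implied by the Weiss SNR spec (my inference):
snr_db = 110          # "higher than 110 dB unweighted"
signal_dbfs = -40     # measured with a -40 dBFS input
noise_floor_dbfs = signal_dbfs - snr_db   # below -150 dBFS
print(-noise_floor_dbfs)                  # 150 -> roughly 150 dB referred to full scale
```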
Speaking of which, composite ADCs seem to date back to at least 1994:
Studio Sound, March 1995
DG 4D goes to third generation
The Deutsche Grammophon Recording Centre has developed a third generation upgrade of the Stage Box system central to the 4D recording chain. All recordings made by the Recording Centre since October 1994 have used the new DG AD III technology, whose convertors feature the new Crystal CS5390 delta-sigma 20-bit A-D convertor ICs to provide 23-bit digital-floating delta-sigma A-D conversion. The process employs two 20-bit convertors, one handling the input signal at unity gain and the other operated with 18dB gain. A sophisticated DSP algorithm regulates the crossfade between the two convertors, producing three bits of supplementary resolution. The DSP program was modified to allow the DSP chip to handle 20-bit convertors at its inputs and a 24-bit wordlength at its outputs. Quoted specifications include THD+n of -121dBFs with an input of 997Hz at -30dBFs and linearity errors within 1dB down to -135dBFs, together with a largely flat noise-spectrum. A further improvement is the development of the Authentic Clock Recovery system, permitting superior reconstruction of the master clock signal under real world operating conditions such as long cable runs and numerous interconnected PLLs, where phase modulation of the clock, jitter, becomes a limiting factor on overall system performance. Because Authentic Clock Recovery uses crystal PLLs driven at 512Fs, as opposed to the current 256Fs standard, A-D conversion at up to 96kHz is possible, with full oversampling capability.
Deutsche Grammophon, Germany. Tel: +49 4044 181115.

The CS5390, by itself, was a 48 kHz 20-bit delta-sigma ADC rated at 110 dB(A) dynamic range and -100 dB THD+N, released in October 1993. I didn't think you could just clock one twice as fast and expect it to work, but apparently it did (I suppose some debugging in the lab may have been quite necessary).
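For illustration only, here's a minimal sketch of the gain-ranging idea the article describes - this is not DG's actual (and presumably far more sophisticated) DSP, just the basic principle: digitize the same signal at unity gain and at +18 dB, use the boosted path while it is safely below clipping, and crossfade to the unity-gain path near full scale. The 18 dB of gain is where the "three bits of supplementary resolution" comes from (18 / 6.02 ≈ 3).

```python
import numpy as np

# Toy composite / gain-ranging combiner (illustrative sketch, NOT the DG 4D algorithm).
GAIN_DB = 18.0
GAIN = 10 ** (GAIN_DB / 20)        # ~7.94x analog gain in front of the "hot" ADC

def combine(unity_path, boosted_path, knee=0.7):
    """unity_path: samples from the unity-gain ADC (full scale = 1.0);
    boosted_path: samples of the same signal from the +18 dB ADC."""
    unity_path = np.asarray(unity_path, dtype=float)
    boosted_path = np.asarray(boosted_path, dtype=float)
    # Crossfade weight: 1.0 while the boosted ADC is comfortably below clipping,
    # ramping to 0.0 as it approaches full scale. A real design would smooth this
    # over time to avoid modulation artefacts at the transition.
    w = np.clip((1.0 - np.abs(boosted_path)) / (1.0 - knee), 0.0, 1.0)
    return w * (boosted_path / GAIN) + (1.0 - w) * unity_path

# Quiet signals come out of the low-noise boosted path, loud ones from the unity path.
t = np.arange(48000) / 48000.0
quiet = 1e-3 * np.sin(2 * np.pi * 997 * t)
out = combine(quiet, np.clip(quiet * GAIN, -1.0, 1.0))
```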
On a side note, I found out that even kilobuck ADCs do not necessarily sport a decent, halfway comprehensive set of specs, especially these days. The Mytek Brooklyn ADC's manual has got to be one of the worst offenders: it mentions a 130 dB dynamic range and a <1 ps clock jitter spec, but no levels at all except for a mysterious "headroom: 13-20" (does that mean a +17 dBu to +24 dBu 0 dBFS level, i.e. 13-20 dB of headroom above +4 dBu nominal?). Relevant information is also strewn across the whole thing. This is in stark contrast to, say, RME manuals, or what you see in the '90s.