Addicted to Fun and Learning
- Dec 6, 2018
- Adelaide Australia
While a non-oversampling DAC could objectively reach exactly the voltage levels expected from the digital samples at exactly the right points in time, it's clear that if we had recorded more samples in between, they wouldn't typically have the same level as the previous or next sample, so the need for filters makes sense to me.
But then different filters produce different results from the same input - which result is the most accurate? Apparently there's a hypothetical correct answer, mathematically speaking (ideal sinc function?), but we can currently only approximate it. The Chord M Scaler + Dave DAC combination supposedly comes very close to it, but introduces a latency of up to 600 ms. I don't quite understand why the correct value of an interpolated sample depends on what happened more than half a second ago, when even super-low 10 Hz bass has completed six full cycles by then. I'm far from an expert on the matter; it's just counter-intuitive.
There is a lot here.
A NOS DAC has no special capability with respect to exact voltages; it suffers from exactly the same issues as a DAC running at any other rate.
The core issue is that no real DAC creates an infinitesimally short output pulse, whereas an ideal ADC samples at an infinitesimally short instant. An ideal DAC with an output filter creates sinc-function pulses, and these overlapping sincs sum to exactly the bandwidth-limited input the ideal ADC sampled. A sinc function is of infinite length, so in theory you need to sum every one of the sinc functions at every sample point in order to assemble the signal, which for practical purposes means you are going to need a long latency. Even 600 ms isn't enough. Everything is an approximation.
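To make the "sum of overlapping sincs" idea concrete, here is a minimal pure-Python sketch of truncated Whittaker-Shannon reconstruction. The sample rate, tap count, and test tone are arbitrary choices for illustration, not anything a particular DAC does:

```python
import math

def sinc(x):
    """Normalized sinc: sin(pi*x) / (pi*x)."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(samples, t, fs, taps=64):
    """Truncated Whittaker-Shannon reconstruction at time t (seconds).
    Only `taps` samples either side of t contribute; the ideal filter
    would need every sample ever taken, i.e. infinite latency."""
    n0 = int(t * fs)
    lo, hi = max(0, n0 - taps), min(len(samples), n0 + taps + 1)
    return sum(samples[n] * sinc(t * fs - n) for n in range(lo, hi))

fs = 48000
f = 1000.0
samples = [math.sin(2 * math.pi * f * n / fs) for n in range(fs)]  # 1 s of a 1 kHz sine

t = 1000.5 / fs                        # a point exactly between two samples
approx = reconstruct(samples, t, fs)   # truncated sinc sum
exact = math.sin(2 * math.pi * f * t)  # what the bandlimited signal really is there
```

Even with only 64 sinc lobes summed on each side, the interpolated value lands very close to the true waveform, which is exactly why the truncation question becomes one of diminishing returns.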
However the question then becomes: does it matter? How quickly do these effects vanish below the noise floor, where they are impossible to actually find? A lot of this navel-gazing is done without considering the impact of noise, something that is an intrinsic part of the real universe rather than the idealised one where such obsessing makes sense. A sinc function dies away very quickly. This is important. 600 ms is actually insanely over the top; there is no useful information out there, and the whole idea is just bragging rights. The limits to the amount of information actually present are clear, and in reality it just doesn't matter.
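To put a rough number on "dies away very quickly": the sinc tail envelope falls as 1/(pi*n), so a sample 600 ms away from the point being interpolated contributes at roughly -98 dB, which is already below the approximately -96 dB quantization floor of 16-bit audio. A back-of-envelope check (the rates are just the CD figures, for illustration):

```python
import math

fs = 44100                      # CD sample rate
latency_s = 0.6                 # the 600 ms figure quoted above
n = latency_s * fs              # distance in samples from the interpolated point
envelope = 1.0 / (math.pi * n)  # |sinc(n)| <= 1/(pi*n) for n > 0
db = 20 * math.log10(envelope)
print(f"{db:.1f} dB")           # roughly -98 dB, under the ~-96 dB 16-bit floor
```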
You need a reconstruction filter; that is an absolutely iron-clad reality. What is also iron-clad is that real-life filters are never ideal. There is always a compromise, and that compromise is fluid between the digital and analog domains, giving the designer the freedom to implement in the most effective domain. The NOS adherents don't really get this: they somehow believe that everything possible should be done in the analog domain, when the modern reality is that we have such huge capabilities available in the digital domain that it makes no sense to do things in the analog domain if you can avoid it. So NOS DACs are intrinsically compromised by the limitations of implementing a real-world analog reconstruction filter that works in the limited bandwidth they have to spare. Or worse, they somehow think that huge amounts of aliasing into the audio band yields a better result, and they compromise the reconstruction filter.
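As a sketch of what moving the work into the digital domain looks like: an oversampling DAC zero-stuffs the samples and applies a sharp digital lowpass at the high rate, which pushes the spectral images far above the audio band so the analog filter that follows can be simple and gentle. A minimal pure-Python illustration, with arbitrary rates and tap counts (not any particular DAC's design):

```python
import math

def lowpass_taps(cutoff, ntaps):
    """Hamming-windowed-sinc lowpass FIR; cutoff is a fraction of the sample rate."""
    mid = (ntaps - 1) / 2
    taps = []
    for i in range(ntaps):
        x = i - mid
        ideal = 2 * cutoff if x == 0 else math.sin(2 * math.pi * cutoff * x) / (math.pi * x)
        window = 0.54 - 0.46 * math.cos(2 * math.pi * i / (ntaps - 1))
        taps.append(ideal * window)
    return taps

L = 4                                  # oversampling ratio
fs = 48000
sig = [math.sin(2 * math.pi * 1000 * n / fs) for n in range(1000)]

# Zero-stuff to L*fs: images of the audio spectrum now sit near multiples
# of fs, far above the audio band, where a gentle analog filter can catch them.
up = []
for s in sig:
    up.append(s * L)                   # gain of L makes up for the inserted zeros
    up.extend([0.0] * (L - 1))

# The sharp filtering happens digitally, at the high rate.
taps = lowpass_taps(0.5 / L, 255)      # passband up to the original Nyquist, fs/2
out = [sum(taps[k] * up[n - k] for k in range(len(taps)) if n - k >= 0)
       for n in range(len(up))]
# `out` is the smooth 4x-rate waveform, delayed by (255 - 1) / 2 samples.
```

The design trade-off the paragraph describes is visible here: the 255-tap filter would be hopeless to build in analog parts, but is trivial in the digital domain, leaving the analog stage almost nothing to do.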