Amir, you're trying to convey that if you precisely measure phase noise in a certain test situation, using a certain method, it's possible to translate that into a particular level of distortion which, by the book, is inaudible - and that's perfectly OK. Years ago I went through the exercise of working out precisely how much amplitude error was effectively created if the DAC "reading" moment occurred a certain time interval away from the correct point - and, amazingly, it worked out just as the specs said. Which meant the obsession with jitter back then was nonsense, and I have never worried about it as a problem in itself.

But that didn't stop me worrying about the sound sometimes not being as good as it should have been. And I have always been interested in some measurement regime that shows a change of behaviour in the DUT that correlates well with audible, subjective performance characteristics. The fact that the numbers don't compute is irrelevant - what matters is that the numbers change in a way that corresponds to what one hears.
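For what it's worth, the back-of-envelope exercise described above can be sketched in a few lines. This is my reconstruction, not the original calculation: for a full-scale sine x(t) = sin(2*pi*f*t), the steepest slope is at the zero crossing, so sampling a time interval dt away from the correct point produces a worst-case amplitude error of roughly 2*pi*f*dt relative to full scale, which converts to an equivalent SNR. The example figures (a 10 kHz tone, 100 ps of jitter) are assumptions for illustration only:

```python
import math

def jitter_error_snr(freq_hz, jitter_s):
    """Worst-case relative amplitude error, and the equivalent SNR in dB,
    for a full-scale sine of freq_hz sampled jitter_s seconds off-time.

    The error peaks at the zero crossing, where the signal slope is
    2*pi*f, so the relative error is ~ 2*pi*f*dt and the equivalent
    SNR is -20*log10(2*pi*f*dt)."""
    rel_error = 2 * math.pi * freq_hz * jitter_s  # fraction of full scale
    snr_db = -20 * math.log10(rel_error)
    return rel_error, snr_db

# Hypothetical figures: 100 ps of timing error on a 10 kHz tone
err, snr = jitter_error_snr(10_000, 100e-12)
# snr works out to roughly 104 dB - comfortably "inaudible by the book",
# consistent with the specs mentioned above.
```

The point of the sketch is just that even fairly coarse timing errors translate into error levels far below audibility thresholds, which is why the measured numbers "worked out just as the specs said".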
I seem to have missed where clocked devices sound better after 15 minutes, at worst after an hour, and then great a day or more later. Does that fit the subjective narrative?