AnalogSteph
Major Contributor
Pretty sure that's a software / processing artifact; I've seen similar when distortion over frequency was being measured. You can try recording some of your noise floor and using the spectrum analyzer function built into RMAA (N = 16384, Kaiser-Bessel window, beta = 20).

Curious. I have a Gen 1. Quite happy with it for speaker testing, but I notice a big difference between your graphs and mine. My noise floor runs from about -120 dB @ 20 Hz, rising in a straight line to about -94 dB @ 20 kHz. You show a flatter noise floor. Is this a feature of the DAC technology, or maybe a feature of my spectrum analyzer (TrueRTA)?
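For anyone who'd rather do the same check outside RMAA, here's a rough Python sketch of that analyzer setup (16384-point segments, Kaiser-Bessel window with beta = 20) using SciPy's Welch estimator. The sample rate and the synthetic "recording" are placeholders, not from this thread; substitute your own captured noise-floor file.

```python
import numpy as np
from scipy.signal import welch

fs = 48000  # assumed sample rate, Hz (placeholder)
# Stand-in for 5 seconds of captured noise floor; replace with your recording.
rng = np.random.default_rng(0)
noise = rng.standard_normal(fs * 5) * 1e-5

# Same configuration as suggested above: N = 16384, Kaiser window, beta = 20.
f, psd = welch(noise, fs=fs, window=("kaiser", 20.0), nperseg=16384)
spectrum_db = 10 * np.log10(psd)  # power spectral density in dB

print(f[0], f[-1], len(f))  # spans 0 Hz to fs/2 with nperseg//2 + 1 bins
```

Plot `spectrum_db` against `f` on a log frequency axis and you should be able to see whether the rising-to-20-kHz slope is in the recording itself or an artifact of the analyzer.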
A/D dynamic range in the current generation is up to 111 dB(A), up from 105 dB(A) in the first one. While that's a nice improvement, I suspect you'll generally run into microphone and room noise first, and skilled use of the input gain dial will make a greater difference.
If you are talking firmly 1980s and earlier, then probably yes. Studio-grade ADCs were reaching 18-bit performance by the early '90s. A few years later, the best ones reached 120 dB, and by the end of the decade SOTA had moved to 130 dB, and 96 kHz recording was basically standard.

If you think it is not good enough for home recording, just think back a few years, when the best we had was a Shure preamp into a Sony quarter-track 7-inch reel-to-reel. This thing is better than what the very best studios had when most of my CD collection was recorded.