You CAN get read errors, and the drive makes a difference. If you rip CDs, you can use AccurateRip to check for a bit-perfect rip, and sometimes there are errors. It's almost always the CD, but some drives are better than others at ripping marginal discs. There is a statistical database for computer drives, but I don't know of anything similar for audio CD players or "transports".
Most audio players are pretty good at error correction and error hiding. I'm sure you've noticed how rarely you hear a defect, and that when you do, it's a problem with that particular bad/damaged disc while most CDs play fine.
In the end you're listening to analog out of the DAC so there are no "bits" and no "perfection".
... I just scanned through the article, but if, for example, you have a 16-bit test signal which contains a section of “digital zero” for testing the noise floor, this may indeed be resampled – from a 16-bit digital zero to a 20-bit digital zero.
As noted, the intrinsic noise floor for 16-bit audio is 96.3 dB below full scale. For 20-bit audio it's 120.4 dB. That nicely encompasses the actual 115.4 dB measured result.
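Those figures follow from the usual rule of thumb of about 6.02 dB of dynamic range per bit (the numbers below are just that arithmetic, not a measurement):

```python
# Rule-of-thumb quantization noise floor: roughly 6.02 dB per bit below full scale.
def noise_floor_db(bits):
    return 6.02 * bits

print(noise_floor_db(16))  # ~96.3 dB for 16-bit audio
print(noise_floor_db(20))  # ~120.4 dB for 20-bit audio
```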
In the digital domain, a zero is a zero, and a string of zeros is minus infinity dB. There is no noise except the analog noise that comes from the analog side of the DAC (or whatever noise exists in the recording).
You don't get quantization noise until you have at least one bit of data. If you've ever listened to an 8-bit file, you can hear the quantization noise as a "fuzz" riding on top of the signal (at roughly -48 dB). But when there is silence, the quantization noise goes away completely (minus infinity in the digital domain).
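You can see both effects in a quick sketch: quantizing a sine to 8 bits produces a measurable noise floor, while quantizing digital silence produces exact zeros (the `quantize` helper here is my own illustration, not any particular ripper's or DAC's code):

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs
signal = 0.5 * np.sin(2 * np.pi * 1000 * t)  # 1 kHz tone at -6 dBFS

def quantize(x, bits):
    # Round to the nearest step of a signed integer grid (no dither)
    scale = 2 ** (bits - 1)
    return np.round(x * scale) / scale

# Quantization error of the tone at 8 bits
err = quantize(signal, 8) - signal
rms_err = np.sqrt(np.mean(err ** 2))
print(20 * np.log10(rms_err))  # roughly -53 dBFS of quantization noise

# Digital silence stays exactly zero: no quantization noise at all
silence = np.zeros(fs)
print(np.any(quantize(silence, 8)))  # False
```

(The RMS figure comes out a few dB below the -48 dB "one LSB" level because the error averages out to about one step divided by the square root of 12.)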
From what I've read, jitter also sounds like noise, but you can't hear it unless you somehow generate an unusually high amount of it.
Dither is also intentionally added noise, and if the audio is dithered, that noise usually exists even during silence (i.e. the zeros are not zeros). But at 16 bits or better you can't normally hear dither (or the lack of dither, or the effects of dither).
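A minimal sketch of that "zeros are not zeros" point, using common TPDF (triangular) dither added before rounding (again my own illustrative code, not any specific mastering tool):

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, bits):
    # Plain rounding to a signed integer grid, no dither
    scale = 2 ** (bits - 1)
    return np.round(x * scale) / scale

def quantize_tpdf(x, bits):
    # TPDF dither: difference of two uniform randoms, +/-1 LSB peak,
    # added before rounding so the quantization error is decorrelated
    step = 1 / 2 ** (bits - 1)
    dither = (rng.random(x.shape) - rng.random(x.shape)) * step
    return quantize(x + dither, bits)

silence = np.zeros(1000)
print(np.any(quantize(silence, 16)))       # False: undithered silence stays all zeros
print(np.any(quantize_tpdf(silence, 16)))  # True: dithered "silence" contains noise samples
```

The dithered silence rounds to nonzero values (plus or minus one LSB) on a fraction of samples, which is exactly the low-level noise floor that replaces the pure digital zeros, at around -96 dB for 16-bit, well below audibility on normal playback.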