My amp is SS (solid state). So the A500 has a much lower output impedance than the manual claims...
One thing that has confused me is the input sensitivity of the A500. The manual claims it is 1.64V, which fits with what Peter Aczel measured back in 2005: ‘the maximum output is a little over 33 volts at the test limit of 1% distortion’ (1.64V x gain of 20). That would make the sensitivity +6.5dBu, according to the online sengpielaudio calculator, but the balanced inputs are labelled as +4dBu. So my confusion is which figure to use (+4dBu or +6.5dBu) when working out how much attenuation to apply, given the maximum output of the source – in your case +22dBu (9.75V), I believe, and in mine +8.7dBu (CD-level of 2.1V)? I realise that some A500 users would not bother with attenuation for a CD-level maximum voltage.
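For reference, all the dBu figures above follow from the standard 0dBu = 0.7746V RMS reference; here is a quick Python sketch of the arithmetic, using the voltages quoted in this thread (the attenuation helper is just my own illustration of "source maximum minus sensitivity"):

```python
import math

DBU_REF = 0.7746  # 0 dBu reference voltage, RMS (sqrt(0.6) V)

def volts_to_dbu(v):
    """Convert an RMS voltage to dBu."""
    return 20 * math.log10(v / DBU_REF)

def attenuation_needed(source_dbu, sensitivity_dbu):
    """dB of attenuation so the source's maximum just reaches full output."""
    return max(0.0, source_dbu - sensitivity_dbu)

print(volts_to_dbu(1.64))              # A500 sensitivity per the manual: ~ +6.5 dBu
print(volts_to_dbu(2.1))               # CD-level source: ~ +8.7 dBu
print(volts_to_dbu(9.75))              # pro-level source: ~ +22.0 dBu
print(attenuation_needed(8.7, 6.5))    # ~ 2.2 dB for the CD-level case
print(attenuation_needed(22.0, 6.5))   # ~ 15.5 dB for the +22 dBu case
```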
My measurements indicated that the A500 has a higher output impedance than the spec, but still low enough for it not to matter.
As to levels, the differences you're concerned about don't seem to me to be that important; a few dBs is neither here nor there, considering that you will pretty much never be driving the amp that hard. Also, I measured mine at 0.02% distortion, i.e. just before any clipping, rather than at 1% THD into clipping, so that will account for some of the difference.
For a CD-level output of around 2V, I wouldn't bother with any attenuation, as depending on what sort of music you listen to, not all CDs are mastered to 0dBFS anyway. Classical and jazz CDs, especially those originally issued in the 1980s and 1990s, still leave some headroom. Even the original Dire Straits CDs have something like 6dB of headroom, so they won't put out more than 1V. Some classical CDs have even more than 6dB of headroom, so the maximum output will vary between masterings.
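To put a number on that: a disc mastered with N dB of headroom peaks N dB below the player's full-scale output. A quick sketch, assuming a 2V full-scale player (the 10dB case is just a hypothetical illustration of a quieter classical master):

```python
def peak_output_volts(full_scale_v, headroom_db):
    """Peak output voltage when the loudest sample sits headroom_db below 0 dBFS."""
    return full_scale_v * 10 ** (-headroom_db / 20)

print(peak_output_volts(2.0, 0))   # mastered to 0 dBFS: 2.0 V
print(peak_output_volts(2.0, 6))   # ~6 dB headroom (early Dire Straits): ~1.0 V
print(peak_output_volts(2.0, 10))  # hypothetical 10 dB headroom: ~0.63 V
```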
Considering how the volume control is set entirely subjectively when listening, a few dBs more or less seems to me unimportant, as long as nothing is being overloaded at one end and it is quite loud enough at the other.
S.