AmadeusMozart
Perhaps some of the more knowledgeable people here can provide an answer (or some insight) for me. To summarize, these seem to be the primary questions:
1. How much phase distortion are we talking about?
2. What frequencies are we talking about?
3. Is this specific amount of phase distortion at these specific frequencies audible?
Using LTspice, I am modelling a single-ended tube amplifier with UL (ultralinear) and CFB (cathode feedback), using only local feedback in the output stage.
All is fine: at 400 Hz I get 0.65% distortion at the onset of grid current (i.e. maximum drive).
But when running the simulation at 50 Hz I get 3.25% distortion.
Reducing the coupling capacitor between the driver (an SRPP) and the output tube from 1 µF to 50 nF brings distortion down to 1%, but the phase shift at 50 Hz goes from 168 degrees to 141 degrees (the amplifier is inverting due to the SRPP); at 400 Hz it is 179 degrees.
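The numbers make sense to me as a first-order high-pass: the coupling capacitor works into the output tube's grid-leak resistor, and the phase lead at any frequency follows from the corner frequency. Here is a minimal sketch of that estimate in Python, assuming a hypothetical 470 kΩ grid-leak resistor (the value in my actual circuit differs):

```python
import math

def rc_highpass_phase(c_farads, r_ohms, f_hz):
    """Phase lead in degrees of a first-order RC high-pass at f_hz."""
    fc = 1.0 / (2.0 * math.pi * r_ohms * c_farads)  # corner frequency
    return math.degrees(math.atan(fc / f_hz))

R = 470e3  # grid-leak resistor (assumed value, for illustration only)
for c in (1e-6, 50e-9):
    for f in (50.0, 400.0):
        lead = rc_highpass_phase(c, R, f)
        print(f"C = {c * 1e9:.0f} nF, f = {f:.0f} Hz: phase lead = {lead:.1f} deg")
```

With these assumed values the 50 nF capacitor alone contributes only about 8 degrees of lead at 50 Hz, so other low-frequency poles in the circuit must account for the rest of what the simulation shows.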
I am not sure, but I suspect that the inductance of the output transformer introduces some phase shift of its own, on top of which the phase shift of the smaller coupling capacitor then adds.
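The transformer's primary inductance against the tube's effective source impedance forms another first-order high-pass, so its low-frequency phase lead simply adds to the coupling capacitor's. A rough sketch of that contribution, with guessed values for the primary inductance and source impedance (neither is a measurement from my circuit):

```python
import math

def lr_highpass_phase(l_henries, r_source_ohms, f_hz):
    """Phase lead in degrees where the primary inductance shunts the
    effective source impedance (a first-order high-pass)."""
    fc = r_source_ohms / (2.0 * math.pi * l_henries)  # corner frequency
    return math.degrees(math.atan(fc / f_hz))

L = 25.0     # primary inductance in henries (assumed value)
Rs = 1500.0  # effective source impedance in ohms (assumed value)
for f in (50.0, 400.0):
    lead = lr_highpass_phase(L, Rs, f)
    print(f"f = {f:.0f} Hz: transformer phase lead = {lead:.1f} deg")
```

With these guesses the transformer adds roughly 11 degrees of lead at 50 Hz and next to nothing at 400 Hz, which would fit the pattern the simulation shows.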
But the conundrum is: which will be more audible, the higher distortion or the larger phase shift?
FWIW, I've read that 2% second-harmonic distortion of a 30 Hz fundamental (i.e. a component at 60 Hz) is actually perceived as louder than the 30 Hz fundamental itself. I don't know how true that is.
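A quick sanity check of that claim is just arithmetic: 2% second harmonic sits about 34 dB below the fundamental, while the ear's sensitivity rises steeply between 30 Hz and 60 Hz, so the question is whether the sensitivity difference can bridge that deficit. A back-of-envelope sketch, using rough hearing-threshold figures for those frequencies (approximate ISO 226 values, assumed from memory rather than looked up):

```python
import math

# Second-harmonic level relative to the fundamental, in dB
hd2_fraction = 0.02  # 2% second-harmonic distortion
deficit_db = -20.0 * math.log10(hd2_fraction)
print(f"2% HD2 sits {deficit_db:.1f} dB below the fundamental")

# Approximate hearing thresholds (assumed, roughly ISO 226 figures)
threshold_30hz = 60.0  # dB SPL near 30 Hz (assumed)
threshold_60hz = 38.0  # dB SPL near 60 Hz (assumed)
advantage_db = threshold_30hz - threshold_60hz
print(f"The ear is roughly {advantage_db:.0f} dB more sensitive at 60 Hz")

# If the sensitivity advantage approaches the level deficit, the 60 Hz
# harmonic can rival the 30 Hz fundamental in perceived loudness.
```

With these rough figures the harmonic closes much of the gap near the threshold of hearing, so the claim seems at least plausible at low listening levels, but I'd be glad to hear from someone who knows the psychoacoustics.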
When I grew up, "HiFi" was defined as true reproduction of sound between 50 Hz and 15 kHz.
Looking forward to your responses, many thanks in advance, and I am now ducking under a chair after showing my ignorance. (wink)