
Does Phase Distortion/Shift Matter in Audio? (no*)

Oh, that was just an example. In general, you should think about the choice between a very 'non-ideal' phase response and one made more or less linear via DSP. This is where you might be able to hear a difference.
It is tricky, but your example points to a minuscule difference in presentation, so it just might be audible.
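To make that choice concrete, here is a minimal sketch of my own (numpy/scipy assumed; the cutoff, filter length and sample rate are arbitrary) that builds a linear-phase FIR low-pass and a minimum-phase filter with roughly the same magnitude, then compares their group delays - the 'linear-ish via DSP' case versus the frequency-dependent one:

```python
# Minimal sketch (my own, using scipy): same-ish magnitude response,
# very different phase behaviour. Cutoff, length and sample rate are
# arbitrary assumptions for illustration.
import numpy as np
from scipy import signal

fs = 48_000
taps_lin = signal.firwin(513, 2_000, fs=fs)    # linear-phase FIR low-pass
taps_min = signal.minimum_phase(taps_lin)      # minimum-phase counterpart

f = np.linspace(100, 1_500, 8)                 # passband frequencies, Hz
_, gd_lin = signal.group_delay((taps_lin, 1), w=f, fs=fs)
_, gd_min = signal.group_delay((taps_min, 1), w=f, fs=fs)

print("linear phase :", gd_lin.round(1), "samples")   # flat at (513-1)/2 = 256
print("minimum phase:", gd_min.round(1), "samples")   # small, frequency dependent
```

The linear-phase version delays everything by a constant 256 samples; the minimum-phase one has a small delay that varies with frequency, which is the kind of 'non-ideal' phase being contrasted here.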
 
Rene, first off let me say that I am appreciative of your efforts to educate us. But I have to say something about your article: I am sure there is something valuable in there, but all that maths makes it impenetrable for those of us who do not have much math literacy. Would you consider another article but written in a more accessible way?
Hmmm, yes, I have considered how to do this without as much math. I do have a long video going over all of the theory, but I guess that requires math knowledge too. I have also offered to hold a YouTube live session regarding phase, where people could join and ask questions, but there was seemingly no interest. So I am not entirely sure there is much to do here, as people may be happy thinking about phase as just a delay and leaving it at that. But I am open to suggestions. I can also talk to audioXpress about doing a simplified version, but perhaps that is out of scope, as I do know companies have been using the article as is to settle disputes. Perhaps some whitepaper, and then an ASR post to upload it? I don't know. Perhaps I can make a video with Amir sometime?
 

You could write the article and create a thread. Yes there is a chance it might get lost, but if you keep linking to it every time someone is confused about phase, I am sure people would read it.
 
I will be looking forward to that. In the meantime, this old two-part YT video is one of the best I have seen at simplifying the concept:
 
One minute into the first one, and it is wrong already. These types of videos typically don't match common signal processing and just add to the confusion, and now people have to unlearn this stuff instead of coming to the topic with an open mind.
 
He did write the article and create a thread, I thought. (I am 99% sure of it.) It's just that half the people, or more like 90+%, just blink when they hear about phase. I think they do not know, and even if they did, they would not care.

He did that one presentation for a few of us here, and maybe reached more (percentage-wise) in the EU.
 
If only it were that easy. I have not yet found a single tool that unwraps phase correctly for all possible impulse responses, and Matlab's unwrap is one of the worst.
I have usually written my own "unwrapping" routine that looks at the slope near the 180 degree points to (try to) decide if it is approaching a cusp or discontinuity and decide how to unwrap. It will not work for "all possible" responses as a true discontinuity can fool it, and frequency resolution must be good enough to properly resolve the slope (I routinely use 100 points/decade as a starting point). Fortunately, infinite Q and ideal impulses (and discontinuities) are not found in the real world, so it works well enough for actual systems.

The most common problem I have run into is when the models are inadequate or extrapolated beyond their intended application, such as capacitors or inductors that do not properly model parasitics well beyond their self-resonant point. That happens a lot in decoupling circuits (power distribution networks) where multi-GHz signals (current and voltage spikes) hit the power rails; I usually had to perform measurements and S-parameter extraction to determine if the power decoupling was adequate.

Never used to worry about that in audio, but SMPS and class-D amplifiers mean more attention must be paid to wideband decoupling. It is not just throwing more small capacitors at the voltage rails; they can actually make the problem worse by adding fairly high-Q resonant peaks with board trace and wiring inductance. Which also makes unwrapping the phase problematic, natch.
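For anyone curious what such a slope-aware unwrapper can look like, this is a small toy version of the idea in Python - my own sketch, not the poster's actual routine: at each jump of roughly 360 degrees it checks whether the local slope was already heading toward the ±180 degree boundary before deciding to unwrap.

```python
# Toy slope-aware phase unwrapper (illustrative sketch only).
# Idea: only add +/-360 deg at a big jump if the local slope was already
# heading toward the wrap boundary; otherwise treat it as a cusp and leave it.
import numpy as np

def unwrap_slope_aware(phase_deg):
    """phase_deg: wrapped phase in degrees on a sufficiently fine frequency grid."""
    out = np.asarray(phase_deg, dtype=float).copy()
    offset = 0.0
    for i in range(1, len(out)):
        step = (out[i] + offset) - out[i - 1]        # jump seen with current offset
        if abs(step) > 180.0:
            # slope estimated from the two previous, already-corrected points
            slope = out[i - 1] - out[i - 2] if i >= 2 else 0.0
            # genuine wrap: phase was already heading toward the boundary
            if slope == 0.0 or np.sign(slope) != np.sign(step):
                offset -= 360.0 * np.sign(step)
        out[i] = out[i] + offset
    return out

# Example: a steadily falling phase that wraps once at -180 degrees
print(unwrap_slope_aware([-170.0, -175.0, -179.0, 178.0, 174.0]))
# -> [-170. -175. -179. -182. -186.]
```

As the post says, this assumes the frequency grid is fine enough (the 100 points/decade starting point) for the local slope to mean anything, and a true discontinuity can still fool it.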
 

I watched that video and I did not see anything wrong with it, which tells me that there is a problem. I am aware that there is something more to it than my current understanding. If that video seems to be correct, it means my understanding of phase is wrong. This is why an explanation which is easy to understand is sorely needed.

There is nothing wrong with unlearning, IMO. It is something I do all the time. In fact it's something we should all do all the time.
 
He talks about wavelength from the get-go. What wavelength? What wave? This is signal processing, not physics. 360 deg is not 0 deg delayed by a wavelength, at least not for the relevant phasor phase; they are the same phase. He talks about the frequency axis as if it were a coordinate axis related to some distance. He talks about the delay being one wavelength for the graphs, but since wavelength varies with frequency, how can he then at the same time have a constant delay? If it is not constant, there is no longer a single delay to talk about, as you now need to introduce phase delay and group delay. This is an engineering approach to understanding the mathematics via measurements, and it comes at a cost.
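For reference, since that last distinction is the crux, these are the standard textbook definitions (my addition; the video does not state them) for a system with phase response φ(ω):

```latex
% Standard definitions, assuming the usual phasor phase \phi(\omega):
\tau_p(\omega) = -\frac{\phi(\omega)}{\omega} \quad\text{(phase delay)},
\qquad
\tau_g(\omega) = -\frac{\mathrm{d}\phi(\omega)}{\mathrm{d}\omega} \quad\text{(group delay)}.
```

Only for linear phase, φ(ω) = −ωτ₀, do the two collapse to the same constant τ₀; otherwise "the delay" depends on frequency, which is exactly why a single wavelength-based delay cannot describe the whole graph.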
 
Sign changes are a (*(*&(* to unwrap when you have a zero in the middle.
 
And also assuming microphone readings are accurate. According to this interesting (to me at least) video, they are most likely not even close:


A summary of the video for the lazy:

1. HPF and AC Coupling (DC blocking) add positive low frequency phase shift.
2. Speakers/amps can have electrical responses that might not have acoustically obvious markers.
3. Oscilloscope phase shift to right = negative (downward) FFT phase shift = lateness or wave reshaping.
4. Speaker drivers naturally get “behind” at higher frequencies, up to -180º of phase shift.
5. Sinusoidal sound waves have a +180º phase shift relative to the motion of the driver.
6. Test condenser mics map sound pressure to voltage faithfully.
7. Some mics are polarity flipped, and often have HPF and AC Coupling that adds positive shift.
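To put a rough number on item 1 in that summary (a sketch of my own, not from the video; the 20 Hz corner is an assumption): the phase of a first-order high-pass, the usual AC-coupling model, approaches +90 degrees well below the corner and 0 degrees well above it.

```python
# Sketch (my own): phase of a first-order analog high-pass / AC-coupling
# stage, H(s) = s / (s + 2*pi*fc). The 20 Hz corner is an assumption.
import numpy as np
from scipy import signal

fc = 20.0
b, a = [1.0, 0.0], [1.0, 2 * np.pi * fc]      # H(s) = s / (s + 2*pi*fc)
f = np.array([2.0, 20.0, 200.0, 2000.0])      # test frequencies, Hz
_, h = signal.freqs(b, a, worN=2 * np.pi * f)
print(np.degrees(np.angle(h)).round(1))       # ~ [84.3, 45.0, 5.7, 0.6] degrees
```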
 
About amps down low: a test that I have done probably more than a hundred times to figure out what happens.
Both amps were well, VERY well, into their linear range, confirmed by electrical measurements. Both were powered by PSUs that could cover six of them.
But...

(attachments: test1.jpg, test2.jpg)

I don't have the phase plots right now, but around the 40s, where one of the amps struggles a little, there is always a wrap too, and it disappears when the stronger one is in place.
(Way above my pay grade to interpret it.)
 

Thank you for your response. The part I would like to discuss is your closing point, that this engineering approach "comes at a cost". In the real world, sounds have wavelengths, and phase rotates in a 3D fashion as shown in OCA's illustration of Heyser's corkscrew. DSP and signal processing is an abstraction of physics - it represents a 3D phenomenon in 2D.

We have already seen that it comes with a cost - unwrapped phase errors.

IMO there is nothing wrong with abstractions, provided the abstractions actually model real world behaviour. None of us have a problem with digital 1's and 0's because that particular abstraction perfectly recreates a real waveform. This is the part I do not understand - what is the cost of this abstraction? Are there real world penalties to misunderstanding phase - i.e. applying our understanding of physical phase to signal processing? To be honest, I understand phase (in physics) but I simply applied my understanding to signal processing. I assumed that what I see on my screen is what is actually happening. Is this wrong?
 
I am at a conference this week so will get back to you in a week.
 
That video reminds me of this thread, where I learned how wacky things get when a driver is sloshing into free air: https://www.audiosciencereview.com/...ich-way-does-a-loudspeaker-driver-move.42697/

I get the impression that condenser mics used for measurements are actually fairly accurate. I don't know about other kinds of microphones that might be used when recording music. I think you showed me in one of your videos to phase-correct the calibration file for the microphone. I wonder if that's really necessary. Is the condenser mic not a minimum-phase device? It would seem that EQing its response flat should also correct its phase response.
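On the minimum-phase question: if the mic really is minimum phase, its phase is fully determined by its magnitude, which is why flattening the magnitude would also flatten the phase. A toy sketch of that magnitude-to-phase link (my own, using the standard real-cepstrum construction and a simple first-order system; nothing from the thread):

```python
# Sketch (my own): for a minimum-phase system the phase follows from the
# magnitude via the real-cepstrum (Hilbert-transform) relation.
import numpy as np
from scipy import signal

b, a = [1.0], [1.0, -0.5]                     # simple minimum-phase example system
_, h = signal.freqz(b, a, worN=1024, whole=True)

cep = np.fft.ifft(np.log(np.abs(h))).real     # real cepstrum of log|H|
n = len(cep)
fold = np.zeros(n)
fold[0] = cep[0]
fold[1:n // 2] = 2.0 * cep[1:n // 2]          # keep only the causal part
fold[n // 2] = cep[n // 2]
phase_from_mag = np.imag(np.fft.fft(fold))    # phase reconstructed from |H| alone

print(np.allclose(phase_from_mag, np.angle(h), atol=1e-6))   # True
```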
 
We have already seen that it comes with a cost - unwrapped phase errors.
That's not a "cost", it's just mathematics. One could use a basis other than Fourier and see different results.

None of us have a problem with digital 1's and 0's because that particular abstraction perfectly recreates a real waveform. This is the part I do not understand - what is the cost of this abstraction?

Which abstraction? If you mean representing delay as unwrapped phase, I don't see any cost. Bear in mind that there is a sampling issue with calculating phase just like there is with anything else.

Sample properly and you're good. I'm still waiting to hear from another poster why there is any issue with loudspeakers that makes measurement hard. I don't see any difficulty with measurement. Now, with loudspeakers, yes, there are all sorts of issues, but measurement is not impacted; you just might stare in horror at the actual measurements. With some systems, that's a perfectly reasonable reaction, too.
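One concrete version of that sampling point (my example, not the previous poster's): the FFT phase of a pure delay only unwraps correctly if adjacent bins are spaced finely enough that the phase step between them stays under 180 degrees, i.e. the delay must be less than half the analysis length.

```python
# Sketch (my own): FFT phase of a pure delay, unwrapped on a coarse vs a
# fine frequency grid. The delay of 300 samples is an arbitrary assumption.
import numpy as np

delay = 300
for nfft in (512, 4096):                      # coarse vs fine grid
    x = np.zeros(nfft)
    x[delay] = 1.0                            # impulse delayed by 300 samples
    phase = np.unwrap(np.angle(np.fft.rfft(x)))
    est = -phase[-1] / np.pi                  # unwrapped phase at Nyquist = -pi*delay
    print(nfft, "point FFT -> recovered delay:", round(est, 1), "samples")
# 512-point FFT: phase steps of ~2*pi*300/512 > pi per bin, unwrap fails (-212.0)
# 4096-point FFT: steps well under pi, unwrap recovers 300.0
```

With the coarse grid the recovered "delay" aliases to delay minus nfft; with the finer grid it comes out right.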
 
Now, with loudspeakers, yes, there are all sorts of issues, but measurement is not impacted, you just might stare in horror at the actual measurements. With some systems, that's a perfectly reasonable reaction, too.
One more reason to not check ASR whilst eating lunch; spewing food all over the monitor is such a PITA to clean.

(I tried to come up with some witty phase reference to make it relevant but gave up.)
 