I posted this elsewhere, but I am highlighting it in its own thread given the amount of misinformation on both sides of the fence on this topic:
----
"The other really great thing about a digital system like HDMI is that digital signals don't degrade. A digital system takes a signal, and reduces it to a series of bits - signals that can be interpreted as 1s and 0s. That series of bits is divided into bundles called packets. Each packet is transmitted with a checksum - an additional number that allows the receiver to check that it received the packet correctly. So for a given packet of information, you've either received it correctly, or you didn't. If you didn't, you request the sender to re-send it. So you either got it, or you didn't. There's no in-between. In terms of video quality, what that means is that the cable really doesn't matter very much. It's either getting the signal there, or it isn't. If the cable is really terrible, then it just won't work - you'll get gaps in the signal where the bad packets dropped out - which will produce a gap in the audio or video."
The HDMI specification is confidential; you have to become a member to know what is in it. As such, a lot of folklore has been created around what it is and isn't, and the above is one example. Unfortunately, its author makes some serious technical errors: he is confusing HDMI with networking protocols. HDMI does not work that way at all.
HDMI is a real-time stream of data. Most of the time, what it sends is the value of the video pixel to be displayed, and each piece of data arrives just in time to be displayed. It is not part of a "packet," nor is it covered by any checksum. If the data comes across wrong, it gets displayed wrong, showing up as sparkles, hashes, etc. If we had checksums, the receiver could put up an error. We don't see that because there is no checksum to validate the data: just about any value, right or wrong, could be the real deal from the receiver's point of view.
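A toy sketch of that point in Python (illustrative only, not the HDMI wire format, and the error rate is invented): with no checksum on the pixel stream, a flipped bit simply becomes a different but perfectly plausible pixel value, and the receiver has no basis to reject it.

```python
import random

def transmit(pixels, bit_error_rate=1e-5):
    """Flip random bits in a stream of 8-bit pixel values."""
    received = []
    for value in pixels:
        if random.random() < bit_error_rate:
            value ^= 1 << random.randrange(8)  # one corrupted bit
        received.append(value)
    return received

frame = [128] * 1_000_000           # a flat gray frame, one 8-bit channel
shown = transmit(frame)
sparkles = sum(1 for v in shown if v != 128)
print(f"{sparkles} wrong pixels displayed as-is; nothing flags them as errors")
```

Every corrupted value is still a legal pixel, so it goes straight to the screen as a sparkle.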
When HDMI gets to the end of a video line, it switches to sending auxiliary data. One of those auxiliary data types is audio. Audio does have a checksum, because if you try to output corrupted audio data, you could produce DC or other severe noise that could damage equipment or make you go deaf. If the checksum indicates data corruption, however, there is no retransmission, unlike in most networking protocols. The sound will most likely mute, and we go about our business.
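Here is a minimal sketch of that receive-side behavior, assuming an invented packet layout and checksum (the real HDMI data-island format differs): on a failed check the receiver substitutes silence and moves on; there is no path for asking the source to resend.

```python
def checksum(data: bytes) -> int:
    """Invented 8-bit checksum for illustration; HDMI's actual protection differs."""
    return sum(data) & 0xFF

SILENCE = b"\x00" * 8                   # four 16-bit samples of digital silence

def handle_audio_packet(payload: bytes, received_checksum: int) -> bytes:
    # On a bad check the receiver does NOT request a resend; it mutes
    # (outputs silence) and moves on to the next packet in the stream.
    if checksum(payload) != received_checksum:
        return SILENCE
    return payload                      # good data: hand the samples to the DAC
```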
In no case will the system try to recover lost data. The time for displaying that pixel or playing that snippet of sound has come and gone; the receiver has done what it can with it and has moved on. It can't go back in time and fix a glitch two frames behind what it is displaying now.
As Opus mentioned, he is also wrong about the nature of the transmission. Capturing the data is one thing; knowing when to output it is another. The latter is the timing for the audio/video samples. For video this is no problem, because each pixel location is digital in nature in today's displays: when told to light up pixel 105 as red, we know where that is regardless of whether the data for it arrived 0.1 of a pixel period sooner or later.
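A sketch of why that is (illustrative Python; the resolution constant is just an example): the display derives each pixel's position by counting values after sync, so position is a count, not an arrival time, and small clock jitter cannot move a pixel.

```python
H_ACTIVE = 1920                          # pixels per line (1080p, for example)

def place_pixel(count_since_vsync: int, value: int, framebuffer) -> None:
    """Position comes from counting values since the last sync, not from
    when each value arrived, so clock jitter cannot relocate a pixel."""
    row, col = divmod(count_since_vsync, H_ACTIVE)
    framebuffer[row][col] = value        # value number 105 always lands at column 105
```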
For audio, it is a different animal altogether. The receiver must extract timing from the incoming data in order to create the clock for its DAC. Any vagaries in that process will cause jitter, and lots of it in the case of HDMI. Here is a comparison of the HDMI and S/PDIF inputs on the same AVR, and hence the same DAC:
This shows the *analog* output of the AVR's DAC. Purple is S/PDIF; yellow is HDMI. Identical digital data was sent to each input, yet what came out of the DAC was much more distorted in the case of HDMI.
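For those wondering how the receiver extracts that timing: HDMI's audio clock regeneration (ACR) scheme is widely documented, where the source sends a pair of values, N and CTS, and the sink rebuilds the audio clock from the video (TMDS) clock. A sketch of the arithmetic (the N/CTS values below are the standard ones for 48 kHz audio at a 148.5 MHz TMDS clock; the code itself is illustrative):

```python
def regenerated_sample_rate(f_tmds_hz: float, n: int, cts: int) -> float:
    """HDMI ACR relationship: 128 * fs = f_TMDS * N / CTS."""
    return f_tmds_hz * n / (cts * 128)

f_tmds = 148_500_000                 # 1080p60 TMDS clock
n, cts = 6144, 148_500               # standard values for 48 kHz audio
print(regenerated_sample_rate(f_tmds, n, cts))                      # 48000.0
# Any error in the sink's measurement of the TMDS clock lands on fs directly:
print(regenerated_sample_rate(f_tmds * (1 + 50e-6), n, cts))        # ~48002.4
```

Because the audio clock is slaved to the video clock this way, any noise in that recovery shows up as jitter at the DAC, which is consistent with the scope capture above.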
Now, all of this said, it is not clear that HDMI cables can influence this picture much. I ran some quick tests and could not make this output change meaningfully when using a short HDMI cable versus a very long one. But the possibility exists.
This got long, so I should make it its own thread elsewhere.
----