Great, but you won't find that by swapping cables about.
This also falls below standards. No one here denies slight measurable differences between cables when you dig deep into capacitance and such things; they just don't tend to affect audio performance. You're welcome to prove otherwise, but please, no more of this kind of thing.

Ok. My last post in this thread...
When it comes to cables, there are measurable differences in their ability to get the input signal to the output end, in how they shape the signal on the way between source and receiver, and in how that affects the soundwave-producing element. This is not my opinion; it is fact, established with scientific methods by people I trust far more than anyone at this forum. This may or may not be a shock to some members here, but cables actually make a difference in an audio system's performance and reproduction abilities, and you cannot just measure capacitance and inductance and claim there's no difference. There's no science in that.
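To put rough numbers on the capacitance-and-inductance point, here is a back-of-the-envelope sketch. The cable values and source impedance are assumed typical figures, not measurements from anyone in this thread: it estimates where the RC low-pass corner formed by cable capacitance and source output impedance lands relative to the audio band.

```python
import math

# Assumed, typical values (not measurements from this thread):
CAP_PER_METER = 100e-12    # ~100 pF/m, common for coaxial interconnects
SOURCE_IMPEDANCE = 100.0   # ohms, a typical line-level output impedance
LENGTH_M = 3.0             # a fairly long interconnect run

# Cable capacitance against the source impedance forms a first-order
# RC low-pass filter: f_c = 1 / (2 * pi * R * C).
c_total = CAP_PER_METER * LENGTH_M
f_corner = 1.0 / (2.0 * math.pi * SOURCE_IMPEDANCE * c_total)

print(f"Total cable capacitance: {c_total * 1e12:.0f} pF")
print(f"-3 dB corner frequency:  {f_corner / 1e6:.1f} MHz")
# Prints ~5.3 MHz, more than two decades above the 20 kHz audio band:
# measurable with instruments, but not a plausible audible difference.
```

Under these assumptions the difference is real and measurable, which is consistent with both sides here: it exists, and it sits far outside the audio band.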
My only point is, don't piss on people who are trying to get the best out of THEIR sound system with all means available.
And last, I really like comments that add to the discussion with different perspectives. To the troll lot: please eat the cyanide capsule now, the cometh is coming...
@Harmonie, I think you're missing the whole point of Audio there.
I've already repeated these basics over and over, but to oversimplify:
- Hi-Fi is about Audio Reproduction.
- Audio Reproduction ≠ Audio Production
- Audio Production (Music) IS Art. Therefore, it is subjective. (Anything emotionally related belongs there, not elsewhere.)
- Audio Reproduction ISN'T. It's all about technical engineering and Science. As such, it is objective.
- Audio Reproduction should ideally have one purpose: to be transparent to Music. That's what High Fidelity means.
The one and only purpose of cables is to transmit a signal flawlessly. The end. This purpose cannot be related to Art in any shape or form.
That's pretty much it.
Very well put! (Even if you did write "of foam" instead of "or form".)

Thx! Corrected.
Ok. My last post in this thread... My guys work at Ericsson telecom systems and have developed parts of the 3G, 4G, 5G, 6G and later systems.

So did I, if ever so briefly.
Troll... eat your cyanide now!!

Almonds... delicious.
This is not my opinion; it is fact, established with scientific methods by people I trust far more than anyone at this forum.
My only point is, don't piss on people who are trying to get the best out of THEIR sound system with all means available.
What cables do you use in your systems?
Years later, as an experiment, I removed all the power cords and the Hydra and replaced them with ordinary power cords and a plain block of sockets. There was no loss of detail, but a slight edge to the sound.
Sighted listening relying on auditory memory?
I agree that HDMI cables do make a difference for UHD HDR. In most cases you must use newer HDMI cables certified for UHD HDR, or you will get dropouts or other issues. Of course, when it works, it works. I don't think your picture or sound will be better with more expensive cables, as some manufacturers claim.
When it was only 1080p it was easy: every cable worked. But now...
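To see why the jump from 1080p to UHD HDR strains cables, here is a rough bandwidth sketch. It ignores blanking intervals (so real link rates run somewhat higher) and uses the published 18 Gbps HDMI 2.0 ceiling and the 8b/10b TMDS coding overhead; the function name is just for illustration.

```python
def tmds_rate_gbps(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    """Approximate TMDS line rate: raw pixel data plus the 8b/10b
    coding overhead (x1.25). Blanking intervals are ignored, so real
    rates run somewhat higher than this."""
    raw_bits_per_second = width * height * fps * bits_per_pixel
    return raw_bits_per_second * 1.25 / 1e9

# 1080p60 with 8-bit RGB (24 bits per pixel)
print(f"1080p60,  8-bit: {tmds_rate_gbps(1920, 1080, 60, 24):5.1f} Gbps")
# UHD (2160p) 60 Hz with 10-bit colour for HDR (30 bits per pixel)
print(f"2160p60, 10-bit: {tmds_rate_gbps(3840, 2160, 60, 30):5.1f} Gbps")
# Roughly 3.7 vs 18.7 Gbps. HDMI 2.0 tops out at 18 Gbps, so UHD HDR
# already brushes the ceiling, which is why cables that were fine at
# 1080p can suddenly produce dropouts.
```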
I remember a great lecture on HDMI from RMAF that I watched on YouTube.
Yes and no.

If a given HDMI cable delivers an error-free stream of bits from source to destination (all features work correctly), it is 100% as good as any other HDMI cable in that same application, i.e., carrying that same bit rate. If you have an error-free picture with a given HDMI cable, there is no way that any other HDMI cable can improve the picture. The same is true for audio; however, bit errors are likely not as apparent with audio as with video.
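The "error-free is error-free" point can be made concrete with a toy sketch. This models no real HDMI framing, just a channel that flips bits with some probability: if two links deliver the payload with zero errors, their outputs are byte-identical by construction, so there is nothing for a fancier cable to improve.

```python
import hashlib
import random

def transmit(payload: bytes, bit_error_rate: float, seed: int = 0) -> bytes:
    """Toy channel model: flips each bit independently with the given
    probability. No real HDMI framing or error handling is modelled."""
    rng = random.Random(seed)
    out = bytearray(payload)
    for i in range(len(out)):
        for bit in range(8):
            if rng.random() < bit_error_rate:
                out[i] ^= 1 << bit
    return bytes(out)

frame = bytes(random.Random(42).randrange(256) for _ in range(100_000))

cheap = transmit(frame, bit_error_rate=0.0)     # error-free budget cable
boutique = transmit(frame, bit_error_rate=0.0)  # error-free expensive cable

# Identical bits in, identical bits out: there is nothing left to improve.
assert hashlib.sha256(cheap).digest() == hashlib.sha256(boutique).digest()
print("Both error-free links deliver byte-identical frames.")

flaky = transmit(frame, bit_error_rate=1e-4, seed=1)
print("Marginal link corrupts the frame:", flaky != frame)
```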
It has been a while since I've paid any attention to the HDMI specs, so out of curiosity I just took a quick glance at the information given on HDMI.org concerning the 2.1 specification.

Unless changes are stipulated for the physical connectors (i.e., an increase in the number of pins and conductors), the new features all reside at a protocol layer above the silicon and copper. There appear to be a lot of enhanced features and a vastly more elaborate scheme at the layer immediately above the silicon/copper. But with the quick glance I took, I did not see anything that suggested any change to the connectors or the number of pins.

As such, and notwithstanding the repeated assertions in their Q&A that you need a cable certified for 2.1 in order for the new features to work, you might not need a cable with that certification. It seems reasonable to prognosticate that a high-quality older HDMI cable, especially a shorter one, would pass the new certification without difficulty.

In HDMI 2.0 there was a pin (14) that wasn't used, so there is some possibility that in a cheap cable the manufacturer would have saved a smidgen of copper by not connecting pin 14 at the two ends of the cable. But even if you were to encounter a cable of this sort, which seems rather unlikely, it wouldn't cause any problem unless the 2.1 spec makes use of the pin. I haven't looked into it enough to know whether it does or doesn't. But even if it does, it isn't likely that in a cable of decent quality pin 14 would be left unconnected end-to-end. You could always check this with an ohmmeter or continuity tester.