
[bug] HDMI 2.1 chip (Panasonic Solutions)

circumstances

I have my unit connected to the internet (and have done some firmware upgrades that way since I've had it).

Would that have required registering my unit, or is there some further step I should be taking?

I believe I needed the serial number of the unit to connect to the internet.
Never mind. I just made an account on the Denon page and registered my unit there.
 

brettjv

For the US it's easier: there's an icon at the top right to sign in or open your account.

https://www.denon.com/en-us/

Denon's is such an incredibly frustrating (and unprofessional) site.

I created an account there (at your link above), logged in, then clicked Support in the footer. Then I clicked 'Product Registration'. That seems to take me to a different domain, one on which I'm not logged on anymore (!). And get this ... it wants a 'Username' now to log in. I've never made a username, only used my email when I signed up. That email does not work as the username. So I choose 'Email My Username' and put in my email address ... and the email never friggin comes.

It's like they make it impossible for you to actually register a product. I wonder why that is ...
 

oupee

Has anyone tried the PlayStation 5 at 4K/120? Or does it not work either?
 

mcdull

Since the PS5 is confirmed to work with the buggy HDMI 2.1 chip at 4K/120Hz/HDR/YUV422, I think it will put pressure on Microsoft to update the software or firmware to add an option to work around the bug, if that's possible.
 

sweetchaos

Latest update:
https://www.audioholics.com/news/sound-united-hdmi-2.1-no-issue

Sound United states:
Sound United completed additional testing and is pleased to report that 4K/120Hz passthrough works without issue on Nvidia and PS5 devices. Outside of what was originally reported regarding Xbox at 4K/120Hz output setting, we have had no further HDMI 2.1 device interoperability issues reported as of this writing.

What if you have PS5?
"Configuring A PlayStation 5 With A Denon/Marantz Receiver for 4K@120fps Gaming" (video released Nov 21)

What if you have RTX 30 GPUs?
"Configuring An NVIDIA RTX 30 Series Graphics Card With A Denon/Marantz Receiver for 4K@120fps Gaming" (video released Dec 7, 2 days ago)

What if you have Xbox Series X?
While the PS5 and RTX cards work fine, there is a compatibility issue with the new Xbox. We are working on a solution for Xbox Series X, as soon as I know more, I will be sure to share. Currently the best work around for an Xbox is to connect the game system directly to LG and feed uncompressed audio from the TV to receiver via eARC.
Source: Sound United's answer in the YouTube comments on the Dec 7 video
 

Vasr

Since the PS5 is confirmed to work with the buggy HDMI 2.1 chip at 4K/120Hz/HDR/YUV422, I think it will put pressure on Microsoft to update the software or firmware to add an option to work around the bug, if that's possible.

This is backwards. Sound United is spinning the story, putting out PR videos that make it look like there was no problem with the PS5. They rolled out their smooth-talking marketeer in the videos, who can, if necessary, make a compelling pitch that 10-bit audio is the greatest invention and a gift to audiophiles.

Meanwhile, Sony is under pressure from PS5 users for capping its bandwidth without telling anybody; that cap is what makes it work with HDMI 2.1 AVRs using the Panasonic chip. They will likely need to remove the cap soon for marketing reasons.

The real issue:
HDMI 2.1 specs support up to 48Gbps bandwidth.

40Gbps bandwidth is sufficient to do 10-bit HDR YUV444 at 120 Hz.

Given the processing power needed to go above 40Gbps, most consoles including the Xbox aren't likely to go above 40Gbps in the current generation. And when using 40Gbps they are not compressing the streams since that requires more processing power than they have available without affecting refresh rate.
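For concreteness, the arithmetic behind those figures can be sketched in a few lines (a back-of-envelope check using active pixels only; blanking, audio, and other overhead are ignored, so real link budgets are somewhat tighter):

```python
# Rough active-pixel payload math for 4K @ 120 Hz over HDMI 2.1 FRL.
# Ignores blanking and audio, so real link requirements are somewhat higher.

def active_gbps(width, height, fps, bits_per_component, components=3):
    """Raw active-video bit rate in Gbps."""
    return width * height * fps * bits_per_component * components / 1e9

# 10-bit YUV444: three full-resolution components per pixel.
full_444 = active_gbps(3840, 2160, 120, 10)                # ~29.9 Gbps
# 10-bit YUV422: chroma halved horizontally, i.e. two components' worth.
sub_422 = active_gbps(3840, 2160, 120, 10, components=2)   # ~19.9 Gbps

# FRL uses 16b/18b line coding, so usable payload is 16/18 of the link rate.
payload_48g = 48 * 16 / 18   # ~42.7 Gbps
payload_40g = 40 * 16 / 18   # ~35.6 Gbps
payload_32g = 32 * 16 / 18   # ~28.4 Gbps

# 4K120 10-bit 444 fits a 40Gbps link but not a 32Gbps one,
# which is why a 32Gbps cap forces chroma subsampling (422).
assert sub_422 < payload_32g < full_444 < payload_40g < payload_48g
```

This is only the active-pixel payload; the exact crossover points shift once blanking intervals are included, but the ordering is the point.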

Panasonic 2.1 chips cannot do uncompressed streams. This is a limitation that current generation of HDMI 2.1 AVRs with that chip have. There is no way around it.

When upstream devices are capped at 32Gbps, they cannot do 10-bit HDR YUV444 at 120 Hz, so chroma sub-sampling is needed (the video analogue of downsampling a high-sample-rate audio stream to fit the limited capability of the wire protocol or downstream device). So color fidelity is lost. That means you have a choice between a high refresh rate (smooth picture) with reduced color fidelity, or full color fidelity at a lower rate, but not both. The current Panasonic chip based AVRs will all have this compromise.
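As a toy illustration of what 4:2:2 subsampling discards (a sketch only; real encoders filter the chroma rather than simply dropping samples):

```python
# Toy 4:4:4 -> 4:2:2 chroma subsampling on one scanline of (Y, Cb, Cr)
# pixels: luma is kept per pixel, chroma is shared by each pair of pixels.

def subsample_422(line):
    """Keep every pixel's Y, but only every other pixel's Cb/Cr."""
    out = []
    for i, (y, cb, cr) in enumerate(line):
        if i % 2 == 0:
            shared_cb, shared_cr = cb, cr   # chroma sample for this pair
        out.append((y, shared_cb, shared_cr))
    return out

line = [(100, 10, 20), (101, 90, 80), (102, 11, 21), (103, 91, 81)]
print(subsample_422(line))
# Each odd pixel's colour is replaced by its left neighbour's:
# [(100, 10, 20), (101, 10, 20), (102, 11, 21), (103, 11, 21)]
```

Luma resolution (and therefore perceived sharpness) survives; half the colour detail does not, which is exactly the fidelity trade described above.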

This is why the customers of PS5 are dumping on Sony when the news that PS5 bandwidth was capped came out recently.

The Xbox does not have this cap and so does 40Gbps with uncompressed streams (compressing them would require more processing power than it has available). Since most TVs are capable of 40Gbps bandwidth, PS5 users are screaming that they are not getting the full benefit of HDMI 2.1. LG, which had capped their TVs at 32Gbps, also came under fire for this reason.

But if the console or card sends out uncompressed 40Gbps, then it will blank out for a pass-through via the AVR with the Panasonic chip rather than degrade gracefully.

Nvidia RTX 30-series cards can do full 48Gbps. But the drivers have a setting to downgrade it.

I don't know if this is a problem with the HDMI 2.1 spec or its implementation, but I am surprised that there isn't an HDMI handshake negotiation between source and destination to figure out the maximum rate both can support and degrade gracefully, rather than leading to this blanked-out screen situation.

The solution is to manually cap the source at 32Gbps if possible when the source is capable of higher bandwidth. This will avoid the blank screen problem when passing through the AVR.
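That manual cap is essentially a by-hand version of the missing negotiation. A graceful fallback could look something like this sketch (hypothetical mode table and picker; real EDID/FRL link training is far more involved, and the bit rates are approximate active-pixel figures):

```python
# Hypothetical "degrade gracefully" picker: given the downstream link's
# usable payload (Gbps), choose the best 4K120 format that fits instead
# of blanking out. Bit rates are approximate raw active-pixel figures.

MODES = [  # ordered best-first: (label, Gbps for 3840x2160 @ 120 Hz)
    ("10-bit YUV444 (HDR, full chroma)", 29.9),
    ("10-bit YUV422 (HDR, subsampled)", 19.9),
    ("8-bit YUV420 (SDR, subsampled)", 11.9),
]

def best_mode(payload_gbps):
    for label, gbps in MODES:
        if gbps <= payload_gbps:
            return label
    return None  # nothing fits: this is the "blank screen" case

print(best_mode(35.6))  # 40Gbps link: full-fat 444
print(best_mode(28.4))  # 32Gbps cap: falls back to 422
```

A source that ran this kind of selection against the AVR's actual capability would downgrade to 4:2:2 automatically instead of handing the user a black screen.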

So, if there is any fix at all for the Xbox, it would be a setting to specify less than its maximum 40Gbps (or max YUV422, no HDR in 120 Hz mode, etc.). That is, in effect, allowing the user to voluntarily downgrade to accommodate the AVR! I don't see the Xbox group rushing to do this, as most of their users probably connect it directly to the TV anyway.
 

circumstances

sweetchaos said:
Latest update: https://www.audioholics.com/news/sound-united-hdmi-2.1-no-issue [full post quoted above]
Is there anything in the Audioholics article that wasn't already in the Stereonet article I posted on Monday?
 

Obelisk

This is backwards. Sound United is spinning the story, putting out PR videos that make it look like there was no problem with the PS5. They rolled out their smooth-talking marketeer in the videos, who can, if necessary, make a compelling pitch that 10-bit audio is the greatest invention and a gift to audiophiles.

10-bit audio? Can you clarify what this is referring to?

Meanwhile, Sony is under pressure from PS5 users for capping its bandwidth without telling anybody; that cap is what makes it work with HDMI 2.1 AVRs using the Panasonic chip. They will likely need to remove the cap soon for marketing reasons.

The real issue:
HDMI 2.1 specs support up to 48Gbps bandwidth.

40Gbps bandwidth is sufficient to do 10-bit HDR YUV444 at 120 Hz.

Given the processing power needed to go above 40Gbps, most consoles including the Xbox aren't likely to go above 40Gbps in the current generation. And when using 40Gbps they are not compressing the streams since that requires more processing power than they have available without affecting refresh rate.

I'm not sure "processing power" is the correct term when discussing consoles/video cards and HDMI connections. The video interface bandwidth of an Nvidia RTX 3070 and an RTX 3090 is exactly the same; "processing power" doesn't come into it. Current HDMI output caps are 40Gbps for the Xbox Series X and 32Gbps for the PS5. DSC is implemented in the HDMI interface, so the statement "when using 40Gbps they are not compressing the streams since that requires more processing power than they have available without affecting refresh rate." is nonsensical.

Panasonic 2.1 chips cannot do uncompressed streams. This is a limitation that current generation of HDMI 2.1 AVRs with that chip have. There is no way around it.

I think you have this backwards. At 8-bit and 10-bit colour depths, DSC is currently only needed for 4K 240Hz, 5K 120Hz, 8K 60Hz, etc. The Denon HDMI 2.1 capable receivers support 8K 60Hz with DSC and all lower resolutions without DSC.
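A rough way to see why those modes, and not the lower ones, need DSC (active pixels only; the ~42.7 Gbps payload figure assumes FRL's 16b/18b coding on a 48Gbps link):

```python
# Back-of-envelope check of which modes exceed HDMI 2.1's uncompressed
# payload capacity and therefore need DSC. Active pixels only.

PAYLOAD_48G = 48 * 16 / 18  # ~42.7 Gbps after 16b/18b FRL coding

def needs_dsc(w, h, fps, bits, components=3):
    """True if the raw active-pixel rate exceeds the 48G link's payload."""
    return w * h * fps * bits * components / 1e9 > PAYLOAD_48G

print(needs_dsc(3840, 2160, 120, 10))  # 4K120 10-bit 444
print(needs_dsc(7680, 4320, 60, 10))   # 8K60 10-bit 444
```

4K120 comes in around 30 Gbps and fits uncompressed, while 8K60 at the same depth is roughly double that and does not, matching the DSC thresholds above.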

When upstream devices are capped at 32Gbps, they cannot do 10-bit HDR YUV444 at 120 Hz, so chroma sub-sampling is needed (the video analogue of downsampling a high-sample-rate audio stream to fit the limited capability of the wire protocol or downstream device). So color fidelity is lost. That means you have a choice between a high refresh rate (smooth picture) with reduced color fidelity, or full color fidelity at a lower rate, but not both. The current Panasonic chip based AVRs will all have this compromise.

Regarding the PlayStation 5's HDMI output (422), we will all need to wait and see what is forthcoming with the promised updates.

This is why the customers of PS5 are dumping on Sony when the news that PS5 bandwidth was capped came out recently.

The Xbox does not have this cap and so does 40Gbps with uncompressed streams (compressing them would require more processing power than it has available). Since most TVs are capable of 40Gbps bandwidth, PS5 users are screaming that they are not getting the full benefit of HDMI 2.1. LG, which had capped their TVs at 32Gbps, also came under fire for this reason.

But if the console or card sends out uncompressed 40Gbps, then it will blank out for a pass-through via the AVR with the Panasonic chip rather than degrade gracefully.

Nvidia RTX 30-series cards can do full 48Gbps. But the drivers have a setting to downgrade it.

I don't know if this is a problem with the HDMI 2.1 spec or its implementation, but I am surprised that there isn't an HDMI handshake negotiation between source and destination to figure out the maximum rate both can support and degrade gracefully, rather than leading to this blanked-out screen situation.

The solution is to manually cap the source at 32Gbps if possible when the source is capable of higher bandwidth. This will avoid the blank screen problem when passing through the AVR.

So, if there is any fix at all for the Xbox, it would be a setting to specify less than its maximum 40Gbps (or max YUV422, no HDR in 120 Hz mode, etc.). That is, in effect, allowing the user to voluntarily downgrade to accommodate the AVR! I don't see the Xbox group rushing to do this, as most of their users probably connect it directly to the TV anyway.

HDMI 2.1 is obviously having some teething problems. Given time I anticipate that most of them should be addressable with firmware updates. If folks want to complain about AV Receiver HDMI compatibility, the lack of 2560x1440 resolution support is significant, as high frame rate gaming on these consoles at 4K is only going to be possible with less graphically intensive games.
 

Tks

I know this is off-topic, but does anyone have a clue what encoder and decoder hardware the latest 6000 series GPUs sport?

Please don't send me to the stupid official website with barely any information. I need something more specific like this.

I'm thinking of getting a camera that supports 10-bit 4:2:2 4K video recording, but I've yet to see a single hardware solution that can handle such files. Nvidia seems to have 4:4:4 and 4:2:0 but not 4:2:2 support for some reason (I know the reason is that you need dedicated hardware, but if 4:4:4 has been covered for so long, why wouldn't they include 4:2:2 subsampling support in hardware?).
 

Obelisk

Tks said:
I know this is off-topic, but does anyone have a clue what encoder and decoder hardware the latest 6000 series GPUs sport? [full post quoted above]

Short answer: Nvidia's 7th generation NVENC (20 and 30-series GPUs) and PureVideo VP10 and VP11 (20 & 30-series respectively) are both far more capable and better quality than AMD's Radeon RX 6000 series video encode/decode offering. Doesn't OBS process everything as 4:4:4, so if you bring your camera output in as 4:2:2 isn't OBS taking care of the conversion? You can then choose 4:4:4 or 4:2:0 for output. To the best of my knowledge, Nvidia GPUs can hardware decode 4:2:2 for many formats.

To answer your question about 4:2:2 encoding support, from Nvidia's point of view you have 4:4:4 for quality or 4:2:0 for bandwidth/space savings. What PC application has a compelling reason to hardware encode at 4:2:2? Put another way, in what way does your camera's recording format have anything to do with the computer's encoding format? Are you trying to encode files on a PC to transfer and watch on the camera?! If so, why are you doing this? If you just want to output and preserve the camera's colour detail, choose an encoder output of 4:4:4. Yes, the output will be a little bigger than 4:2:2, but any computer-generated content (e.g. text) will be sharper at 4:4:4.
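The size difference being traded off here is easy to quantify in raw bits per pixel (before any codec compression; these are just the standard sample counts per chroma layout):

```python
# Raw bits per pixel for 10-bit material at each chroma layout:
# 4:4:4 stores 3 samples/pixel, 4:2:2 averages 2, 4:2:0 averages 1.5.
def bits_per_pixel(bit_depth, samples_per_pixel):
    return bit_depth * samples_per_pixel

bpp = {fmt: bits_per_pixel(10, s)
       for fmt, s in [("4:4:4", 3), ("4:2:2", 2), ("4:2:0", 1.5)]}
print(bpp)  # {'4:4:4': 30, '4:2:2': 20, '4:2:0': 15.0}
```

So uncompressed 4:4:4 is 50% larger than 4:2:2 and double 4:2:0, which is the "a little bigger" cost of keeping full colour detail.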
 

Vasr

10-bit audio? Can you clarify what this is referring to?
That was just a hypothetical joke on the Sound United marketing guy who is really good at spinning anything as good. :)
I'm not sure "processing power" is the correct term when discussing consoles/video cards and HDMI connections. The video interface bandwidth of an Nvidia RTX 3070 and an RTX 3090 is exactly the same; "processing power" doesn't come into it. Current HDMI output caps are 40Gbps for the Xbox Series X and 32Gbps for the PS5. DSC is implemented in the HDMI interface, so the statement "when using 40Gbps they are not compressing the streams since that requires more processing power than they have available without affecting refresh rate." is nonsensical.
You are correct. I had the wrong understanding of the problem from what I had read. The article had mixed up the processing power needed to generate high-resolution video with bandwidth compression.
I think you have this backwards. At 8-bit and 10-bit colour depths, DSC is currently only needed for 4K 240Hz, 5K 120Hz, 8K 60Hz, etc. The Denon HDMI 2.1 capable receivers support 8K 60Hz with DSC and all lower resolutions without DSC.
We are saying the same thing. I was referring to the inability to handle uncompressed high depth/resolution streams. This is going beyond my limit of understanding, but to be more precise, the problem seems to be with the use of the Fixed Rate Link (FRL) signalling that is new in HDMI 2.1 and used/needed for higher-bandwidth uncompressed streams. This is what the Panasonic chip is unable to do. FRL apparently isn't used for lower-bandwidth/compressed streams and isn't available in HDMI 2.0, so the Panasonic chip can handle those, as you mentioned.

Regarding the PlayStation 5's HDMI output (422), we will all need to wait and see what is forthcoming with the promised updates.
I wonder if Sony capped the bandwidth in the PS5 because of the issue with their own AVRs using the same Panasonic chip. But the negative reaction has created a bad PR problem. As far as I know, there is no technical reason why Sony cannot allow 40Gbps in the PS5 by removing the cap with an update. But then they would still have the embarrassing situation of PS5s attached to their own HDMI 2.1 AVRs showing blank screens (and unlike D&M they cannot plausibly point fingers at someone else). They will likely wait until they can figure out a graceful (and transparent) degradation via firmware updates when so connected, rather than a black screen or requiring a manual settings change by the user.

HDMI 2.1 is obviously having some teething problems. Given time I anticipate that most of them should be addressable with firmware updates. If folks want to complain about AV Receiver HDMI compatibility, the lack of 2560x1440 resolution support is significant, as high frame rate gaming on these consoles at 4K is only going to be possible with less graphically intensive games.

There will be a lot of 4K gaming going on with Nvidia 30-series based gaming PC builds now that the power has become a lot more affordable. The major game makers are expected to release a lot more titles soon utilizing the full capabilities. But these setups have other means of getting the audio out than passing through an AVR. That will probably change once more games start to include Dolby Atmos, etc., which requires the multi-channel connectivity options of the AVRs.

Frankly I don't think the console gaming community is high on the priority list of these AVR companies at the moment (except perhaps Sony because of their own gaming console).
 

Obelisk

Vasr said:
There will be a lot of 4k gaming going on with the Nvidia 30-series based gaming PC builds... [full post quoted above]

I'm not sure "a lot more affordable" is quite realistic. An RTX 3080 is still US$699, excluding the rest of the computer, whereas an Xbox Series X is US$499. Consoles are still the best way to consume UHD Blu-ray content, so I would expect console compatibility to be a high priority for AV Receiver manufacturers. I've had some audio issues related to grounding when connecting an RTX 2080 Super to a Denon AVR-X4700H, so I'm not sure how well PC gaming with an AV Receiver is going to work for most people.
 

Chromatischism

To be fair, he said the console gaming community, which has the unique ability to do 2160p 120 fps (even if it is upscaled). That is not found in TV shows or movies.
 

Tks

Obelisk said:
Short answer: Nvidia's 7th generation NVENC (20 and 30-series GPUs) and PureVideo VP10 and VP11... [full post quoted above]

The Sony A7SIII's files tank performance even on high-end rigs when trying to edit formats like 4K 120fps. I need to set up proxies or transcode for usable timeline editing.

I never assumed AMD has better encoders. But if they had support for this one specific use case, it would be worthwhile for me to forgo Nvidia's cards.
 