Agreed. The chipset itself has not fully matured; it is not a manufacturer implementation issue.
The buzz I had heard is that the Gen3 chipset likely wouldn't be available until late this year or early next. I don't know for certain whether that is still the case, but what is apparent is that until they get it up to 48Gbps, it is still nerfed in some manner.
If it were in fact finalized this year, products in development for release next year or in 2024 could have it.
(Following the chipset details is far from my knowledge base, and truthfully I find myself wondering how consoles like the PS5 could be so far ahead of the curve... I expect there is a simple answer, but that level of engineering is beyond me. I just hope I can buy once, be happy and content for the next 7-10 years, and not have some other forced obsolescence applied to render the device unusable. Hyperbole, I know.)
A consolation for you....
If you calculate the requirements based on your visual acuity and your distance from the screen, then in at least 75% of use cases 1080p is already beyond your visual acuity - and 4K is therefore unnecessary.
For those sitting nearer the screen, or opting for a screen that fills a wider angle of vision than recommended by the standards (SMPTE or THX), 4K may be useful... but in most cases it is well beyond what is needed!
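If you want to run the numbers yourself, here is a rough sketch in Python (my own back-of-envelope version, not RTings' calculator) - it assumes a 16:9 screen and the usual benchmark that 20/20 vision resolves about 1 arcminute:

```python
import math

ARCMIN_RAD = math.pi / (180 * 60)  # one arcminute in radians

def max_useful_horizontal_pixels(diagonal_in, distance_in, acuity_arcmin=1.0):
    # Horizontal pixel count beyond which a viewer with the given acuity
    # (1 arcmin ~ 20/20 vision) can no longer resolve individual pixels
    # on a 16:9 screen at this viewing distance.
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 screen width
    min_pitch_in = distance_in * math.tan(acuity_arcmin * ARCMIN_RAD)
    return width_in / min_pitch_in

# 65" screen viewed from 7 ft (84"), 20/20 vision:
print(round(max_useful_horizontal_pixels(65, 84)))  # ~2318 - between 1080p and 4K
```

On that basis, anything much past ~2300 horizontal pixels is wasted on a 20/20 eye at that distance - which is exactly why 1080p vs 4K is borderline in typical living rooms.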
This guideline from the RTings website (www.rtings.com) is a useful reference:

"Our TV Sizes to Distance Calculator helps you choose the right size TV for your space. The optimal viewing distance is about 1.6 times the diagonal length of the television. For example, for a 55” TV, the best distance is 7 feet."
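The 1.6x rule itself is trivial arithmetic - a quick sanity check on their 55"/7ft example (the 1.6 factor is RTings' own; the function name is mine):

```python
def optimal_viewing_distance_ft(diagonal_in, factor=1.6):
    # RTings rule of thumb: viewing distance ~ 1.6 x screen diagonal.
    return diagonal_in * factor / 12  # inches to feet

print(round(optimal_viewing_distance_ft(55), 1))  # 7.3 - "about 7 feet"
```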
The key point here is that the full 48Gbps bandwidth is only required for the highest resolutions - it is designed for uncompressed 8K@60Hz or 4K@120Hz. Not only is material at those resolutions rare or non-existent for most users, it is also well beyond our ability to visually resolve.
With regard to frames per second: for most people, the maximum discernible rate under experimental conditions is circa 60Hz. Air force flight simulators aim for 75fps, and have shown that some fighter pilots (selected for their high visual acuity and response time) can perceive as high as 75fps.
So we have some rare edge cases who can see the difference between 60Hz and 75Hz, but apparently none who can see the difference between 75Hz and 120Hz. (Lots of gamers would disagree... but their perceptual impressions have more to do with PC/interface responsiveness than visual acuity, and don't have any measurable/quantifiable testing support.)
40G will support 4K (UltraHD) at up to 120fps in 10-bit... using only about 35.6Gbps.
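To show where that 35-ish figure comes from, a back-of-envelope sketch - I'm assuming the standard CTA-861 total timing of 4400x2250 for 4K@120 (active pixels plus blanking) and RGB at 10 bits per component:

```python
def uncompressed_rate_gbps(h_total, v_total, refresh_hz, bits_per_component,
                           components=3):
    # Raw video data rate for a full timing (active area plus blanking).
    return h_total * v_total * refresh_hz * bits_per_component * components / 1e9

# 4K@120Hz, 10-bit RGB, CTA-861 total timing of 4400x2250:
print(f"{uncompressed_rate_gbps(4400, 2250, 120, 10):.1f} Gbps")  # 35.6 Gbps
```

A 40G FRL link carries roughly 35.5Gbps of payload after its 16b/18b coding overhead, so 4K@120 in 10-bit sits right at the limit of 40G - and well under 48G.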
We have reached the point where people are focused on overengineered specifications that probably add nothing...
I have a 65" 4K screen which I view from 2.2m (7') - the table says that UltraHD/4K should be "worth it" - depending on my visual acuity.
My subjective viewing says that for most material (streamed, compressed) I see no difference... and 1080p seems to be the sweet spot.
My own calculations, based on other visual acuity calculators, suggest it is unlikely that I could tell the difference between 4K and 1080p.
Check your own setup, look into visual acuity and angle-of-vision calculations - see what resolution you really need, and whether there is any point in seeking 48Gbps!