I disagree. There's a huge difference of degree: you can indeed argue that Ray Tracing or HDR are not worth the money, and that's fine. But you definitely can't argue that they don't make any difference - the effect of HDR in movies, and Ray Tracing in games like Control, is highly visible, you'd be blind not to see it. That's very different from debates about MQA or DSD (or cables, etc.) where people are arguing endlessly about differences that simply don't exist (or if they do, they are extremely subtle and hard to spot).
There's a really, really big difference between, say, "Ray Tracing is nice but is it really worth a $1,000 GPU?" and "Should I spend $1,000 on a DAC so that I can upgrade my SINAD from 90 dB to 100 dB?". In the first case you're still acting rationally because you know you're going to see some improvement, albeit at a steep price. In the second case you're just burning money.
Yeah I agree, I was trying to talk about the formation of fanboys around tech that is fledgling, or potentially stillborn/pointless. The comparison being drawn here was improper, but then again my main blunder was comparing two senses (sight and sound), and that's always a can of worms, especially with my rudimentary knowledge of sound. Maybe you could help me draw a better comparison in this respect?
Though you made some claims about HDR there...
What HDR are you talking about? Because I sure as heck haven't seen any of the sort you speak of in my lifetime. If you're talking about displays capable of the full Dolby Vision spec, that is a fantasy that doesn't even exist in R&D labs. I still haven't seen a 12-bit panel, nor have I seen content mastered and rendered on screens with 10,000 nits of peak brightness (though Sony did demo a 10K nit screen, I think at last CES). HDR is a massive blunder on all fronts, simply because even the reference displays colorists master on, like Sony's BVM-X300 series, exhibit a massive sink in brightness as the brighter elements on-screen capture more of the screen.
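To make that brightness sink concrete, here's a toy power-budget model of automatic brightness limiting. The function and every number in it are my own invention for illustration, not measurements of any real panel:

```python
# Toy ABL model: a panel has a fixed power budget, so sustained peak nits
# fall as the bright "window" covers more of the screen. All numbers are
# made up for illustration.

def sustained_nits(window_fraction, peak_nits=1000.0, full_field_nits=150.0):
    # Assume the panel can hit `peak_nits` over a tiny window but only
    # `full_field_nits` over the whole screen; in between, the available
    # power budget is spread across the lit area.
    budget = full_field_nits * 1.0  # full-screen power budget (nits * area)
    return min(peak_nits, budget / window_fraction)

for window in (0.02, 0.10, 0.25, 0.50, 1.00):
    print(f"{window:>4.0%} window -> {sustained_nits(window):6.1f} nits")
```

Run that and you see the shape of the problem: full peak brightness on a 2-10% window, then 600, 300, and finally 150 nits as the bright area grows.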
That's hardly some crazy, massively amazing difference in a leisurely setting to a layman comparing a reference display's HDR performance to a high-end consumer television model (heck, that person, like most, wouldn't even know what to be looking out for, as you know). Most HDR content you view on a high-end UHD display looks great (if not better than on reference displays) simply because the rest of the display's performance metrics are great. Show me a garbage display with great HDR rendering, and then maybe we can better discuss how great HDR is in isolation. But alas, I do admit my comparison to MQA/DSD wasn't drawn and spelled out with much thought; it was more to demonstrate the annoying fanboyism surrounding the supposedly paradigm-altering claims from HDR/Ray Tracing proponents.
As for Ray Tracing, take a look at Red Dead Redemption 2's handling of lighting with pre-baked light maps and general volumetric lighting. It was great when it released on consoles, and even better now that the title came to PC this year. No Ray Tracing, and at a glance it still looks better than the majority of games employing Ray Tracing. Sure, the lighting may not be as good as Ray Tracing, but the performance penalty is nothing to sniff at either. The Ray Tracing performance penalty perhaps reminds me of the difference between Class A and Class D amp designs with respect to efficiency.. :} maybe that's a better comparison. Especially considering that for most in-game graphics, developers cheat and hack the rendering to create the illusion of lighting, whereas Ray Tracing actually computes an approximation of real light transport. With enough compute power, though, I think you can cheat and hack away enough to make someone wonder whether Ray Tracing was being used (that's why I brought up Red Dead 2). The only remaining question is which approach scales better (I'd argue Ray Tracing, if hardware acceleration designs keep proliferating, since cheating in software becomes more of a hands-on thing that gets more demanding as graphical fidelity increases; at that point it just pays off to bite the cost bullet on dedicated lighting hardware).
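To put a rough number on that baked-versus-traced cost split, here's a back-of-the-envelope sketch. All the figures in it (rays per pixel, intersection tests per ray) are invented for illustration; real engines and BVH traversal costs vary wildly:

```python
# Back-of-the-envelope comparison of runtime shading cost: pre-baked
# lightmaps (light transport computed offline) vs. per-pixel ray tracing.
# Every number here is a made-up illustration, not a benchmark.

PIXELS = 3840 * 2160            # one 4K frame

# Baked lighting: roughly one lightmap texture fetch per pixel at runtime.
baked_ops = PIXELS * 1

# Ray tracing: assume 1 primary ray + 2 bounce rays per pixel, and ~50
# BVH intersection tests per ray. This is the work RT cores accelerate.
rays_per_pixel = 3
tests_per_ray = 50
traced_ops = PIXELS * rays_per_pixel * tests_per_ray

print(f"baked:      {baked_ops:>13,} ops/frame")
print(f"ray traced: {traced_ops:>13,} ops/frame ({traced_ops // baked_ops}x)")
```

Even with these charitable toy numbers, the traced path is ~150x the runtime work per frame, which is the whole argument for dedicated acceleration hardware over ever-more-elaborate software cheats.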
Also you spoke about:
"Ray Tracing is nice but is it really worth a $1,000 GPU?" "Should I spent $1,000 on a DAC so that I can upgrade my SINAD from 90 dB to 100 dB?".
There is an equivocation error here: you don't get to choose a $1,000 GPU where the manufacturer offers a SKU with no Ray Tracing hardware and you somehow get that silicon budget back as contemporary rasterization performance. In the same way, I don't get to choose my RME without the DSD playback capability, when the reason I spent the thousand dollars was mostly the creature comforts. Likewise for Ray Tracing: when I buy a 2080 Ti, I'll be buying it and turning off Ray Tracing in-game to salvage the huge FPS hit that Ray Tracing inflicts when it's turned on. BUT EVEN THAT sort of approach has diminishing returns, as maximum frames-per-second throughput begins to exhibit severe diminishing returns on current display tech. Asus now has a 360 Hz refresh rate monitor they showed off at CES 2020; I'm sure you would be on my side and say most folks won't see the difference between 240 Hz and 360 Hz monitors for casual gaming, or for general computing of any kind outside of gaming.
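The frame-time arithmetic makes the diminishing returns obvious, since each refresh-rate step up shaves fewer milliseconds than the last:

```python
# Milliseconds per frame at each refresh rate: the gains shrink fast.
for hz in (60, 144, 240, 360):
    print(f"{hz:>3} Hz -> {1000 / hz:5.2f} ms per frame")
```

Going from 60 Hz to 144 Hz saves about 9.7 ms per frame; going from 240 Hz to 360 Hz saves only about 1.4 ms, which is why that last jump is so hard to perceive.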
As for people talking about spending $1,000 to upgrade SINAD from 90 dB to 100 dB, that's not needed anymore. In the interest of maintaining the spirit of what you were implying, I'll steelman your case: if you're spending $1,000 to go from 120 dB to something higher, then that is going to cost something. At that point such a DAC can be employed in scientific monitoring use cases, or simply serve to quell self-recognized placebo biases you're perhaps unable to shake for whatever reason, in which case I'd argue spending more wouldn't be out of the question. Though I am still in agreement with you that my comparison sucked, as we were talking about consumer use cases.
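For context on what those SINAD figures actually mean, here's the standard dB-to-amplitude-ratio conversion (just arithmetic, nothing product-specific):

```python
# Residual noise + distortion implied by a given SINAD, as a fraction of
# full scale, using the amplitude relation ratio = 10^(-dB/20).
for sinad_db in (90, 100, 120):
    residual = 10 ** (-sinad_db / 20)
    print(f"SINAD {sinad_db} dB -> THD+N ~= {residual * 100:.4f}% of full scale")
```

90 dB already puts the residual at ~0.0032% of full scale and 100 dB at ~0.0010%; both are far below anything you'd hear in normal listening, which is why the $1,000-for-10-dB upgrade is burning money.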
Eventually, though, things like HDR and Ray Tracing will be incorporated into the main stack of general computing paradigms, in the same way 3D accelerators are virtually a part of every piece of graphics hardware today but weren't in the '80s, as you can imagine with respect to games. But currently, the stuff is handled very poorly. Ray Tracing isn't anything new, but having dedicated accelerator cores for it brings it to the forefront of potential consumer-oriented, experience-enhancing uses. Likewise with HDR: not new, but unless it's seamlessly applied without regressions or compromises, like sacrificing color uniformity to achieve it, it's a bit of a waste..
Again, I understand why my DSD/MQA comparison was a bit trash reading it over again (I don't proofread, as you can tell)
;(