Aren't we interested in performance in real-world applications? I get the point about true hardware capabilities, but after all we still use proprietary software like games, Photoshop, or various CADs, which we can't compile ourselves. IMO benchmarks per se aren't really interesting at all; I usually just check performance in the real-world apps I use. However, there is another issue: choosing the right settings and test scenes can be troublesome. About five years ago, nVidia GPUs handled tessellation much better than AMD's. nVidia exploited that by paying game developers to use GameWorks. Excessive tessellation was a huge performance hit for AMD users, while the improvement in graphics quality wasn't visible.
I did use bold for a reason: if you only run "normal use case" benchmarks, you don't really know what you're paying for; and, on the other hand, targeted benchmarks let you detect such collusion.
If you only want better Photoshop performance, then yes, measuring actual Photoshop performance is the way to go. Personally, I want the better product (and I don't use proprietary software beyond mandatory firmware like CPU microcode and the BIOS).
PS: notice that I mentioned x264, x265, gcc, and clang — all applications that are widely used, not only for benchmarking.