I'd encourage people to read the research directly, rather than that Guardian article.
Here's a link to it. Despite the misleading Guardian write-up (which makes it sound like 4K is overkill, mostly because its examples involve very small TVs), the actual outcome of this research is that people are *more* resolution-sensitive than was previously believed.
To me, the most interesting outcome of this research is that 8K resolution is not actually totally pointless in a home theater context. Based on the previous best estimates of human visual acuity, 8K clearly could never have any benefit over 4K -- even with a fully THX-recommended screen size for your viewing distance, your eyes still couldn't resolve full 4K, never mind 8K. But per this new study, it turns out that your eyes actually can out-resolve 4K on a screen of that size. Given perfect content, a significant fraction of people would notice the difference between 4K and 8K displays.
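To make the geometry concrete, here's a quick back-of-the-envelope sketch. The numbers are my assumptions, not from the study: a hypothetical 65" (~1.43 m wide) 16:9 screen at a THX-style ~40-degree horizontal field of view, compared against the classic "20/20" acuity figure of ~60 pixels per degree (1 arcminute per pixel). If the study's point is that people resolve meaningfully more than 60 px/deg, then 4K at this distance (~96 px/deg) is no longer automatically past the limit:

```python
import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    # Average angular pixel density across the horizontal field of view
    # of a flat screen viewed from its centerline.
    fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

# Assumed setup: ~1.43 m wide screen, distance chosen for a 40-degree FOV.
width = 1.43
dist = width / (2 * math.tan(math.radians(20)))  # ~1.96 m

for name, h in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: ~{pixels_per_degree(h, width, dist):.0f} px/deg")
# 1080p lands around 48 px/deg, 4K around 96, 8K around 192.
```

So under the old ~60 px/deg assumption, 4K already overshoots at this screen size; the study's claim amounts to saying the real threshold sits somewhere above 96 px/deg for some viewers, which is exactly the window where 8K starts to matter.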
(This doesn't mean that 8K is going to happen, of course. The delta between 4K and a visually perfect display at that distance is pretty small -- nowhere near the size of the delta between 1080p and 4K that some people are nonsensically dismissing in this thread. And of course 8K doesn't bring anything else to the table: UHD brought HDR, a wider color gamut, and higher bit depth along with its resolution increase. And also, many people just have very low standards, which is why UHD-BD isn't dominating the world. But it does mean that people -- like me -- who believed 8K was completely and totally pointless were wrong. It's just mostly pointless.)