I've got an interesting question for you,
@March Audio.
Have you ever considered the significance of frame rate in video? Not many people have, I'll admit. But there's a phenomenon that might be termed 'film look' versus 'video look'. I could point you to various articles on an attempt to make a cinema film at 48 fps instead of the normal 24 fps. Some people liked 48 fps, but most preferred the 'film look' of 24 fps because of its slightly dream-like quality.

On the other hand, people who watch sports on TV prefer higher frame rates, mainly because of the better rendition of motion. Many TVs will cleverly convert all source frame rates to 120 Hz unless you turn the feature off. Tom Cruise has fronted a campaign to have this switched off as a default setting, because it makes cinema films look like a soap opera. Yet when soap operas have adopted 'film look', viewers have written in to complain that there's something odd about the picture.
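As an aside on why panels settled on 120 Hz: it divides evenly by the common source rates, so film can be shown by simple whole-number frame repetition, with no judder. The 'soap opera' look appears when the TV instead *synthesizes* in-between frames (motion interpolation). A minimal sketch of the arithmetic (the function name is my own, purely illustrative):

```python
# Hypothetical sketch: how cleanly common source frame rates map onto
# a 120 Hz panel. If the ratio is a whole number, each source frame can
# simply be repeated that many times; if not, you get judder or the TV
# resorts to interpolating new frames.

def repeat_factor(source_fps: float, panel_hz: int = 120):
    """How many panel refreshes each source frame occupies,
    or None if the rates don't divide evenly."""
    ratio = panel_hz / source_fps
    return int(ratio) if ratio == int(ratio) else None

for fps in (24, 30, 60, 25):
    print(fps, "fps ->", repeat_factor(fps))
# 24 fps -> 5, 30 fps -> 4, 60 fps -> 2; but 25 fps -> None,
# since PAL-rate material doesn't divide evenly into 120 Hz.
```

Note that 24 fps cinema maps perfectly (each frame shown five times), so the 'soap opera' rendering of film on such a set is a deliberate processing choice, not a necessity of the panel.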
The interesting thing, I think, is that the difference is in plain sight, yet most people can't really put their finger on it. It's not just smoother motion, but a difference in 'clarity'.
Now, supposing you wanted to know which frame rate was best. Well, I think it's a question that science can't answer. Cinema adopted 24 fps by accident, but stumbled on a complete fluke: whatever you film at 24 fps ends up looking 'artistic' and 'dreamlike' - and this is pure subjectivity. Objectively, there is no way to measure that quality.
Objectively, higher frame rates are 'better' in every way, and objective tests can demonstrate it: people can complete interactive tasks better at higher frame rates; in gaming they can judge time to impact better, etc.
And in this case, we know that context matters hugely, because one person can prefer two different frame rates in two different contexts. And the slow frame rate of cinema created its own genre in the first place: without any Hollywood films to view, you couldn't run a meaningful test of whether 24 fps was better than 48. But had cinema adopted 48 fps from the start, it would have ended up making the same types of material as television did in the 1950s, because - you guessed it - television effectively adopted 50 or 60 fps for video (it had to, being tied to the local mains frequency). Showing that material at 24 fps would have looked wrong from the start.
I would suggest this as an example where pure empirical science would fail without some background 'philosophy'. Simply setting out to find 'the best frame rate' would be doomed to never-ending confusion.
I think we have similar doomed quests in audio. The one that springs to mind is "What is the best target curve?" The reason this is doomed to never-ending confusion is that the very notion of a target curve is 'wrong': the curve is derived from the context, and assigning a curve regardless of the context is meaningless. Only by getting past the 'try it and see' mentality (a.k.a. 'science' for most people) can the confusion be avoided.