
Reviewers should have their hearing tested and post results

The worst is a very subjective term when choosing a reviewer :)
But for sure I will not change my mind regarding my top subjectivist reviewer. Because it is still more fiction than a review.
 
This suggestion doesn't fly at all - its entire premise is to take a measurement... that just won't do.

And the reviewer would be all hung up on which power cord the audiologist used with his testing machine - and if the results were unfavourable it would be because the equipment was not resolving enough ...
 
Perhaps…but I would strongly advise anyone to steer clear of ‘listening training’ or whatever you want to call it.
I so regret having my producer friend help me out in understanding and hearing some of the problems he sometimes comes across in his work. Well…once you hear it, you can't turn it off…and who wants to spend the rest of their life with an incessant sound connoisseur in their brain - always complaining about minuscule stuff that nobody ever hears unless someone is foolish enough to point it out.
In some cases, ignorance is bliss….unless you work with sound professionally.
I took some classes on making beer and one of the classes focused on how you can screw up a beer and what those specific flavors are. There are so many beers that I can’t drink anymore.
 
I took some classes on making beer and one of the classes focused on how you can screw up a beer and what those specific flavors are. There are so many beers that I can’t drink anymore.
One word: banana.
 
lol yeah it's similar to screen tearing in video games. You can go on and on playing games for years saying "screen tearing...what screen tearing??" But then one day you'll finally see it...and from that point on you can't un-see it.

It'd be fun to see the results from testing the hearing of our favorite reviewers, because many of them talk a lot about the importance of super-high-frequency, or even ultrasonic, aspects of sound reproduction. Some of them firmly believe that energy above 20 kHz plays a part in good sound. Ultimately it wouldn't really matter, though.
 
Pick the worst "subjective" reviewer you can think of. If you found out they actually had perfect hearing from a test given by an audiologist, would that change your opinion of what they have said about products?
No.
 
Perhaps…but I would strongly advise anyone to steer clear of ‘listening training’ or whatever you want to call it.
I so regret having my producer friend help me out in understanding and hearing some of the problems he sometimes comes across in his work. Well…once you hear it, you can't turn it off…and who wants to spend the rest of their life with an incessant sound connoisseur in their brain - always complaining about minuscule stuff that nobody ever hears unless someone is foolish enough to point it out.
In some cases, ignorance is bliss….unless you work with sound professionally.
As an opera singer all my life, it REALLY pains me to listen to others sing now. I'm OK with listening to pop or even Broadway (I don't expect their technique to be "classical"), but listening to almost ANY other opera-type singer just makes me so tense, as I hear EVERY flaw, can tell whether or not they're going to make that next high note, etc. I feel your pain.
 
The worst is a very subjective term when choosing a reviewer :)
But for sure I will not change my mind regarding my top subjectivist reviewer. Because it is still more fiction than a review.
Assuming they don't just write things up without listening, I think fiction goes a little too far. Delusional fits better, imo.

Which is why I don't think a polygraph would add value in most cases. Polygraphs are not perfect even in the best case, but a sincere belief tends to lead to a pass.
 
These reviews remind me of commercials for washing powder. They were getting whites 'whiter than white' back in the 1980s (and before I was born, I'd assume) and they have been getting 'whiter' ever since.
Actually, the washing powder's claim was scientifically true! :)
Pick another analogy . . .
 
As the title says, reviewers should have their hearing tested and post the results annually.

I saw this posted elsewhere, but thought it might be a good thread for a good chuckle. Since we are on ASR and AMIRM states that he is trained in critical listening, maybe he should volunteer to be the first of many. Not trying to accuse him of anything, but he does post subjective listening tests for many speakers and headphones (along with measurements), so why not subject himself first and ask others to follow suit?
 
Actually, the washing powder's claim was scientifically true! :)
Pick another analogy . . .
How white they must be getting them now!!
Need to wear skiing goggles to hang my undies out!
 
Pick the worst "subjective" reviewer you can think of. If you found out they actually had perfect hearing from a test given by an audiologist, would that change your opinion of what they have said about products?
Not that much, no. But of course it would provide a bit more credibility to the claims of hearing some very small but at the same time so very important details that might be missed by people without proper audiophile experience and/or resolving enough power cables.

I'm pretty sure that should it go the other way, say that the reviewer can't hear a thing after 12kHz, some fans might have something to think about.
 
Ok, I have a whopping 2 hours with nothing to grade (miracle!), so let me describe a way to establish a degree of reliability and validity of reviewers. Actually doing this to a "would be publishable" level is not simple or easy, so keep that in mind as you consider this. I am going to oversimplify, a lot.

Reliability = will repeated measures give the same results?

Validity = does the measure actually measure what it claims to measure?

For example, I have experimented with grading papers. If I only look at the references (blind to title and author), I can get a grade that is accurate to within +/- 3% about 90% of the time (compared against a full grading process). That is a reliable measure, but it is NOT valid as a measure of paper quality. And I don't do that, because it is not valid, but also because that other 10% is important, and a 6-point swing is half a letter grade. So it is not reliable enough for my standards, even if it were valid.

Take reviewers who compare products directly. That might be using an A/B switch, as some do. Or just comparing as some (who I pay little attention to) do from memory or changing components manually.

We could take their reviews of speakers for the last year, and make note of the comparison speakers. Keep track of "more and less" comments. Are the comments for speakers consistent, reliable? If A has more bass than B, and B has more bass than C, then C had better have less bass than A, not more. So we could see if there is a consistent pattern to the comments on various speakers over time. We could calculate how consistent those comments are with each other as well. This would give us a measure of the reliability of the reviewer.
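The transitivity check above can be sketched in code. This is a minimal illustration, not anything from the post: it assumes the qualitative comments have already been coded into pairwise tuples like ("A", "B") meaning "A has more bass than B", and it flags a reviewer as inconsistent if those claims contain a cycle (A > B > C > A). The function name `find_inconsistencies` is hypothetical.

```python
from collections import defaultdict

def find_inconsistencies(comparisons):
    """Check transitivity of pairwise 'more of attribute X' claims.

    comparisons: iterable of (more, less) pairs, e.g. ("A", "B")
    meaning "A has more bass than B". A consistent set of claims
    forms a directed acyclic graph; any cycle (A > B > C > A) is a
    contradiction. Returns True if the claims are mutually consistent.
    """
    graph = defaultdict(list)
    for more, less in comparisons:
        graph[more].append(less)

    WHITE, GRAY, BLACK = 0, 1, 2  # unvisited / on current path / done
    color = defaultdict(int)

    def has_cycle(node):
        color[node] = GRAY
        for nxt in graph[node]:
            if color[nxt] == GRAY:          # back edge: cycle found
                return True
            if color[nxt] == WHITE and has_cycle(nxt):
                return True
        color[node] = BLACK
        return False

    return not any(color[n] == WHITE and has_cycle(n) for n in list(graph))
```

For example, the claims {A > B, B > C, A > C} pass, while {A > B, B > C, C > A} are flagged as contradictory.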

For validity, we would want to compare their comments to measurements. If they call something flat in the midrange, is it? If they say elevated treble, does that show up in the measurements?
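A crude version of that validity comparison could look like the sketch below. Everything here is an assumption for illustration: the band edges (300 Hz-3 kHz for midrange, 4-10 kHz for treble), the 2 dB threshold for "elevated", and the function names are all made up, not taken from the post or any standard.

```python
def band_mean(freqs, spl, lo, hi):
    """Average SPL (dB) over measurement points with lo <= f < hi."""
    vals = [s for f, s in zip(freqs, spl) if lo <= f < hi]
    return sum(vals) / len(vals)

def claim_matches_measurement(freqs, spl, claim, threshold_db=2.0):
    """Crude validity check: does a verbal treble claim match the
    measured response? 'elevated' is taken to mean the 4-10 kHz band
    averages at least threshold_db above the 300 Hz - 3 kHz band.
    """
    mid = band_mean(freqs, spl, 300, 3_000)
    treble = band_mean(freqs, spl, 4_000, 10_000)
    measured = "elevated" if treble - mid >= threshold_db else "not elevated"
    return claim == measured
```

Run over a year of reviews, the fraction of claims that match the measurements would be the raw material for a validity score.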

Done right, this could lead to an accuracy score, allowing comparisons between various reviewers. I do think it would be a more objective way to assess reviewer subjectivity, accuracy, and bias than is typical.

I will never do this. Because coding qualitative data is a PITA. And if I did it I would have to be very strict, which means paying attention to things like this (highly cited) paper: https://scholarworks.umass.edu/cgi/viewcontent.cgi?article=1247&context=pare

TL/DR version: for ordinal data like the coded comments proposed here, ordinal alpha is a better measure of reliability than Cronbach's alpha.
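For reference, classic Cronbach's alpha (the baseline the paper argues against for ordinal data) is simple to compute: alpha = (k/(k-1)) * (1 - sum of item variances / variance of totals). A minimal sketch, with a hypothetical function name and plain-Python sample variance:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: list of k item-score lists, all the same length
    (one list per item, one entry per rater/occasion).
    """
    k = len(items)
    n = len(items[0])

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(sample_var(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - item_vars / sample_var(totals))
```

Two perfectly correlated items give alpha = 1.0; the cited paper's point is that treating ordinal codes as interval data this way tends to understate reliability.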

For your consideration and entertainment, fwiw.
 
Not that much, no. But of course it would provide a bit more credibility to the claims of hearing some very small but at the same time so very important details that might be missed by people without proper audiophile experience and/or resolving enough power cables.
The problem is that very small sonic differences can easily be overwhelmed by confirmation bias, the placebo effect, and other psychological processes. So the hearing test might not add valuable information at that level. In fact, it might give false confidence in such claims: more chance of a type I error, treating something as valid that is actually false.

Now, if they have perfect pitch, that would be good to know. But I have NEVER seen or heard a reviewer say, "the speaker is slightly sharp in tone in parts of the midrange." Maybe someone has, but I have not seen anything like that claimed in a review.
 
First of all, there are few hearing tests performed that would benefit anyone except your employer, and those exist to cover their own collective a$$es. All occupational hearing tests are below 5 kHz, and 80% of those are below 2,500 Hz. The reason is your ability to understand other people's voices in the workplace. There is actually a forced whisper test per ear in the DOT world.

I'd much rather put my faith in someone who was trained to hear than in any machine data. I look at specifications first, then listen. I won't even bother if there is a thumbs down from a few people I absolutely trust. I wouldn't look at a speaker with 78 dB sensitivity any more than I would consider one in the 105+ dB category. They could measure as flat as a board and they would still be tossed, no matter the reviewer or the measured specs.

90% of all powered speakers fall into that category, primarily because of cost and the fact that I picked the types of speaker drivers I liked 50+ years ago. I've never been a fan of pro speakers, no matter who made what. Too much floor noise in the ones I've heard, and I'm not a fan of an amp taking a dump and having to buy a pair of speakers to fix a perfectly good speaker. By MY standards they have a place, just not in my place.

The most important part of all speaker tests is "how does it sound in your room, to you?" That's not subjective at all if you don't like what you hear, no matter the reason. If the speaker is supposed to sound a certain way, it's not up to a hearing test to change what you hear; it's up to your equipment to contour that sound to your liking. I liken it to a 200 mph car with a flat tire: that flat tire is usually the difference between the dealer's room and your room. Where are you going to be listening to that piece of equipment, that set of speakers, or a single center speaker?

I start with specs, not reviews, and after decades of listening, flowery reviews mean nothing unless I personally know the reviewer and the language they use.

They may make 5K or 15K DACs, just like 20-100K power amps; that has never been the measure of my purchases. I look for proof of longevity, and that comes after years of personal information gathering and a proven track record. McIntosh has yet to let me down; the rest is a matter of taste, not the subjective BS I've had to listen to from NON-Mac owners or users. An example of a potential speaker for me is the new PS speaker design. They are using drivers I usually like in a cabinet that I prefer: a narrow rounded baffle combined with deep, solid cabinets. The sensitivity is perfect. Too bad I already have speakers that will last me for the rest of my life, and my kids', for that matter.

Regards
 
lol yeah it's similar to screen tearing in video games. You can go on and on playing games for years saying "screen tearing...what screen tearing??" But then one day you'll finally see it...and from that point on you can't un-see it.
i avoid training my hearing. i don't want to be a golden-eared audiophile. that would be terrible. i am content to be a cloth-eared music lover.
 
This needs to be an objective, experimentally designed test to be of any value. The relevant variables must be closely controlled, and measurements compared between a control group, a non-control group, and/or benchmark standards. This is what audiologists do for a living.
 