
Neural Pathways for Hearing Bass?

pozz

Data Ordinator
Forum Donor
Editor
Joined
May 21, 2019
Messages
3,289
Likes
5,120
#1
I recently came across a fairly rare book on infrasound (shipped in from a bookseller in Spain, no less) which prompted a few questions. It's from the 1970s, and while it records amazing work, it's clear the topic was exploratory and the answers not comprehensive.

Bass below 100Hz, and especially below 50Hz, starts hitting resonant points of the body such as the stomach and chest cavity. I would like to understand whether those inputs are primarily sent along auditory pathways, or whether other pathways are taken (e.g., up through the vagus nerve and its auditory auricular branch) and the percept is integrated at higher stages of processing.

I'm sure at least part of the way I posed the question is inexpert or jumbles the vocabulary. What I'm really looking for is citations so I can better understand this aspect of audition. Since the auditory nuclei in the brain show tonotopy, I assume part of the answer will be that there are pathways specific to low frequencies, while another part will be that low frequencies are nothing special and take the same pathways as most of the sound we hear, through the cochlea.

The interesting thing about infrasound specifically is that, because it requires immense SPLs even to be perceived, it radiates both acoustically and through structure-borne vibration. That means perception must involve both the acoustic pathway through the outer and middle ears and direct bone conduction. But since organs are being vibrated at that point, a lot has to be stimulated, and maybe that means low-frequency perception, or at least very-low-frequency perception, is multisensory from the outset.
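
As an aside for scale: dB SPL is just 20·log10(p/20 µPa), so the actual pressures involved at infrasonic thresholds can be worked out directly. A quick Python sketch (my own back-of-envelope, not from the book):

```python
import math

P_REF = 20e-6  # dB SPL reference pressure: 20 micropascals

def spl_to_pressure(spl_db: float) -> float:
    """RMS pressure in pascals corresponding to a level in dB SPL."""
    return P_REF * 10 ** (spl_db / 20)

def pressure_to_spl(p_rms: float) -> float:
    """Level in dB SPL corresponding to an RMS pressure in pascals."""
    return 20 * math.log10(p_rms / P_REF)

# An infrasonic tone near a 130dB SPL threshold:
print(round(spl_to_pressure(130.0), 1))   # ~63.2 Pa RMS
# Conversational speech at ~60dB SPL, for contrast:
print(round(spl_to_pressure(60.0), 3))    # ~0.02 Pa RMS
```

Three orders of magnitude more pressure than ordinary speech, which is why structure-borne transmission is unavoidable at those levels.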
 

PaulD

Senior Member
Joined
Jul 22, 2018
Messages
437
Likes
1,184
Location
Other
#2
Interesting, pozz! I have seen some work on equal loudness that involves the whole body, and saw part of a presentation showing the traditional equal-loudness curves: when the whole body is used for the measurement (not just the basilar membrane), the LF portions below a couple of hundred hertz flatten out instead of taking the usual steep tilt toward lower frequencies. This was offered as a potential explanation for why some people are disturbed by wind farms even when the LF disturbance is almost inaudible: the transmitted vibration at such low frequencies may be perceived in other ways.

This makes sense, because we can certainly sense DC (0Hz) as continuous pressure on our skin. So my feeling (haha) is that sensing this is a higher-order function of the brain integrating multiple senses - our hearing and our sense of touch - at LF. Certainly there are acoustic events, such as a large drum hit reproduced at moderate to high volume, where the sense of pressure on our skin seems as important as what we receive through the ear canal.

I have searched online several times for the presentation I saw and have so far failed! If I can find it I will post it; I will also ask around.
 
OP
pozz

Data Ordinator
Forum Donor
Editor
Joined
May 21, 2019
Messages
3,289
Likes
5,120
Thread Starter #3
From what I've seen, the skin is fairly insensitive to audio-band air pressure fluctuations (unless they are large changes in barometric pressure, which is the static background audio rides on, though I'm not well up on the details) and cannot distinguish vibrations above about 1kHz. Air velocity, however, it picks up really well (e.g., wind, aspiration from speech, ventilation), since you can feel the displacement of air packets - but that isn't something speakers produce.

I would be very interested in the study you found!
 

Kal Rubinson

Major Contributor
Industry Insider
Joined
Mar 23, 2016
Messages
2,979
Likes
4,383
Location
NYC/CT
#4
Bass below 100Hz but especially below 50Hz start hitting resonant points of the body like the stomach and chest cavity. I would like to understand if those inputs are primarily sent to auditory pathways (e.g., up through the vagus nerve and its auditory branch) or if other pathways are taken and the percept is integrated at higher stages of processing.
There is no auditory branch of the vagus. There is an auricular branch, which has no auditory role but is cutaneous sensory.
This makes sense, because we can certainly sense DC (0Hz) as continuous pressure on our skin. So my feeling (haha) is that sensing this is a higher order function of the brain integrating multiple senses - our hearing and our sense of touch - at LF.
The question remains whether such perceptions are really integrated as audible sound. We do learn to associate stimuli from various senses with each other through experience but, imho, that is not the same as integration into a higher-order function.
 

Wes

Major Contributor
Joined
Dec 5, 2019
Messages
2,771
Likes
2,450
#5
There have been numerous studies on the hearing (and sometimes, production) of infrasonics by non-human animals. Often the mechanism is not studied, only the ability.

https://www.sciencemag.org/news/2012/08/elephants-silent-call

One study that comes to mind found that seals can detect infrasonics from their prey, fish, using the vibrissae (whiskers) on their faces. I have a dim recollection of cats being able to do that too... It suggests that a manly beard might help the hirsute audiophile...

Otherwise, detection of ground vibrations via the lower jaw (mandible) is well established in snakes, and some bones in the reptilian jaw evolved into inner-ear bones, IIRC (Kal?). I used to have a diagram to that effect, but it never made it onto the computer from the overhead-projector version.

If you watch your dog while he has his jaw on the floor you'll see him launch towards the door periodically, likely using the same detection mechanism.

USAF used to use bone conduction for pilots too - maybe still does.
 

Kvalsvoll

Senior Member
Joined
Apr 25, 2019
Messages
362
Likes
690
Location
Norway
#6
Tactile sensation and hearing are not directly connected, but both are part of the perception of the sound we experience.

Very low frequencies are not sensed by exciting resonances in parts of the body, as happens from around 20Hz up well into the midrange. At very low frequencies, what is sensed is the movement of the surface supporting the body, through detection of acceleration.

There is some information in the threads on the data-bass forum; those interested in the topic of tactile sensation can search and find it there.
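
For sinusoidal motion, peak acceleration follows from displacement as a = (2πf)²·x, so even small surface movements at a few hertz produce accelerations the body readily detects. A quick sketch (numbers purely illustrative):

```python
import math

def peak_acceleration(freq_hz: float, peak_displacement_m: float) -> float:
    """Peak acceleration of sinusoidal motion: a = (2*pi*f)^2 * x."""
    return (2 * math.pi * freq_hz) ** 2 * peak_displacement_m

# A floor surface moving +/-1 mm at 5Hz:
print(round(peak_acceleration(5.0, 1e-3), 3))  # ~0.987 m/s^2, roughly 0.1 g
```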
 

Kal Rubinson

Major Contributor
Industry Insider
Joined
Mar 23, 2016
Messages
2,979
Likes
4,383
Location
NYC/CT
#7
Otherwise, detection of ground vibrations via the lower jaw (mandible) is well established in snakes, and some bones in the reptilian jaw evolved into inner-ear bones, IIRC (Kal?).
There are many mechanisms for bone conduction across many species.
 
OP
pozz

Data Ordinator
Forum Donor
Editor
Joined
May 21, 2019
Messages
3,289
Likes
5,120
Thread Starter #9
A compilation of some of the data I've come across for bass equal loudness. Edit: note that the curves are for 0 phon, i.e., zero loudness, also called minimum audible pressure (MAP).

Good luck getting to 1Hz. Psychoacoustically, tonal quality falls apart below about 15Hz. Under that you get rough chugging, whooshing, popping and other noisy sensations, and in the single digits you start locking onto individual pressure peaks of the waveform instead of phase-locking with the period.
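
To put the single-digit range in context, the waveform period is simply 1/f, so each pressure peak becomes an individually resolvable event rather than part of a perceived tone (trivial sketch):

```python
def period_ms(freq_hz: float) -> float:
    """Period of one waveform cycle in milliseconds."""
    return 1000.0 / freq_hz

print(round(period_ms(15.0), 1))  # ~66.7 ms per cycle
print(period_ms(5.0))             # 200.0 ms per cycle
print(period_ms(1.0))             # 1000.0 ms per cycle
```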

[Attached image: 1617891869716.png]


With lines connecting all points (instead of showing nothing if adjacent cells have gaps):

[Attached image: 1617891066707.png]


Whole body testing by Yeowart et al used this pressure chamber:

[Attached image: 1617806243358.png]
 

j_j

Major Contributor
Audio Luminary
Technical Expert
Joined
Oct 10, 2017
Messages
1,020
Likes
1,979
Location
My dining room.
#11
To be clear, sensation at very low frequencies is not auditory at all; it is touch sensation and proprioception.
 

CMOT

Active Member
Joined
Feb 21, 2021
Messages
110
Likes
75
#17
There is no auditory branch of the Vagus. There is an Auricular branch which has no auditory role but is cutaneous sensory.

The question remains about whether such perceptions are really integrated as audible sound. We do learn to associate stimuli from various senses with each other through experience but, imho, that is not the same as integration into a higher order function.
Kal is correct here. There might be some question as to whether such perceptions are integrated into the experience of sound, but from what I know, I doubt it. There is little evidence for auditory neural responses at frequencies below about 20Hz. But that doesn't mean you don't have a coherent experience when the <20Hz sound waves are aligned (let's say amplitude-modulated) with auditorily perceptible sound waves. You experience events, not sound waves, so a particular event might have multisensory components and you experience it as such. It isn't always easy to sort out the specific modality effects. Visual capture of auditory signals is similar - there isn't an auditory response per se, but I wouldn't want to argue that your experience of a complex auditory event isn't influenced by the generating visual action.

Oddly enough, there is pretty good evidence at this point that hearing a familiar human speaker's (not an audio speaker's!) voice leads to visual responses in the brain. So audition is - in at least some cases - connected to vision. And the main researchers on this topic have been careful to rule out elicited auditory imagery.

Kriegstein, K., Kleinschmidt, A., Sterzer, P., & Giraud, A. L. (2005). "Interaction of face and voice areas during speaker recognition". Journal of Cognitive Neuroscience.
 

RayDunzl

Major Contributor
Central Scrutinizer
Joined
Mar 9, 2016
Messages
11,201
Likes
11,242
Location
Riverview FL
#18
The question remains about whether such perceptions are really integrated as audible sound. We do learn to associate stimuli from various senses with each other through experience but, imho, that is not the same as integration into a higher order function.
A recent Stereophile mentioned a deaf person who listens to music by holding a balloon between his palms.
 
OP
pozz

Data Ordinator
Forum Donor
Editor
Joined
May 21, 2019
Messages
3,289
Likes
5,120
Thread Starter #19
I've found a good piece of survey literature on auditory cues specifically: https://www.frontiersin.org/articles/10.3389/fpsyg.2019.03001/full

The upshot is that by isolating auditory cues from other cues (in VR, or by giving the subject headphones and feeding in the sound of their own movements through a microphone) and applying EQ, delay, or other manipulations, experimenters were able to change perceptions such as the subject's sense of their own height, the length of their limbs, and how coarse or smooth something felt, among many other examples. This is not unlike the visual manipulations by Henrik Ehrsson that consistently and easily caused subjects to have out-of-body experiences: https://www.nature.com/news/out-of-body-experience-master-of-illusion-1.9569

I think bass specifically is interesting here because, at least in the literature I've seen, you have vibratory studies on one hand and auditory studies on the other, but nothing like a study of the crossover function: at which levels and frequencies audition lapses and somatosensory perception starts filling in. The infrasound studies I referenced in the equal-loudness graph above used both headphones and pressure chambers. Notably, the sensitivity curves for both change slope from around 20Hz, which may indicate that the mechanism changes.

For anyone interested in the setup, from Yeowart, N. S., Bryan, M., & Tempest, W. (1967). "The monaural M.A.P. threshold of hearing at frequencies from 1.5 to 100 c/s". Journal of Sound and Vibration, 6(3), 335–342.:
[Attached image: 1617887682193.png]

The headphones had a working range of 1Hz–200Hz, were fed by 15-watt amplifiers, and had a maximum output of 146dB SPL. The cups used a B&K 4132 microphone cartridge. The experimenters reported distortion figures, which showed they had at least 30dB–40dB of distortion-free range at any frequency.
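
For anyone wondering what 30dB–40dB of distortion-free range means in percentage terms: a component X dB below the fundamental has 10^(−X/20) of its amplitude (my own back-of-envelope conversion, not figures from the paper):

```python
def db_below_to_percent(db_down: float) -> float:
    """Amplitude of a component X dB below the fundamental, as a percentage."""
    return 100.0 * 10 ** (-db_down / 20)

print(round(db_below_to_percent(30.0), 2))  # ~3.16 %
print(round(db_below_to_percent(40.0), 2))  # ~1.0 %
```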

There are some wires on the subject's face in the picture. Those are surface electrodes for monitoring vestibular nystagmus (involuntary repetitive eyeball movements), since one of the hypotheses (confirmed by the experiment) was that infrasound will disturb the organs responsible for balance.
 

somebodyelse

Major Contributor
Joined
Dec 5, 2018
Messages
1,961
Likes
1,435
#20
Whole body testing by Yeowart et al used this pressure chamber:

That reminds me of a test rig originally used to study responses to the pressure variations expected when high-speed trains pass each other or enter tunnels, where the top of the chamber was a bellows driven by servohydraulics.
 