
16-bit... It really is enough!

UKPI

Guest
At around 1 kHz, human beings can consciously discern about a 2 Hz difference, excluding amplitude.
Considering the effectiveness of lossy audio compression, I don't think this is much of a problem. Also, frequency resolution (ability to record small differences between frequencies in the bandwidth) has nothing to do with sampling rate or bit depth. It is related to the length of the recording.
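
To put a rough number on that last point: the frequency resolution of a plain DFT is 1/T, so it is the length of the analysed snippet that decides whether two tones about 2 Hz apart near 1 kHz can be separated, never the sample rate or the bit depth. A minimal sketch (the snippet lengths are arbitrary illustration values):

for T in (0.1, 1.0):                      # analysed snippet length in seconds
    bin_spacing = 1 / T                   # DFT bin spacing in Hz, independent of fs and bit depth
    verdict = "can separate" if bin_spacing <= 2 else "cannot separate"
    print(f"T = {T} s -> bin spacing = {bin_spacing:g} Hz ({verdict} tones 2 Hz apart)")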

A nerve attached to the inner hair cell can fire on average once every 2 milliseconds. Each inner hair cell has hundreds of such nerves. At what rate do these nerves fire all together? We still don't know, because nobody is willing to have probes stuck in their ears to find out.
So, is that effect audible? After all, that's what matters the most.

With dithering, recording signals whose amplitude is less than one bit is possible. Since quantization noise is already low enough to be inaudible at reasonable playback levels, I'd say that properly dithered 16/44.1 format is enough for almost every playback situation.
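
A quick numerical sketch of that sub-LSB claim (plain NumPy; the 0.4 LSB tone amplitude is just an arbitrary illustration): rounding without dither erases the tone completely, while TPDF dither keeps it, buried in noise.

import numpy as np

fs = 44_100
t = np.arange(fs) / fs
tone = 0.4 * np.sin(2 * np.pi * 1000 * t)        # amplitude 0.4 LSB, i.e. below one bit

plain = np.round(tone)                           # undithered quantizer: rounds to all zeros
tpdf = np.random.rand(fs) - np.random.rand(fs)   # TPDF dither, +/- 1 LSB peak
dithered = np.round(tone + tpdf)                 # noisy, but the tone is preserved on average

for name, q in (("undithered", plain), ("dithered", dithered)):
    # projection onto the original tone: ~0 means the tone is gone, ~1 means it survived
    print(name, round(float(np.dot(q, tone) / np.dot(tone, tone)), 2))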

EDIT: To make myself clear, lossy compression itself is not directly related to the frequency resolution of the recording. Just wanted to point out that lossy compression with psychoacoustic models that have relatively coarse frequency resolution works well enough to shrink the audio signal to one-eighth of the original without noticeable differences.

EDIT2: Grammar cleanup.
 

watchnerd

Grand Contributor
Joined
Dec 8, 2016
Messages
12,449
Likes
10,414
Location
Seattle Area, USA
With dithering, recording signals whose amplitude is less than one bit is possible. With quantization noise already low enough to be inaudible at reasonable playback levels, I'd say that properly dithered 16/44.1 format is enough for almost every playback situation.

Well, also:

Transducers.

There are microphones and speakers at each end of this chain. We can't capture without microphones and can't pipe signal directly to nerve endings.

Oh, also:

2 milliseconds per firing for the inner-hair-cell nerves = 500 Hz

So....yeah.
 

Inner Space

Major Contributor
Forum Donor
Joined
May 18, 2020
Messages
1,285
Likes
2,938
... I do find it humorous how people cling to a 40 year old standard that was chosen for multiple reasons, one of which was the capabilities of electronics at the time.

In retrospect, viewed as dispassionately as possible, the 1982 introduction of CD was astonishing. It turned out to be effectively perfect, in that it can't sensibly be improved upon. How rare is that? How often do long-evolving industries suddenly have a game-over, mike-drop moment?

Which was an oh shit moment for the manufacturers. They realized they had left themselves nowhere to go. It was a crazy mistake. Hence the pointless new formats and marketing nonsense.

It was an unsettling moment for listeners too, in one respect - for a couple of generations, improvements were the name of the game in hi-fi, and most of them were pretty genuine. Especially because most of us were operating on twin axes - as technical progress was made in the labs, we were moving from penniless students to starting-out workers, so that every new thing likely was a real, jaw-dropping upgrade. Those years formed a habit of mind, which was hard for some to abandon. Hence, first, the unthinking assumption that there would be "better digital" one day, and then the endless quest for it.

Problem is, I would expect those fallacies among older folk, because they're the ones who learned the habit of mind. Why have younger folk inherited it? I mean, we surely have posters here, now and then, whose parents were barely out of diapers in 1982, and yet they're on the same mental conveyor belt. Human nature, I suppose.
 

Blumlein 88

Grand Contributor
Forum Donor
Joined
Feb 23, 2016
Messages
20,703
Likes
37,442
Yeah, 16/44 is somewhere between 95% and 100% of the fidelity needed to match the limits of human hearing. It's probably more like 98% for anyone, and 100% for the majority of people.

If there were a substantial difference between 16/44 and 24/96, it would have been shown definitively long ago. The fact that it is still arguable means 16/44 is enough, or enough for most people, or at the very least very, very nearly enough for anyone.

Side comment: yes, we do know how these nerve cells fire. At lower frequencies you get one-to-one firing with the waveform; at higher frequencies (I think around 2 kHz and higher) they fire in volleys, meaning some fire while others rest, and the composite firing rate is much higher than the base timing rate of those cells.
 

Frank Dernie

Master Contributor
Forum Donor
Joined
Mar 24, 2016
Messages
6,452
Likes
15,798
Location
Oxfordshire
Over-engineering can be bad engineering.
Indeed.
I was taught that making something more expensive by giving it greater capacity than needed for its function was stupid and the sort of thing we needed engineers to prevent idiots from doing :)
Keith Duckworth, the mentor in question, often said an engineer is a person who does for 5 bob (slang for shillings, which actually went out of official use in around 1968) what any fool can do for 5 quid (slang for pounds).
 

Frank Dernie

Master Contributor
Forum Donor
Joined
Mar 24, 2016
Messages
6,452
Likes
15,798
Location
Oxfordshire
That being said, I do find it humorous how people cling to a 40 year old standard that was chosen for multiple reasons, one of which was the capabilities of electronics at the time.
It is not the age of a standard that makes it worth hanging on to, but its adequacy.

We still use Fahrenheit for temperature measurement. It is 3 years short of 300 years old and was wrong when it was proposed (he thought 0 was an absolute and 100 was human body temperature, which he measured wrong), but it is adequate for its purpose, despite those 300 years.

Yes, if you wanted a device that could record all sound levels without a level control on the recorder, 20 bits would be needed. For all the music I know of, and using a level control, 16-bit is plenty.
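
For reference, the back-of-envelope bit-depth arithmetic behind that, using the usual 6.02N + 1.76 dB figure for an ideal converter and a full-scale sine:

def ideal_dynamic_range_db(bits: int) -> float:
    # Ideal quantization SNR for a full-scale sine in an N-bit PCM channel.
    return 6.02 * bits + 1.76

for bits in (16, 20, 24):
    print(f"{bits}-bit: ~{ideal_dynamic_range_db(bits):.0f} dB")
# prints roughly 98 dB, 122 dB and 146 dB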

For a film soundtrack, where at one moment there may be birdsong and a babbling brook and later in the film a gun fight, the whole lot would be needed if anybody were daft enough to record the full dynamic range precisely. Almost nobody has a system that could then accurately reproduce it.
So theoretically it could be useful, but in "real life" it never is.
 

KSTR

Major Contributor
Joined
Sep 6, 2018
Messages
2,732
Likes
6,101
Location
Berlin, Germany
We still use Fahrenheit for temperature measurement. It is 3 years short of 300 years old and was wrong when it was proposed (he thought 0 was an absolute and 100 was human body temperature, which he measured wrong), but it is adequate for its purpose, despite those 300 years.
Now that one was good for a laugh; I almost spilled my morning coffee over my laptop.
Only a very small minority of the world's population uses (or ever used) Fahrenheit.
https://sciencing.com/countries-use-celsius-8077428.html
The only exceptions to the quick adoption of metric scales, and thus Celsius, were English-speaking countries that used the imperial system, such as the United Kingdom, India and South Africa. These countries used Fahrenheit, an imperial unit of temperature. However, by the mid-20th century, even these English-speaking countries began to adopt the metric scale, and thus Celsius. India switched in 1954, the U.K. in 1965, and Australia and New Zealand in 1969. Today, only three countries do not use the metric system: the United States, Liberia and Burma.
[...]
Only a few countries use Fahrenheit as their official scale: the United States, Belize, Palau, the Bahamas and the Cayman Islands.
 

Frank Dernie

Master Contributor
Forum Donor
Joined
Mar 24, 2016
Messages
6,452
Likes
15,798
Location
Oxfordshire
Only a very small minority of the world's population uses (or ever used) Fahrenheit.
I know. I haven't used it for over 50 years, but I was fairly sure that was the scale the person to whom I was replying was familiar with :)
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,678
Likes
38,779
Location
Gold Coast, Queensland, Australia
In retrospect, viewed as dispassionately as possible, the 1982 introduction of CD was astonishing. It turned out to be effectively perfect, in that it can't sensibly be improved upon. How rare is that? How often do long-evolving industries suddenly have a game-over, mike-drop moment?

Which was an oh shit moment for the manufacturers. They realized they had left themselves nowhere to go. It was a crazy mistake. Hence the pointless new formats and marketing nonsense.

It was an unsettling moment for listeners too, in one respect - for a couple of generations, improvements were the name of the game in hi-fi, and most of them were pretty genuine. Especially because most of us were operating on twin axes - as technical progress was made in the labs, we were moving from penniless students to starting-out workers, so that every new thing likely was a real, jaw-dropping upgrade. Those years formed a habit of mind, which was hard for some to abandon. Hence, first, the unthinking assumption that there would be "better digital" one day, and then the endless quest for it.

Problem is, I would expect those fallacies among older folk, because they're the ones who learned the habit of mind. Why have younger folk inherited it? I mean, we surely have posters here, now and then, whose parents were barely out of diapers in 1982, and yet they're on the same mental conveyor belt. Human nature, I suppose.

Your best post ever. :)
 

mansr

Major Contributor
Joined
Oct 5, 2018
Messages
4,685
Likes
10,703
Location
Hampshire
Didn't read the whole thread so apologies if someone already pointed this out, but down-sampling and rounding to 16/44.1 is the worst, most naive way to lossily compress the original 24/96 master. The quality is poor compared to what could be done by a modern lossy compression algorithm targeting 24/96 with an output bitrate comparable to flac-encoded 16/44.1.
How do you figure? Lossy compression is better termed perceptual coding, since it uses a model of human perception to encode only that which can actually be perceived, discarding the rest. Some perceptual codecs are quite advanced indeed, taking advantage of psychoacoustic effects such as masking to avoid coding unnecessary information. Now, if you have a recording sampled in 24 bits at 96 kHz, what is the most obvious way to reduce the amount of information without affecting perception? Why, remove everything above 20 kHz, of course. Then remove anything below about -100 dB. These parts are uncontroversially inaudible, and besides, most playback systems can't reproduce them accurately, if at all, anyway.

Alternatively, suppose you want to do a lossy encode of a 24/96 file without discarding (all) the high-frequency content. How do you choose what to keep above 20 kHz when the perceptual model says none of it is audible? What do you throw out instead to make room for it?
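
If a concrete sketch helps, the "obvious" reduction described above comes down to something like this in NumPy/SciPy (the function name, the float-samples-in-[-1, 1) convention and the plain TPDF dither are illustrative assumptions, not a mastering recipe):

import numpy as np
from scipy.signal import resample_poly

def fold_to_16_44(x_96k: np.ndarray) -> np.ndarray:
    # Band-limit by resampling 96 kHz -> 44.1 kHz (the polyphase filter drops
    # everything above ~22 kHz), then requantize to 16 bits with TPDF dither,
    # which puts the noise floor at roughly -96 dBFS.
    y = resample_poly(x_96k, up=147, down=320)        # 96000 * 147/320 = 44100
    lsb = 2.0 ** -15                                  # one 16-bit step for samples in [-1, 1)
    dither = (np.random.rand(len(y)) - np.random.rand(len(y))) * lsb
    return np.clip(np.round((y + dither) / lsb) * lsb, -1.0, 1.0 - lsb)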
 

Wombat

Master Contributor
Joined
Nov 5, 2017
Messages
6,722
Likes
6,463
Location
Australia
Fahrenheit is directly convertible to Centigrade, as is Absolute. What disturbs you?

mansr

Major Contributor
Joined
Oct 5, 2018
Messages
4,685
Likes
10,703
Location
Hampshire
Fahrenheit is directly convertible to Centigrade, as is Absolute. What disturbs you?
The Fahrenheit scale is silly because it is ill-defined. He had the right idea about basing it off absolute zero, then totally botched finding that point. At the very least, he could have checked with someone a few miles north. The other reference point is even worse. Aside from the conceit of using the human body temperature, this isn't exactly stable. There is variation both between individuals and over time. And then he got that one wrong too. A curious fact is that the average temperature of humans has actually been decreasing steadily for the last 150 years or so, and nobody knows why.

As someone else said, Fahrenheit was the scientist, the temperature scale is Fahrenheit's monster.
 

Count Arthur

Major Contributor
Joined
Jan 10, 2020
Messages
2,231
Likes
5,004

Never mind all that; what happened to the flying cars, the 3-day week and the jet packs we thought we'd have by now?
 

Wombat

Master Contributor
Joined
Nov 5, 2017
Messages
6,722
Likes
6,463
Location
Australia
The Fahrenheit scale is silly because it is ill-defined. He had the right idea about basing it off absolute zero, then totally botched finding that point. At the very least, he could have checked with someone a few miles north. The other reference point is even worse. Aside from the conceit of using the human body temperature, this isn't exactly stable. There is variation both between individuals and over time. And then he got that one wrong too. A curious fact is that the average temperature of humans has actually been decreasing steadily for the last 150 years or so, and nobody knows why.

As someone else said, Fahrenheit was the scientist, the temperature scale is Fahrenheit's monster.

There is a Fahrenheit standard regardless of its beginnings. I am not saying it is better, but it can be (cross-)referenced against the other standards.
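
True enough; the cross-referencing is just a pair of linear maps, e.g.:

def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32.0) * 5.0 / 9.0

def fahrenheit_to_kelvin(f: float) -> float:
    # via Celsius; kelvin is the absolute scale referred to above
    return fahrenheit_to_celsius(f) + 273.15

print(fahrenheit_to_celsius(212.0), fahrenheit_to_kelvin(32.0))   # 100.0 273.15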
 

EB1000

Senior Member
Joined
Jan 8, 2020
Messages
484
Likes
579
Location
Israel
Dithering is done to reduce quantization error, which reduces intermodulation distortion. The problem with 24-bit recording is that most recordings do not take full advantage of the 24-bit data, because they are recorded about 20 dB below full scale, which loses dynamic range. Also, due to the noise floor, no DAC can reach beyond about 21 bits of SINAD, making 24-bit recordings sound no better than 16-bit recordings...
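
Putting rough numbers on that (ideal-converter arithmetic only; the ~21-bit SINAD ceiling is the figure claimed above, not something derived here):

def ideal_db(bits: int) -> float:
    # Ideal quantization SNR of an N-bit PCM channel with a full-scale sine.
    return 6.02 * bits + 1.76

print(f"24-bit ideal:               ~{ideal_db(24):.0f} dB")
print(f"recorded 20 dB below FS:    ~{ideal_db(24) - 20:.0f} dB actually exercised")
print(f"~21-bit DAC SINAD ceiling:  ~{ideal_db(21):.0f} dB")
print(f"16-bit ideal:               ~{ideal_db(16):.0f} dB")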
 

Raindog123

Major Contributor
Joined
Oct 23, 2020
Messages
1,599
Likes
3,555
Location
Melbourne, FL, USA
The Fahrenheit scale is silly because it is ill-defined


@mhardy6647 just posted this in Humor:

[attached image]
 

magicscreen

Senior Member
Joined
May 21, 2019
Messages
300
Likes
177
So your opinion is that 16/44 is enough.
Here is the bad news: you are a brainwashed victim of the CD audio industry.
 

nimar

Active Member
Forum Donor
Joined
Jan 25, 2021
Messages
213
Likes
216
Location
Ontario, Canada
So your opinion is that 16/44 is enough.
Here is the bad news: you are a brainwashed victim of the CD audio industry.
Care to elaborate? In what way do we _need_ more than 16/44.1 for playback?

I think people fall victim to the assumption that more is better, as is the case in other realms.

E.g. digital photography, which interestingly is the analogy Amir used to describe quantisation issues. Over the years, sensors have increased in both the number of pixels (which people equate to sample rate) and bit depth (dynamic range in both cases).

Dynamic range is primarily useful in the editing process, as it allows one to recover areas that look too dark or too bright, but the final image will have a total dynamic range less than the total available range, especially if reproduced in analogue (e.g. printed), as papers and inks can't reproduce the full dynamic range the sensor can capture.

Pixels are again primarily useful in editing, allowing increased flexibility in cropping or reproduction in a large format, though the analogy falls apart for audio: there are no pixels in sound and no parallel for large-format reproduction. Higher sample rates allow you to capture higher-frequency sounds, but these are sounds we cannot hear. This would be like adding infrared capture to cameras and expecting them to reproduce infrared images, images we couldn't see.

If you want to argue that 192 kHz sample rates are interesting because we can record audio beyond our hearing range and modulate it down into our hearing range, then sure; that is like using an infrared camera today, which doesn't reproduce an infrared image but rather a version of the image in visible light. These super-high frequencies could then be modulated down and reproduced happily in 16/44.1.
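
For what it's worth, that "modulate it down" idea is essentially what a heterodyne bat detector does. A minimal NumPy sketch (the 192 kHz capture rate, 30 kHz test tone and 25 kHz oscillator are arbitrary illustration values):

import numpy as np

fs = 192_000                                     # assumed capture rate
t = np.arange(fs) / fs                           # one second of samples
ultra = np.sin(2 * np.pi * 30_000 * t)           # 30 kHz tone: real but inaudible

# Heterodyne: mixing with a 25 kHz oscillator shifts 30 kHz down to 5 kHz (and up to 55 kHz)
mixed = ultra * np.cos(2 * np.pi * 25_000 * t)

# Crude low-pass in the frequency domain keeps only the audible 5 kHz image,
# which could then be resampled and stored happily at 16/44.1.
spectrum = np.fft.rfft(mixed)
freqs = np.fft.rfftfreq(len(mixed), d=1 / fs)
spectrum[freqs > 20_000] = 0
audible = np.fft.irfft(spectrum, n=len(mixed))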
 