
Is there any music that actually requires 24 bits for replay?

Assuming you are not talking about doing these changes in a DAW, as it would immediately flip to floating point.

Well, exactly - a DAW doesn't use 16-bit processing, precisely to avoid the build-up of audible errors I described above.

Also you should only quantize at the end of all those manipulations not at every step.
That's the problem: if you are using integer or fixed-point maths, quantisation inevitably happens on the result of EVERY calculation with a non-integer result. As soon as you have to store the result in a limited bit-depth container (such as a 16-bit container if doing 16-bit math), quantisation occurs.

And the way CPUs do math is that they take (Operand 1) and (Operand 2) (say two 16-bit values if doing 16-bit math), apply an operator between them, and then store the result back in a 16-bit storage location. As soon as the result is stored, it must also be quantised. That will be done many, many times to implement even a single parametric filter.
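To make that concrete, here's a toy sketch (illustrative Python only, not any real DSP code) of the same gain change done two ways: in 16-bit fixed point, where a rounding step happens every time a result is stored, and in floating point, where rounding happens once at the end:

```python
def fixed_point_chain(sample):
    # 16-bit fixed point: every intermediate result must be stored back
    # into an integer container, so it is quantised (rounded) every time.
    s = round(sample * 0.1)   # -20 dB gain, quantised on store
    s = round(s * 10.0)       # +20 dB gain, quantised on store
    return max(-32768, min(32767, s))

def float_chain(sample):
    # Floating point: intermediates keep their precision; we quantise
    # only once, when the final result is stored.
    s = sample * 0.1
    s = s * 10.0
    return round(s)

# The fixed-point chain throws away the low-order bits at the first
# rounding and can never get them back; the float chain recovers the
# original value exactly.
print(float_chain(12345))        # 12345
print(fixed_point_chain(12345))  # not 12345
```

A real parametric filter does far more operations per sample than this, so in 16-bit fixed point the rounding error compounds accordingly.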
 
Still not particularly clear on this.

Playback only. Room EQ, DSP and digital volume attenuation...
16bit...24bit.

Possible audible differences?
Or, no?
All the DSP will be done at 24-bit or higher resolution. Volume might be 24-bit fixed point in simple devices, but DSP will normally be at least 32-bit floating point.

Resolution of the source material doesn't matter. What matters is the resolution of the maths.

Take Red Book CD audio and put it into a miniDSP: it will be converted to 32-bit floating point, processed, then converted back to integer PCM before being sent to the DAC.
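As a rough sketch of that chain (the actual miniDSP internals aren't public; this just illustrates the idea of int PCM → float → process → int PCM, in Python):

```python
def int16_to_float(samples):
    # Scale 16-bit PCM into the nominal float range [-1.0, 1.0).
    return [s / 32768.0 for s in samples]

def float_to_int16(samples):
    # Quantise back to 16-bit PCM, clamping to the legal range.
    return [max(-32768, min(32767, round(s * 32768.0))) for s in samples]

def dsp(samples, gain=0.5):
    # Stand-in for the processing stage (here just a -6 dB volume change).
    return [s * gain for s in samples]

pcm_in = [0, 1000, -1000, 32767, -32768]
pcm_out = float_to_int16(dsp(int16_to_float(pcm_in)))
print(pcm_out)  # [0, 500, -500, 16384, -16384]
```

The only quantisation in the whole chain happens once, at the final conversion back to integer PCM.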
 
Seems to me to be a bit more nuanced when DSP and volume attenuation is involved.
So, like the lossy vs lossless debate, I am going to make my own judgement, and keep getting 24bit when available.
Again - DSP doesn't care about the resolution of the source file. Whatever resolution it is will be converted to the processing resolution (normally 32-bit floating point). If the resolution of the source is lower than that - e.g. 16 bit - then its full (audible) resolution will be maintained through the DSP math.
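That claim is easy to check exhaustively (Python sketch, with float32 emulated via `struct` since Python's own floats are 64-bit): every possible 16-bit sample value survives the trip into 32-bit float and back without loss, because a float32 mantissa (24 bits) can hold any 16-bit integer exactly.

```python
import struct

def to_f32(x):
    # Round a Python (64-bit) float to 32-bit float precision.
    return struct.unpack('f', struct.pack('f', x))[0]

# Round-trip every 16-bit sample value through 32-bit float.
lossless = all(round(to_f32(s / 32768.0) * 32768.0) == s
               for s in range(-32768, 32768))
print(lossless)  # True
```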
 
At the recording stage, 24 gives you more dynamic range headroom, making it more of a 'set it and forget it' option than 16. IOW, typically not required if you know the maximum level you'll be recording and can set your recorder accordingly, but useful generally.
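The usual rule of thumb behind that headroom claim, which this little sketch just computes: roughly 6.02 dB of quantisation-limited dynamic range per bit, plus about 1.76 dB for a full-scale sine.

```python
def dynamic_range_db(bits):
    # ~6.02 dB per bit + ~1.76 dB (full-scale sine vs. quantisation noise)
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(bits, round(dynamic_range_db(bits), 1))
# 16 bits -> ~98 dB, 24 bits -> ~146 dB. That extra ~48 dB is the
# "set it and forget it" margin at the recording stage.
```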

If you perform lots of production DSP, as became typical long ago, and you do it at 16 bits, you risk accumulating quantization errors to the point of audibility. So do production (digital processing) at 24 or 32 bits.

When you finally render down to 16 from that, you *might* be able to rely on self-dither from noisy sources, but why not just be safe and dither rather than truncate (truncation can be audible at fades and 'silences')?
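A toy illustration of that last point (assumption: Python, with 24-bit values reduced to 16 by discarding 8 bits, so one 16-bit LSB is a step of 256; not mastering-grade code). Truncation collapses a quiet fade onto a couple of hard levels (correlated distortion), while TPDF dither trades that for benign noise:

```python
import math, random

STEP = 256  # 2**(24 - 16): one 16-bit LSB expressed in 24-bit units

def truncate(sample24):
    # Plain truncation: just drop the bottom 8 bits.
    return (sample24 // STEP) * STEP

def tpdf_dither(sample24, rng):
    # TPDF dither: add triangular noise spanning +/-1 LSB, then round.
    noise = (rng.random() + rng.random() - 1.0) * STEP
    return round((sample24 + noise) / STEP) * STEP

# A sine well below one 16-bit LSB in amplitude -- a fade-out tail.
quiet = [int(100 * math.sin(2 * math.pi * 440 * n / 48000))
         for n in range(4800)]

rng = random.Random(0)
# Truncation maps the whole signal onto just two hard levels; dither
# spreads it across several levels whose average still tracks the sine.
print(sorted(set(truncate(s) for s in quiet)))   # [-256, 0]
print(len(set(tpdf_dither(s, rng) for s in quiet)) > 2)
```

The truncated version is pure gritty distortion correlated with the signal; the dithered version decorrelates the error into a steady noise floor, which is exactly why you dither rather than truncate at fades.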

Please don't be pedantic.
Yeah, sorry, I was being just that: pedantic about one small portion of your entire point :)
 
What the heck is he hitting with that giant mallet??? Seems like he is aiming for that drum player! :D
Nah, looks that way but I think he's aiming for the box, which is the "drum" part of that peculiar percussion instrument.

I think that's the BSO visiting Tokyo in 2022. The folk in the choir seats are either true stoics or not fully aware of what's about to happen.

I have a video made by a bone player friend of my friend in the Porto Sym. of a rehearsal. During some rest bars he gets up to turn on the video camera in his phone, returns to his seat, plays some big bone notes and then the hammer drops. It's great to have video from so close to the action showing what orch players have to deal with. I'd love to share it but the consents are complex and unlikely to be granted.
 
. There were a lot of questionable transfers from analog masters,
A myth that won't die. The Stereophile review criticises the recording and mastering of the supplied demo discs, it is not mentioned anywhere that they are transfers from analogue masters.

Thousands of total amateurs have made digital copies of their LP records with no issues whatsoever. Seems unlikely that professionals had any problems. It is not hard to do it properly.

Many people including myself actively seek out early CD releases of analogue recordings as they are usually the best version.
 
A myth that won't die. The Stereophile review criticises the recording and mastering of the supplied demo discs, it is not mentioned anywhere that they are transfers from analogue masters.

They didn't need to.

A cursory survey of the release timeline of DDD classical albums would tell you what you need to know. Stereophile's review was published in early 1983, with a bunch of followups from the much-missed JGH in which he slags the terrible quality of nearly all of the CDs (classical) he and others around him had access to over the next year or so. AAD/ADD was the norm for a long time--I very clearly recall looking high and low for a "pure" DDD recording to feed my Kenwood CD player in 1984.

Telarc had done a few digital recordings early on (I later owned a copy of their first effort: the Holst Suites for Military Band), and in retrospect it was mostly pretty gimmicky (like their notorious "digital cannon" 1812 Overture, which I once heard through four(!) Klipschorns driven by a pair of H/K Citation amps in late 1985 or so).

Thousands of total amateurs have made digital copies of their LP records with no issues whatsoever. Seems unlikely that professionals had any problems. It is not hard to do it properly.

They didn't do it in the early 80s, and they didn't do it with inflation-adjusted $150k digital recorders as "total amateurs".

Many people including myself actively seek out early CD releases of analogue recordings as they are usually the best version.

How early? Best in what way? I haven't heard anyone in the recording world with this opinion.
 
Mahler 6. Let us give thanks we don't have to endure this every day.

[attached image]
Quite interesting. I have never heard this done LIVE, but I'm obviously quite familiar with Mahler and his symphonies. I did quite recently see an orchestra, though, and I am still astonished every time I see a classical concert by the sheer power of the big bass drum!!

It is not so much LOUD, or anything like what my system at home does, but more BIG and ROOM FILLING. Even though it's a fairly BIG room!

I always leave feeling that most audio systems can only "hint" at the size and scale of an actual orchestra, and its room-filling ease.
 
Again - DSP doesn't care about the resolution of the source file. Whatever resolution it is will be converted to the processing resolution (normally 32-bit floating point). If the resolution of the source is lower than that - e.g. 16 bit - then its full (audible) resolution will be maintained through the DSP math.

Thanks for your explanations. Possibly my (unfounded) concern originates from early days of DSP, when it was a bit more basic. Not sure.

Anyhoo, good to know that it is not an issue these days.
 
They didn't need to.

A cursory survey of the release timeline of DDD classical albums would tell you what you need to know. Stereophile's review was published in early 1983, with a bunch of followups from the much-missed JGH in which he slags the terrible quality of nearly all of the CDs (classical) he and others around him had access to over the next year or so. AAD/ADD was the norm for a long time--I very clearly recall looking high and low for a "pure" DDD recording to feed my Kenwood CD player in 1984.

Telarc had done a few digital recordings early on (I later owned a copy of their first effort: the Holst Suites for Military Band), and in retrospect it was mostly pretty gimmicky (like their notorious "digital cannon" 1812 Overture, which I once heard through four(!) Klipschorns driven by a pair of H/K Citation amps in late 1985 or so).



They didn't do it in the early 80s, and they didn't do it with inflation-adjusted $150k digital recorders as "total amateurs".



How early? Best in what way? I haven't heard anyone in the recording world with this opinion.
Nimbus were releasing Digital recordings early enough.
One of my favourites is Brass Bacchanals by the Equale Brass Ensemble from 1982. One of the first CDs I bought and still much enjoyed.
Nobody had considered dynamic range compression as a "thing" for CD as opposed to LP back then.
How times have changed!
 
Nimbus were releasing Digital recordings early enough.
One of my favourites is Brass Bacchanals by the Equale Brass Ensemble from 1982. One of the first CDs I bought and still much enjoyed.
Nobody had considered dynamic range compression as a "thing" for CD as opposed to LP back then.
How times have changed!
Early digital recordings were done at various sample rates; I seem to recall that the Mitsubishi recorder sampled at 50 kHz. About the only plausible reason I can think of why early digital recordings may have sounded worse than later ones is that sample rate conversion to 44.1 kHz back in the early 1980s wasn't as precise as now. Some conversion was done in the analogue domain, by DAC-ADC, which even then was audibly transparent. I remember evaluating ADC-DAC pairs in the mid 1990s, some 10 years after CD's launch, by putting them in the monitoring chain and blind testing whether it was possible to tell if the ADC-DAC or DAC-ADC was in-line or bypassed. It wasn't.

Of course, Nimbus and other LPs recorded digitally wouldn't have needed sample rate conversion, just DAC into the cutting lathes.

S.
 
Nimbus were releasing Digital recordings early enough.
One of my favourites is Brass Bacchanals by the Equale Brass Ensemble from 1982. One of the first CDs I bought and still much enjoyed.
Nobody had considered dynamic range compression as a "thing" for CD as opposed to LP back then.
How times have changed!
I had (now long gone) many articles from the early days of CD and vinyl, going on and on about how limited vinyl was and its dozen issues that, IF SOLVED, would make music more transparent and HI-FI.
 
It's a game changer in field recording, sound for picture, etc. where your possible input levels are all over the place. It's common to record a "safety" track at lower input gain to insure against this, but with 24 bits or more you don't need to worry nearly as much.

Even with 24 bits, people recorded safety tracks; 32-bit float was the game changer. But that's because 32-bit float is not stored in the same fashion as lower bit depths - it's a completely different type of encoding.
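For anyone curious why the encoding matters, here's a small sketch (Python, unpacking IEEE-754 bit fields with `struct`): in fixed point, full scale is a hard ceiling and anything above it clips, but in float32 each sample carries its own exponent, so a level far above nominal full scale is still stored cleanly.

```python
import struct

def f32_fields(x):
    # Unpack an IEEE-754 float32 into its (sign, exponent, mantissa) bits.
    bits = struct.unpack('<I', struct.pack('<f', x))[0]
    return bits >> 31, (bits >> 23) & 0xFF, bits & 0x7FFFFF

# Nominal full scale (+/-1.0) in float32:
print(f32_fields(1.0)[1])     # biased exponent at full scale: 127

# A sample 1000x (= +60 dB) over nominal full scale -- in fixed point
# this would clip hard; in float32 the exponent field simply grows:
print(f32_fields(1000.0)[1])  # 136 (= 127 + 9, since 1000 ~ 1.95 * 2**9)
```

That per-sample exponent is why a 32-bit float field recorder can capture an unexpectedly hot source and let you pull the level down later without damage.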
 
They didn't need to.

A cursory survey of the release timeline of DDD classical albums would tell you what you need to know. Stereophile's review was published in early 1983, with a bunch of followups from the much-missed JGH in which he slags the terrible quality of nearly all of the CDs (classical) he and others around him had access to over the next year or so. AAD/ADD was the norm for a long time--I very clearly recall looking high and low for a "pure" DDD recording to feed my Kenwood CD player in 1984.

Telarc had done a few digital recordings early on (I later owned a copy of their first effort: the Holst Suites for Military Band), and in retrospect it was mostly pretty gimmicky (like their notorious "digital cannon" 1812 Overture, which I once heard through four(!) Klipschorns driven by a pair of H/K Citation amps in late 1985 or so).



They didn't do it in the early 80s, and they didn't do it with inflation-adjusted $150k digital recorders as "total amateurs".



How early? Best in what way? I haven't heard anyone in the recording world with this opinion.
I don't know what the deal was with classical recordings but for other genres there were digital recordings pre-dating consumer digital.

The Stereophile criticisms of the sample CDs are poor microphones and mic placement, and too much compression. Nothing to do with the transfer from analogue tape. We don't know for sure if they were analogue or digital recordings as it isn't mentioned.

Original rock/pop CD releases are often flat transfers from a master tape, made when the tapes were not so old (depends on the album ofc). The benefit is higher dynamic range. I thought this was fairly well known, maybe not.


(I have no idea if this is also true of classical music; I don't own any classical CDs. However, early CD take-up is supposed to have been driven by classical music aficionados, so it would seem most were not unimpressed by the quality of the catalogue on offer.)

I think the myth was started by people unimpressed by their first experiences with CD back in the 1980s and looking for a reason why, given that they are now fine with 'modern' digital. However, having revisited early CD players and early CD issues since, I now suspect this had nothing to do with the tech or the transfers, but was instead down to the systems being used at the time (poorly designed speakers and/or amplifier overload/clipping).
 
Even with 24 bits, people recorded safety tracks; 32-bit float was the game changer. But that's because 32-bit float is not stored in the same fashion as lower bit depths - it's a completely different type of encoding.

True, I was being sloppy in vaguely waving at "... or more".

I was thinking further back to the ADAT/Tascam DA/etc. tape and early hard disc recorder days when they recorded at 16-20 bits. "Jagged Little Pill" was recorded on blackface ADATs @ 48k/16, for instance. There's not much room for error! Mixing was of course done on analog desks, commonly to DAT at the same sample rate & bit depth. I mixed a friend's album that way in 1997 or so.
 
No: if you are using integer or fixed-point maths, quantisation inevitably happens on the result of EVERY calculation with a non-integer result. As soon as you have to store the result in a limited bit-depth container (such as a 16-bit container if doing 16-bit math), quantisation occurs.

And the way CPUs do math is that they take (Operand 1) and (Operand 2) (say two 16-bit values if doing 16-bit math), apply an operator between them, and then store the result back in a 16-bit storage location. As soon as the result is stored, it must also be quantised. That will be done many, many times to implement even a single parametric filter.
Agreed, but in mastering you would never be in integer; you would always be in floating point. All good though, I hear what you are saying.
 
Early digital recordings were done at various sample rates; I seem to recall that the Mitsubishi recorder sampled at 50 kHz. About the only plausible reason I can think of why early digital recordings may have sounded worse than later ones is that sample rate conversion to 44.1 kHz back in the early 1980s wasn't as precise as now. Some conversion was done in the analogue domain, by DAC-ADC, which even then was audibly transparent. I remember evaluating ADC-DAC pairs in the mid 1990s, some 10 years after CD's launch, by putting them in the monitoring chain and blind testing whether it was possible to tell if the ADC-DAC or DAC-ADC was in-line or bypassed. It wasn't.

Of course, Nimbus and other LPs recorded digitally wouldn't have needed sample rate conversion, just DAC into the cutting lathes.

S.
I've got a Denon digital recording from 1975. DDD. 13 bits. Sounds great.
 
(I have no idea if this is also true of classical music; I don't own any classical CDs. However, early CD take-up is supposed to have been driven by classical music aficionados, so it would seem most were not unimpressed by the quality of the catalogue on offer.)

I think the myth was started by people unimpressed by their first experiences with CD back in the 1980s and looking for a reason why, given that they are now fine with 'modern' digital. However, having revisited early CD players and early CD issues since, I now suspect this had nothing to do with the tech or the transfers, but was instead down to the systems being used at the time (poorly designed speakers and/or amplifier overload/clipping).
On the contrary, I think most classical music listeners were generally pleased by CD - I certainly was. The complaints came from (admittedly the majority of) people who listen to pop music.

My theory was that most fashionable record-player systems were pretty rolled off at the high end and the CD players were not. Also, contemporary preamps maybe had input sensitivities suited to the 200 mV output of line-level kit like tuners, while CD players had 2 V output, which may have clipped the input of some of them.

CD was a good step forward from day 1 for classical music IME.
 