
High Resolution Audio?

edechamps

Addicted to Fun and Learning
Forum Donor
Joined
Nov 21, 2018
Messages
910
Likes
3,620
Location
London, United Kingdom
Is there any DAC that has 24-bit precision at 40 kHz?

No. The best @amirm ever measured, IIRC, is -122 dBFS in the Okto DAC8, or about 20.3 bits. Even @amirm's laboratory-grade state-of-the-art audio analyser wouldn't be able to measure 24 bits. My understanding is that analog electronics can't do much better than 20 bits because you start running into fundamental limits such as Johnson noise in resistors.
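The bit figures above follow from the standard relation between SNR and bit depth for an ideal quantizer, SNR ≈ 6.02·N + 1.76 dB. A quick sanity check (my own illustration, no new measurements here):

```python
# Converting a measured noise floor in dBFS to "effective bits" using the
# ideal-quantizer relation SNR = 6.02*N + 1.76 dB.

def effective_bits(snr_db: float) -> float:
    """Effective number of bits (ENOB) for a given SNR in dB."""
    return (snr_db - 1.76) / 6.02

print(122 / 6.02)           # ~20.3: the rough "divide by 6 dB per bit" figure
print(effective_bits(122))  # ~20.0 once the 1.76 dB offset is included
print(6.02 * 16 + 1.76)     # ~98 dB: ideal SNR of 16-bit
print(6.02 * 24 + 1.76)     # ~146 dB: ideal SNR of 24-bit, beyond analog limits
```

Either way, a -122 dBFS noise floor lands around 20 bits, well short of 24.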

Yes, but is there precision in the frequency? Is it always the same across the whole frequency range?

Yes, because the DAC is driven by a single clock. As far as I know, there is no reason to believe the relative frequency error would be dependent on the frequency itself.
 

Blumlein 88

Grand Contributor
Forum Donor
Joined
Feb 23, 2016
Messages
20,524
Likes
37,057
I once built a passive headphone adapter to align various headphones' levels to the speaker level. I asked myself how far apart the level steps could be and did some experiments. Whereas a 1 dB level difference was just distinguishable for me, 2 dB was on the safe side.
Maybe it depends on training, but I would not consider myself able to distinguish a 0.2 dB level difference...
You couldn't. The smallest discernible difference, from memory, is 0.8 to about 1.2 dB in volume depending upon frequency. That is why matching levels by ear isn't enough. Even though you will never hear a 0.25 dB level difference as a level difference, that difference is enough in blind testing to let you hear the louder one as different very reliably.

I used to use passive switched-resistor volume controls, and messed with them quite a bit. 2 dB steps weren't horrible, but they weren't small enough. 1.5 dB was usually good enough that I wouldn't complain much. 1 dB steps were as good as continuously variable volume to me.
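For anyone who wants to put numbers on those steps: a level difference in dB maps to an amplitude ratio of 10^(dB/20). A small sketch (my own, not from any post above):

```python
# Level differences in dB expressed as linear amplitude ratios.

def amplitude_ratio(db: float) -> float:
    """Linear amplitude ratio corresponding to a level difference in dB."""
    return 10 ** (db / 20)

for step in (0.25, 1.0, 1.5, 2.0):
    print(f"{step} dB -> x{amplitude_ratio(step):.3f} amplitude")
# 0.25 dB is only about a 2.9% amplitude change, yet it can still bias
# a blind comparison toward the louder sample.
```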
 

solderdude

Grand Contributor
Joined
Jul 21, 2018
Messages
15,891
Likes
35,912
Location
The Neitherlands
@edechamps @Blumlein 88
I think no DAC can exploit all the data of high-res files at the moment. So first we should find a true 24-bit DAC that has 24-bit precision up to fs/2, which means at least up to 40 kHz.
Is there any DAC that has 24-bit precision at 40 kHz?

The answer is yes, have a look here:

As Blumlein88 already explained... the resolution is there (DS) BUT the noise is louder.
That doesn't mean 24-bit precision is not reached; it is just buried under noise.
With R2R using the same method... nope, the tolerances of the largest resistor values are not tight enough for this.
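The "resolution is there but buried under noise" point can be shown with a toy first-order error-feedback noise shaper (my own sketch, not any particular DAC's modulator): a coarse quantizer still encodes levels far below its step size, because the quantization error is pushed into noise that a low-pass filter (or simple averaging) removes.

```python
# Toy first-order noise shaping: quantize with a huge step, feed the
# quantization error back into the next sample, and the *average* of the
# output still resolves values far below one step.

def noise_shape(samples, step=1.0):
    out, err = [], 0.0
    for x in samples:
        v = x + err                  # add back the previous quantization error
        q = step * round(v / step)   # coarse quantizer
        err = v - q                  # error carried to the next sample
        out.append(q)
    return out

shaped = noise_shape([0.3] * 10000)  # DC level of 0.3, quantizer step 1.0
print(sorted(set(shaped)))           # only 0.0 and 1.0 ever come out
print(sum(shaped) / len(shaped))     # yet the average is ~0.3
```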
 

Calexico

Senior Member
Joined
May 21, 2019
Messages
358
Likes
72
The answer is yes, have a look here:

As Blumlein88 already explained... the resolution is there (DS) BUT the noise is louder.
That doesn't mean 24-bit precision is not reached; it is just buried under noise.
With R2R using the same method... nope, the tolerances of the largest resistor values are not tight enough for this.
To me, no DAC is yet able to render 24-bit/96 kHz perfectly. So before asking if the format is good, we should find a DAC that can play it without losing any data (audible or not).
If the DAC renders 16/44.1 with the same amount of data as 24/192, then there is no need for 24/192 files.
Also, if the ADC cannot record true 24-bit precision at 40 kHz, then there is no need to encode in high-res (except for mixing and mastering).
 

Calexico

Senior Member
Joined
May 21, 2019
Messages
358
Likes
72
Resolution and Accuracy are terms that are often interchanged when the performance of an ADC is discussed. It is important to note that Resolution does not imply Accuracy, nor does Accuracy imply Resolution.
The resolution of an ADC is determined by the number of bits it uses to digitize an input signal. For a 16-bit device the total voltage range is represented by 2^16 (65,536) discrete digital values or output codes. Therefore the absolute minimum level that a system can measure is represented by 1 bit, or 1/65,536th of the ADC voltage range.
The accuracy of the A/D converter determines how close the actual digital output is to the theoretically expected digital output for a given analog input. In other words, the accuracy of the converter determines how many bits in the digital output code represent useful information about the input signal.
As explained earlier, for a 16-bit ADC the actual accuracy may be much less than the resolution because of internal or external error sources. So, for example, a given 16-bit ADC may only provide 12 bits of accuracy. In this case, the 4 LSbs (Least Significant Bits) represent random noise produced in the ADC.
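Putting the quoted resolution-vs-accuracy text into numbers (the 4.096 V full-scale range is an assumed example value, chosen only for round numbers; it is not from the text):

```python
# Resolution vs. accuracy for the quoted 16-bit-resolution / 12-bit-accuracy
# example. Full-scale range of 4.096 V is assumed for round numbers.

full_scale = 4.096                      # volts (illustrative)
bits_resolution = 16
bits_accuracy = 12

lsb = full_scale / 2**bits_resolution   # smallest code step (the *resolution*)
noise_band = lsb * 2**(bits_resolution - bits_accuracy)  # 4 noisy LSBs = 16 codes

print(lsb * 1e6)         # 62.5 (microvolts per code)
print(noise_band * 1e3)  # 1.0  (millivolt band the codes wander over: the accuracy limit)
```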
 

edechamps

Addicted to Fun and Learning
Forum Donor
Joined
Nov 21, 2018
Messages
910
Likes
3,620
Location
London, United Kingdom
To me, no DAC is yet able to render 24-bit/96 kHz perfectly

There are definitely DACs that can do 20-bit/96 kHz, though. If you can't hear the difference between 16-bit/44.1 kHz and 20-bit/96 kHz, why would you expect to hear the difference with 24-bit/96 kHz?

for a 16-bit ADC the actual accuracy may be much less than the resolution because of internal or external error sources. So, for example, a given 16-bit ADC may only provide 12 bits of accuracy. In this case, the 4 LSbs (Least Significant Bits) represent random noise produced in the ADC.

Yes. This is what DAC measurements (such as @amirm's), such as linearity and dynamic range, are for: to avoid the poorly engineered DACs that can't reach the advertised bit depth in practice. Most devices, even cheap ones, will be able to do "true" 16-bit, though. That's pretty much a solved problem.
 

Calexico

Senior Member
Joined
May 21, 2019
Messages
358
Likes
72
There are definitely DACs that can do 20-bit/96 kHz, though. If you can't hear the difference between 16-bit/44.1 kHz and 20-bit/96 kHz, why would you expect to hear the difference with 24-bit/96 kHz?



Yes. This is what DAC measurements (such as @amirm's), such as linearity and dynamic range, are for: to avoid the poorly engineered DACs that can't reach the advertised bit depth in practice. Most devices, even cheap ones, will be able to do "true" 16-bit, though. That's pretty much a solved problem.
No one has proven that any DAC has 24-bit precision at 40 kHz. Precision at 1 kHz is far easier for a DAC.
I'm just saying that if DACs cannot exploit the high-res format, there is no need to ask questions about high-res. It's like asking whether watching 4K is better than Full HD on a Full HD TV.
 

solderdude

Grand Contributor
Joined
Jul 21, 2018
Messages
15,891
Likes
35,912
Location
The Neitherlands
Why on earth would you need 24-bit precision at 40 kHz?
You did not find the measurements proof enough?
You can clearly see each 'step' at LSB precision... why would that differ at 40 kHz?
 

Krunok

Major Contributor
Joined
Mar 25, 2018
Messages
4,600
Likes
3,065
Location
Zg, Cro
Why on earth would you need 24-bit precision at 40 kHz?
You did not find the measurements proof enough?
You can clearly see each 'step' at LSB precision... why would that differ at 40 kHz?

Guys, it's a pity you can't win this battle as you are really trying hard! :D
The truth is he's simply not getting it, no matter which way you try to explain it to him...
 

Calexico

Senior Member
Joined
May 21, 2019
Messages
358
Likes
72
Guys, it's a pity you can't win this battle as you are really trying hard! :D
The truth is he's simply not getting it no matter which way you try to explain it to him..
@solderdude
I'm not saying you need that.
I'm saying that high-res provides this precision, but if no DAC can use it, there is no need to ask whether the format is good or not, as no DAC can render the extra precision the high-res format provides. (At least not fully.)

First we should find the limit of the DAC.
If the format gives more than that limit, there is no need for the format.

It's known that delta-sigma loses precision as the frequency gets higher.
Modern ones seem to do well up to 20 kHz, but how do they do at 40 kHz?
 

kevinh

Senior Member
Joined
Apr 1, 2019
Messages
337
Likes
273
So I am 66, and the idea that I can't hear the difference between a 320k MP3 and a hi-res file is sort of nice.

I find that I really enjoy browsing YouTube. I listen to rock (things like the Grateful Dead and the Allman Brothers), a lot of jazz (Marcus Miller, Miles, Coltrane, Hancock, Metheny, etc.), bluegrass, and classical, mostly symphonic music.

I don't much worry about the audio quality, especially with the live recordings, most of which aren't available elsewhere. I am delighted that such an array of previously unobtainable music is there for me to enjoy. The equipment not adding harshness is a benefit, and sometimes I wish I had a good real-time program EQ like the Cello Palette built into a DAC.
 

eliash

Senior Member
Joined
May 29, 2019
Messages
407
Likes
209
Location
Bavaria, near lake Ammersee
You couldn't. The smallest discernible difference, from memory, is 0.8 to about 1.2 dB in volume depending upon frequency. That is why matching levels by ear isn't enough. Even though you will never hear a 0.25 dB level difference as a level difference, that difference is enough in blind testing to let you hear the louder one as different very reliably.

I used to use passive switched-resistor volume controls, and messed with them quite a bit. 2 dB steps weren't horrible, but they weren't small enough. 1.5 dB was usually good enough that I wouldn't complain much. 1 dB steps were as good as continuously variable volume to me.

I would also have gone to 1.5 dB if the maximum number of available switch positions had been higher than 12...
 

blueone

Major Contributor
Forum Donor
Joined
May 11, 2019
Messages
1,179
Likes
1,495
Location
USA
Answers to those questions were obtained through blind hearing tests, not from physics/medicine. The same tests confirmed that CD quality is fine and that higher sample rates don't really bring any advantage. You can raise the bit depth to 24 to avoid collecting noise during the recording/mastering process, but that's about it. The 16/44.1 format was not chosen "out of the blue". ;)

While I continue to think 16/44.1 is completely adequate for playback, it does seem likely 16/44.1 was chosen by corporate politics. Philips thought the correct answer was 14/44.1 while the Red Book was being written, and their first DAC design, developed prior to Red Book completion so they could lead the market, was 14-bit. I'm not sure why Sony insisted on a 16-bit word depth; the two possibilities seem to be: a) Sony engineers honestly thought 16 bits were necessary, or b) Sony was behind Philips in DAC development in some way and wanted to reset the playing field.
 

Krunok

Major Contributor
Joined
Mar 25, 2018
Messages
4,600
Likes
3,065
Location
Zg, Cro
While I continue to think 16/44.1 is completely adequate for playback, it does seem likely 16/44.1 was chosen by corporate politics. Philips thought the correct answer was 14/44.1 while the Red Book was being written, and their first DAC design, developed prior to Red Book completion so they could lead the market, was 14-bit. I'm not sure why Sony insisted on a 16-bit word depth; the two possibilities seem to be: a) Sony engineers honestly thought 16 bits were necessary, or b) Sony was behind Philips in DAC development in some way and wanted to reset the playing field.

Or c) it is practical to have a number of bits that divides evenly by 8, so it fits nicely into bytes. :D
 

Costia

Member
Joined
Jun 8, 2019
Messages
37
Likes
21
16 bits would be easier to handle on a PC than 14 bits.
It fits perfectly into 2 bytes, which is also the standard "short" integer format.
 

GrimSurfer

Major Contributor
Joined
May 25, 2019
Messages
1,238
Likes
1,484
It didn't become clear to me: which was the faintest level you could hear in your listening position?

I haven't measured it... and I don't think it is all that relevant to any conversation between two human beings.

Whatever measurement I captured as a 50-something year old person who did cold water diving in the Navy (pressure changes are very hard on the ear drums when one is congested or affected by a cold virus) as a young man and warm water surfing as an older man (white noise in the surf zone very hard on ears) would be meaningless to somebody else.

Human hearing is reliable but lacks accuracy and precision. Its response to frequency varies (which raises the question that @andreasmaaan asked: why 400 Hz? This is outside the most sensitive part of the human hearing range). Its sensitivity changes throughout the day and is affected by temperature, humidity, mood, stress levels, etc.

All of this questions the relevance of the kind of listening test that you appear to be performing... Why does this matter to you and what "hard data" are you seeking to capture?
 

blueone

Major Contributor
Forum Donor
Joined
May 11, 2019
Messages
1,179
Likes
1,495
Location
USA
16 bits would be easier to handle on a PC than 14 bits.
It fits perfectly into 2 bytes, which is also the standard "short" integer format.

The first edition of the CD specifications (nicknamed the "Red Book" because it was printed with a red cover) was released in 1980, so I am skeptical that any software-related digital manipulation issues were in their thinking. ;)
 

Krunok

Major Contributor
Joined
Mar 25, 2018
Messages
4,600
Likes
3,065
Location
Zg, Cro
The first edition of the CD specifications (nicknamed the "Red Book" because it was printed with a red cover) was released in 1980, so I am skeptical that any software-related digital manipulation issues were in their thinking. ;)

Digital electronics had been using 8-bit words from the beginning.
 