R2R DACs are inherently capable of greater precision: they receive the 16 bits and leave them untouched. That is truly bit-perfect.
This article explains why delta-sigma delivers less than 16-bit resolution in practice:
http://www.mother-of-tone.com/conversion.htm
A quote:
: sigma-delta DACs are coarse noise-generators and when measured the way they should be measured they never make it to 16-bit resolution, don't even think about 24 bits.
1000x the frequency doesn't mean the same precision as 16 bits. 16 bits means 65,536 levels, so to be bit-perfect at 20 kHz with a one-bit DAC you would need a sample rate of 20,000 * 65,536 Hz. I think noise shaping and dithering are the tricks that keep the frequency lower, but with a multibit DAC you don't need any trick to get the 16-bit precision of CD. So you are sure all the bits are there, untouched; the bits are exploited directly.
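Just to make the arithmetic above concrete, here is a back-of-envelope sketch. It only reproduces the naive reasoning in this post (one 1-bit step per level per cycle), not how a real delta-sigma modulator is designed; the DSD64 rate is used as a familiar comparison point:

```python
# Back-of-envelope arithmetic: how fast would a naive 1-bit DAC have to
# run to match 16-bit precision at 20 kHz *without* noise shaping or
# dithering, following the reasoning in this post?

levels_16bit = 2 ** 16          # 65,536 discrete levels in 16-bit PCM
audio_bandwidth_hz = 20_000     # top of the audible band

# Naive requirement: one 1-bit step per level per cycle of the
# highest audio frequency.
naive_rate_hz = audio_bandwidth_hz * levels_16bit
print(f"naive 1-bit rate: {naive_rate_hz / 1e9:.3f} GHz")   # ~1.311 GHz

# Real 1-bit converters run far slower, e.g. DSD64 (64 x 44.1 kHz):
dsd64_rate_hz = 64 * 44_100
print(f"DSD64 rate:       {dsd64_rate_hz / 1e6:.4f} MHz")   # 2.8224 MHz

# The gap between the two is what noise shaping and dithering
# have to make up.
print(f"gap: {naive_rate_hz / dsd64_rate_hz:.0f}x")         # ~464x
```

So even at DSD64 speeds, a 1-bit converter is roughly 464 times short of the naive requirement, which is exactly why those tricks are unavoidable in a delta-sigma design.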
I found more articles on the internet a while ago, but it seems I can't find them on Google anymore, which is very strange. It seems a lot of people want us to believe delta-sigma DACs are better with no drawbacks, and they give no arguments other than simple tests that are not fair: they never test with real dynamics. Every topology has drawbacks, and the drawback of delta-sigma is precision. Maybe newer chipsets are better in that respect, but no test can show it. We would like to see tests that show differences in precision on complex signals between DACs. The output stage also matters for precision. But the published tests only show that there is no noise; they don't show whether complex signals are simplified. In the end you find plenty of people who have listened to many DACs and always end up keeping a multibit one. That means something about delta-sigma is unpleasant and the tests don't show it. To my mind, and to my ears, it is due to a lack of precision.
So please stop giving strange arguments like "1000x the frequency means the same precision as multibit." It's too simplistic and doesn't prove anything.
@andreasmaaan I think the link might interest you.