Newer versions of the D10 seem to have lowered the output voltage to deal with inter-sample overs, as Archimago measured 1.5 Vrms, and his unit is more recent than Amir's. Not sure if that's a firmware change or a hardware one.
That is very interesting! Just measured both of mine with a low-end meter and get just over 2v RMS on each channel. 1808xxxxxx serial numbers.
For anyone wanting to check their DAC's RMS voltage output, it's really easy with a multimeter that can measure AC.
Touch the black probe to the ground connection (for unbalanced RCA/phono that's the outer barrel of the connector) and the red probe to the hot/signal (the center pin of the RCA connector). Set the multimeter to measure AC voltage, and pick an appropriate range if it doesn't auto-range.
Create a 0 dBFS sine-wave test tone at 400 Hz. Why not 1 kHz? Because most cheap multimeters are only specified as accurate in roughly the 40-400 Hz band.
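If you don't have a test-tone file handy, it's easy to generate one yourself. Here's a minimal sketch using only the Python standard library; the file name, 48 kHz sample rate, and 5-second duration are my own arbitrary choices:

```python
import math
import struct
import wave

SAMPLE_RATE = 48000
FREQ = 400        # Hz, inside the 40-400 Hz band cheap meters handle well
DURATION = 5      # seconds
PEAK = 32767      # full scale for signed 16-bit PCM, i.e. 0 dBFS peak

# Write a stereo 16-bit WAV with the same full-scale sine on both channels.
with wave.open("tone_400hz_0dbfs.wav", "wb") as w:
    w.setnchannels(2)
    w.setsampwidth(2)
    w.setframerate(SAMPLE_RATE)
    frames = bytearray()
    for n in range(SAMPLE_RATE * DURATION):
        s = int(PEAK * math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE))
        frames += struct.pack("<hh", s, s)  # left, right
    w.writeframes(frames)
```

Play the resulting file through your bit-perfect path as described below.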
Play the test tone to the DAC with no gain changes, DSP, or anything else in the chain - preferably via a bit-perfect playback method (ASIO/WASAPI on Windows).
The measurement on the multimeter shouldn't keep fluctuating; it should settle to a static value after a few seconds (once auto-ranging locks in). You'll get an RMS voltage reading. Most consumer line-level outputs are around 2 V RMS or a little more.
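To put the two readings in this thread in perspective, the gap between ~1.5 V RMS and ~2 V RMS is only about 2.5 dB. A quick sketch to check (the helper function name is mine; the voltages are the ones quoted above):

```python
import math

def db_rel(v_measured, v_reference):
    """Level of v_measured relative to v_reference, in dB (20 * log10 of the ratio)."""
    return 20 * math.log10(v_measured / v_reference)

# Archimago's ~1.5 Vrms newer D10 vs the ~2 Vrms older units:
print(round(db_rel(1.5, 2.0), 2))  # about -2.5 dB
```

So the newer units would sit roughly 2.5 dB quieter at full scale, which is audible as a small level drop but nothing dramatic.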
Archimago's review/test
He used a Focusrite Forte as the ADC to capture the DAC output. I may be missing it, but the article doesn't say how the input level to the ADC was calibrated - how do we know the preamp feeding the ADC wasn't just turned down a bit? I'm not trying to find reasons to dismiss Archimago's results, I just wondered. That said, the levels in his Dragonfly DAC tests seemed about right - so maybe the 1811xxxxxx(?) versions of the D10 have lower output and perhaps inverted polarity, where the earlier ones didn't?