
Weiss DAC501 Streamer and DAC Review

Rate this DAC/Streamer

  • 1. Poor (headless panther): 161 votes (48.1%)
  • 2. Not terrible (postman panther): 132 votes (39.4%)
  • 3. Fine (happy panther): 29 votes (8.7%)
  • 4. Great (golfing panther): 13 votes (3.9%)

Total voters: 335
I think it would be good for the site's reputation to add an explanation for the measurement issue in the initial post. OK, so this may be the first device ever to show the problem, but it's only a problem in the lab, not real life.

It's unfair to the manufacturer to suggest their device is not working correctly when it's simply a measurement issue...
 
So what exactly is going on? The error is in the AP and not the Weiss? If so, then why does this error occur in some devices but not others? Amir should account for this behavior of the AP in tests going forward.
 
So what exactly is going on? The error is in the AP and not the Weiss? If so, then why does this error occur in some devices but not others?
It is only an AP measurement issue; the root problem is that the USB feed to the DAC, or the DAC itself, has significant latency which the AP is *not* automatically aware of. Contributing factors include ASIO buffer sizes and intrinsic device latency (some DACs have high latency, for example when upsampling with extreme-precision sinc() reconstruction filters). Stuff like that.

The error symptom: when stepping down the levels in the sweep, the AP is not aware of, and does not compensate for, this latency. Sometimes, when the timing falls just between two stable states, it measures the 'stale' signal from the previous step; other times it measures the intended step, in a semi-random fashion.
Amir should account for this behavior of the AP in tests going forward.
I'm sure he will, now that we know what the error pattern looks like. Any immediate level jump in the plot that is exactly the dB step size is a red flag, and one has to play with ASIO buffer sizes and the AP's settling parameters until the plot looks trustworthy. It might well be that with extreme latencies of several seconds the delay can't be compensated anymore...
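The red flag described above, a measured point sitting exactly one step size away from where it should be, can be checked mechanically. A minimal sketch (all function names and the tolerance are my own illustrative choices, not anything from the AP software):

```python
# Sketch: flag suspicious points in a stepped level sweep where the measured
# value is off by exactly the step size, i.e. where an uncompensated latency
# may have caused the analyzer to read the *previous* step's level.

def flag_stale_steps(expected_db, measured_db, step_db, tol_db=0.1):
    """Return indices where the measurement error is ~one full step,
    the signature of reading the stale (previous) level."""
    flagged = []
    for i, (exp, meas) in enumerate(zip(expected_db, measured_db)):
        error = meas - exp
        if abs(abs(error) - step_db) <= tol_db:
            flagged.append(i)
    return flagged

# Example: a 5 dB/step sweep where step 2 was measured one step too high
# (the analyzer sampled before the new level had arrived).
expected = [0.0, -5.0, -10.0, -15.0, -20.0]
measured = [0.0, -5.0, -5.0, -15.0, -20.0]   # index 2 is stale
print(flag_stale_steps(expected, measured, step_db=5.0))  # → [2]
```

A real check would also have to tolerate the DUT's genuine linearity error near the noise floor, which is why a tolerance parameter is there at all.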
 
Yes, if there were a function that automatically measured and corrected the DUT delay before sweeping, I'm sure there would be fewer level jumps.
However, this is not easy if the DUT includes equipment that is not a real-time system (such as a computer); the latency changes significantly each time.

APx500 has a faster step-sweep speed than the AP2700, and version 8 was faster still. (I used it in a demo; I don't own it.)
As sweeps get faster, the problem only becomes harder.

To perform a high-speed step sweep under random latency without mistakes, I think it is necessary to ensure an unambiguous correspondence between the generator signal and the measured signal.
For example, it would be interesting to be able to superimpose/capture a "step number signal" at the beginning of each step signal during sweep measurement.
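The "step number signal" idea above can be illustrated with a toy model: tag each sweep step with an index preamble so the capture side can attribute each measured level to the right step, no matter how long the chain latency is. Everything here (stream representation, marker format) is a hypothetical illustration, not a real analyzer feature:

```python
# Toy model of a self-labeling step sweep: each step carries a step-number
# marker, so decoding is immune to an unknown, variable playback latency.

def make_stream(levels_db, samples_per_step=4):
    """Build a playback stream: each step is an index marker followed by
    its signal samples."""
    stream = []
    for idx, level in enumerate(levels_db):
        stream.append(("MARK", idx))                  # step-number preamble
        stream.extend(("SIG", level) for _ in range(samples_per_step))
    return stream

def measure(stream, latency=3):
    """Decode a latency-shifted capture: levels keyed by decoded step index."""
    delayed = [("SIG", None)] * latency + stream      # unknown device delay
    results, current = {}, None
    for kind, value in delayed:
        if kind == "MARK":
            current = value
        elif current is not None and value is not None:
            results.setdefault(current, value)        # first sample of the step
    return results

levels = [0.0, -5.0, -10.0]
print(measure(make_stream(levels), latency=7))
# → {0: 0.0, 1: -5.0, 2: -10.0}, regardless of the latency value
```

Because the step identity travels with the signal, the decoder never attributes a stale level to the wrong step, which is exactly the failure mode discussed above.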
 
Yep, these days one could pack far more intelligence into the measurement to avoid such problems. The algorithm currently used dates back to the very early days of the AP systems (while still at Tektronix, probably).

As for USB and other playback latencies: they vary from run to run, but once the stream is running the latency never changes. So the key is to keep the USB output stream alive at all times.

The attempt to speed up the sweep makes sense only for production testing. In the lab it makes no difference if it takes 10 seconds or 3 minutes.

------:------

But the overall good news is that the DAC linearity test is superfluous anyway for 99% of DACs, because they are delta-sigma designs, which are inherently linear even way down into the noise floor. R2R DACs are probably the only type that really shows something in the linearity test.

EDIT: Some DACs apply a "hidden" attenuation of, say, 10 dB once the signal toggles only the last 4 bits or so (of 24), to lower the noise floor when near-silent (and to give 'better' noise specs). The Apple dongle is an example. With a fine-tuned linearity test one can sometimes see the transition into that attenuation, which usually involves time constants. It's visible in other tests as well, notably a very-low-level (south of -100 dB) stepped sine spectrum.
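How such hidden attenuation would appear in a linearity plot can be sketched with a toy model. The -90 dB threshold and 10 dB attenuation below are illustrative assumptions loosely following the description above, not measured values for any specific device:

```python
# Toy model: a DAC that quietly attenuates very low-level signals shows up
# in a linearity test as a constant offset below some threshold.

def dac_output_db(input_db, threshold_db=-90.0, hidden_att_db=10.0):
    """Model DAC output level with a hidden low-level attenuation."""
    if input_db < threshold_db:
        return input_db - hidden_att_db
    return input_db

def linearity_error(levels_db):
    """Deviation (dB) of the modeled output from the ideal 1:1 line."""
    return [dac_output_db(lvl) - lvl for lvl in levels_db]

sweep = [-60.0, -80.0, -100.0, -120.0]
print(linearity_error(sweep))  # → [0.0, 0.0, -10.0, -10.0]
```

In a real device the transition is smeared by the time constants mentioned above, so the step in the plot is a ramp rather than a hard edge.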
 
It's a relatively simple problem to solve: just start the measurement with a timing pulse to determine the latency of the system. From there on it's easy going. No silly manual tweaking would be needed.
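The timing-pulse approach can be shown in a few lines: play a known pulse, capture the delayed response, and take the lag of the cross-correlation peak as the latency. A minimal pure-Python sketch (a real analyzer would do this on actual audio I/O, and would use a more robust pulse than this one):

```python
# Sketch: estimate system latency from a known timing pulse by brute-force
# cross-correlation of the reference against the captured recording.

def estimate_latency(reference, captured):
    """Return the lag (in samples) at which `reference` best aligns
    with `captured`."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(captured) - len(reference) + 1):
        score = sum(r * captured[lag + i] for i, r in enumerate(reference))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

pulse = [0.0, 1.0, 0.0]                    # the timing pulse we sent
recording = [0.0] * 5 + pulse + [0.0] * 4  # device delayed it by 5 samples
print(estimate_latency(pulse, recording))  # → 5
```

Once the lag is known, every subsequent sweep step can simply be read out that many samples later, which is the "easy going" part of the proposal.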
 
I would certainly agree. The idea/method, whatever it may be, is conceptually easy.

But I doubt that the AP engineers, assuming they agreed in principle, could easily modify the APx software to accommodate such intelligent latency detection (perhaps as an option), as that would be a significant change, one that triggers the full QA cycle for the software until it is safe for release again, compatible with existing setup presets, etc. They surely have more pressing bugs to handle first...
 
I have a kind of issue here as regards pricing. The kind of buyer who'll spend ten grand on a DAC can obviously afford it, and their sound system wouldn't be seen dead with a $100 Topping or SMSL, let alone a $/£250 model with balanced outs. We can get on our high horses as regards 'value,' but the high end out there (who arguably scoff at ASR) would expect a high-end DAC to cost this much. I can't always get my head around it, but then I'm more an SMSL SU-1 customer than an 'exalted' model such as this one...
 
That would be a technically illiterate (if rich) buyer, but would that same buyer spend £10k if he knew that there was absolutely no difference?
Keith
 
Did anyone here review the sophisticated EQ menus based on Weiss intellectual property? I cannot find those menus on Topping DACs based on off-the-shelf DAC chips and proven reference designs. I guess Amir would still prefer those Topping DACs even after measuring DACs costing some $200k, like this one: https://wadax.eu/reference/
 
Well,

I don't know about you, but apart from the price, to me it looks absolutely hideous.
 
What a sick joke! It even looks ugly!
Matrix stuff, for example, is pricey, but their products look like a form of art. But this! HA
 
Did anyone here review the sophisticated EQ menus based on Weiss intellectual property?
No, and @amirm explained why… in any case, do you think these features are worth several thousand extra?
I cannot find those menus on Topping DACs based on off-the-shelf DAC chips and proven reference designs.
No need, a PC based DSP can do all that is needed just fine.
I guess Amir would even prefer those Topping DACs after having measured DACs costing some $200k, like this one: https://wadax.eu/reference/
Given that you can buy 1000 of these:

[image]

for the same price as a single Wadax, I'd say: yeah! I bet it won't measure as well as the $199 DAC.
 
given that we don't even know how it measures, and probably never will, at least not through ASR
 
We do know it uses TI DACs, though, and they don't have any models with SOTA performance. So one would have to come up with a whole lot of tricks to improve on those by at least another order of magnitude to get even close. Then again, the box is big enough... so who knows ;)
 
That would be a technically illiterate (if rich) buyer, but would that same buyer spend £10k if he knew that there was absolutely no difference?
Keith
Even if the SQ was the same, a $10K DAC has the difference of being exclusive which clearly matters to people.
 
Except that that is never brought up as an argument for buying these things.
 
Did anyone here review the sophisticated EQ menus based on Weiss intellectual property?
It’s a measurement review, not a feature review. If you want a feature review, there are plenty of other sites that offer that—John Darko, for example.
 
Thanks for that.

Now we know 100% for sure that it is a measurement-setup issue, in the form of a rather long chain latency which the AP is not aware of, so it takes its level measurement at the wrong time, that is, too early.

It has absolutely nothing to do with the device under test.

So, @amirm, enough evidence shown now to convince you to update that section of the review to explain the real reason for the failing plot?
Hmmm. The APx knows ASIO very well. The ASIO driver is supposed to report its latency, and that value is also shown in the AP software (left panel). If that value is not correct (too low here, it seems), then I would not say the AP is guilty of anything.

Also: the latency can be modified easily by typing in a different value, within the AP software. That should enable one to measure the sweep correctly.

I wonder whether the latency values that the Thesycon driver reports to the ASIO host are set realistically (that would be a task for Weiss, AFAIK), or are still at some Thesycon defaults that cause the issues here.
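The manual correction mentioned above amounts to shifting the analysis window by the reported latency plus a user-entered offset. A trivial sketch of that arithmetic (names and sample counts are illustrative, not the APx API):

```python
# Sketch: where the measurement window lands after applying the
# driver-reported latency and an optional manual correction (in samples).

def analysis_window(step_start, step_len, reported_latency, manual_correction=0):
    """Shift the measurement window so it lands on the settled signal."""
    start = step_start + reported_latency + manual_correction
    return (start, start + step_len)

# Driver reports 480 samples, but suppose the true chain latency is 2400:
# without correction the window opens 1920 samples too early.
print(analysis_window(0, 1024, 480))        # → (480, 1504)
print(analysis_window(0, 1024, 480, 1920))  # → (2400, 3424)
```

This is why an under-reported ASIO latency produces measurements of the previous step: the window simply opens before the new level has reached the analyzer.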
 