
WiiM Amp Ultra Streaming Amplifier

Rate this streaming amplifier:

  • 1. Poor (headless panther)

    Votes: 4 1.3%
  • 2. Not terrible (postman panther)

    Votes: 8 2.5%
  • 3. Fine (happy panther)

    Votes: 68 21.7%
  • 4. Great (golfing panther)

    Votes: 234 74.5%

  • Total voters
    314
As for the latency issues mentioned: I have used Toslink as well as HDMI ARC with an Apple TV and a 4K Blu-ray player, and I see no sync issues at all. The spoken words line up with the movements on screen.
As for the lack of AirPlay, I really don't care!
 
Do audio reviewers make stuff up? Serious question. I recently watched a video by Darko in which he compares a 75-watt Arcam Class A/B integrated amp to the WiiM Amp Ultra.
He makes some startling claims. He implies that the Arcam is more powerful than the WiiM. Really? Watts are watts: 100 watts > 75 watts. He then goes into detail about how the Arcam is more this and more that than the WiiM.
I know the WiiM compares favorably to my former high-end system of Naim separates. What Darko says baffles me.
Then again, his livelihood depends on his being able to "hear" and describe differences between components. He is either fooling himself or outright making stuff up.
I don't mean to pick on Darko. Other reviewers make dubious claims as well.
 
Yes, audio reviewers make stuff up. At least, many of them do.
When you see or hear them making stupid claims, you know who to ignore in the future.
 
Would adding something like a Dayton 6C microphone to my iPhone improve the RoomFit measurements? Will it equate to better sound through RoomFit?
 
Would adding something like a Dayton 6C microphone to my iPhone improve the RoomFit measurements? Will it equate to better sound through RoomFit?
Adding a mic should give you a more accurate response when you run RoomFit; whether that gives you "better sound" would likely also depend on other factors. It certainly won't make it worse!
I have a Dayton mic, and it did improve my RoomFit result compared to the built-in mic on my old Android phone. Just make sure you calibrate it.
 
Would adding something like a Dayton 6C microphone to my iPhone improve the RoomFit measurements? Will it equate to better sound through RoomFit?
If you were using an Android phone, then almost certainly: there's a lot of variation in mic response between models and manufacturers. With iOS devices it's less certain, as Apple has put some work into keeping the mic response consistent, although a case may interfere with that. I don't think they calibrate them individually, though, so the Dayton should be a bit better.
 
In my experience, ADC and DAC affect passthrough latency minimally in most cases (i.e., just a few ms).
It is the digital buffers that, in my experience, contribute the bulk of the passthrough latency in a digital device, and a buffer would in this case apply equally to any input.

To illustrate, if I look at the spec sheet of my RME Babyface, at 44.1kHz sample rate the ADC adds only 1ms and DAC adds 0.6ms additional latency.
But a 2048 sample buffer adds 46ms on top of that (2048/44.1kHz=46.4ms).
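To make that arithmetic explicit, here is a minimal sketch; the 1ms/0.6ms converter delays are the Babyface spec-sheet figures quoted above, and the buffer sizes are the ones from the example:

```python
def buffer_latency_ms(samples: int, sample_rate_hz: float) -> float:
    """Latency added by a FIFO buffer of `samples` frames at the given sample rate."""
    return samples / sample_rate_hz * 1000.0

adc_ms = 1.0    # ADC group delay at 44.1kHz (Babyface spec sheet)
dac_ms = 0.6    # DAC group delay at 44.1kHz
big = buffer_latency_ms(2048, 44_100)   # the 2048-sample buffer: ~46.4 ms
small = buffer_latency_ms(128, 44_100)  # a typical low-latency setting: ~2.9 ms

print(f"2048-sample total: {adc_ms + dac_ms + big:.1f} ms")   # ~48.0 ms
print(f"128-sample total:  {adc_ms + dac_ms + small:.1f} ms")  # ~4.5 ms
```

The takeaway matches the post: at these rates the converters themselves are a rounding error, and the buffer dominates.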
That's a humongous buffer size. Why choose such a big one? One of the selling points of professional interfaces is ultra-low latency. Even 20 years ago, 1-2ms (a 128-sample buffer) was perfectly doable. You need that when the computer is supposed to be a real-time audio processor and instrument for live and jamming studio applications. You can't have the keyboardist standing there with a laptop running his instruments and have it react 50ms late to his note input from the keyboard!

In that light, I really wonder what WiiM means when they say "the 50ms is because of the ADC". What converter takes more than a ms? The rest is implementation to achieve the lowest possible system latency, which, again, is what you pay for in professional interfaces.

Weird stuff. I don't really get it.
 
That's a humongous buffer size. Why choose such a big one? One of the selling points of professional interfaces is ultra-low latency. Even 20 years ago, 1-2ms (a 128-sample buffer) was perfectly doable. You need that when the computer is supposed to be a real-time audio processor and instrument for live and jamming studio applications. You can't have the keyboardist standing there with a laptop running his instruments and have it react 50ms late to his note input from the keyboard!

In that light, I really wonder what WiiM means when they say "the 50ms is because of the ADC". What converter takes more than a ms? The rest is implementation to achieve the lowest possible system latency, which, again, is what you pay for in professional interfaces.

Weird stuff. I don't really get it.
50ms end to end is unfathomable.

Even the old Dan Lavry products, where they traded latency for much better filter performance, were only 7ms in the worst case. These days, due to changes in DAC topology, they can do the equivalent in ~450µs at 48k and ~250µs at 96k.
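As a sanity check, those quoted delays convert into sample periods as follows (a quick sketch, assuming the stated sample rates):

```python
def latency_samples(latency_s: float, sample_rate_hz: float) -> float:
    """Number of sample periods a given delay corresponds to."""
    return latency_s * sample_rate_hz

print(latency_samples(450e-6, 48_000))  # ~21.6 samples at 48 kHz
print(latency_samples(250e-6, 96_000))  # 24.0 samples at 96 kHz
print(latency_samples(7e-3, 48_000))    # the old 7 ms worst case: 336 samples
```

So even the "slow" legacy filters were a few hundred samples of delay, i.e. nothing remotely close to a 50ms budget.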
 
This is not an audio interface. This device is not designed for DJ or (home) studio use. It's an all-in-one Network stereo device for home use.

The existing latency is not a problem, including HDMI-ARC usage. Features like uninterrupted streaming, subwoofer management, room correction and EQ are much more important in this environment than latency.

If the WiiM Amp Ultra is not suited for your intended use case, by all means go and get something else. Chances are, none of the WiiM products are made for you, and that should be OK with all parties involved. No need to be personally upset. :)

Did I say none of the WiiM products? Well, maybe except for the WiiM Wake-up Light.

 
That's a humongous buffer size. Why choose such a big one? One of the selling points of professional interfaces is ultra-low latency. Even 20 years ago, 1-2ms (a 128-sample buffer) was perfectly doable. You need that when the computer is supposed to be a real-time audio processor and instrument for live and jamming studio applications. You can't have the keyboardist standing there with a laptop running his instruments and have it react 50ms late to his note input from the keyboard!
That buffer size was just an example, selected to get a number close to the WiiM's latency.

But in general, why wouldn't we choose a big buffer size if we don't need real-time monitoring through the DAW?
When I'm recording, I do all my monitoring through hardware, so I don't need low buffer sizes in my audio interface. I still have close to zero effective latency while having a much larger safety margin against audio dropouts.
Buffer size is configurable in professional interfaces because not everyone has the same needs or environment.
In that light, I really wonder what WiiM means when they say "the 50ms is because of the ADC". What converter takes more than a ms? The rest is implementation to achieve the lowest possible system latency, which, again, is what you pay for in professional interfaces.

Weird stuff. I don't really get it.
50ms latency is simply not an issue for non-real-time applications that WiiM devices are designed for.
It would however be an issue if you tried to play a musical instrument through it, and it might be an issue for gaming.
 
50ms latency is simply not an issue for non-real-time applications that WiiM devices are designed for.
It would however be an issue if you tried to play a musical instrument through it, and it might be an issue for gaming.

As a "living room product", shouldn't gaming be a fairly high priority? Obviously, music and movie/TV watching are more important, but gaming is, I think, a fairly common use for these kinds of systems?

(Now, perhaps 50ms isn't an issue there either, but it still feels a bit strange not to optimise the product for gaming in this day and age.)
 
This is not an audio interface. This device is not designed for DJ or (home) studio use. It's an all-in-one Network stereo device for home use.

The existing latency is not a problem, including HDMI-ARC usage. Features like uninterrupted streaming, subwoofer management, room correction and EQ are much more important in this environment than latency.

If the WiiM Amp Ultra is not suited for your intended use case, by all means go and get something else. Chances are, none of the WiiM products are made for you, and that should be OK with all parties involved. No need to be personally upset. :)

Did I say none of the WiiM products? Well, maybe except for the WiiM Wake-up Light.

A 50 ms processing latency is *absolutely* an issue for lip sync when a TV is outputting via ARC to one of these things.

Technically, HDMI is supposed to be able to provide latency information over the link, at least about the video processing delay; in practice, most implementations don't provide the information correctly or act upon it correctly.

There is absolutely no reason for the end-to-end processing latency, including room correction, to be this high. For example, the worst-case latency through miniDSP products, which are doing far more processing than this thing, is around 20ms, and it's usually lower.

You might not be sensitive to lip sync errors but you cannot unilaterally declare it’s not a problem for people who are such as me and the original person complaining.
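For scale, it can help to express those latencies in video frames (a hypothetical helper; the frame rates below are common video rates, not figures from the post):

```python
def frames_of_delay(latency_ms: float, fps: float) -> float:
    """How many video frames a given audio latency spans."""
    return latency_ms / (1000.0 / fps)

for fps in (24.0, 50.0, 60.0):
    print(f"50 ms at {fps:g} fps = {frames_of_delay(50.0, fps):.2f} frames")
    print(f"20 ms at {fps:g} fps = {frames_of_delay(20.0, fps):.2f} frames")
```

At 24 fps film, 50ms of audio delay already exceeds a full frame, while the ~20ms miniDSP figure stays under half a frame, which is one way to see why some viewers notice the difference.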
 
As a "living room product", shouldn't gaming be a fairly high priority? Obviously, music and movie/TV watching are more important, but gaming is, I think, a fairly common use for these kinds of systems?

(Now, perhaps 50ms isn't an issue there either, but it still feels a bit strange not to optimise the product for gaming in this day and age.)
I agree, better gaming support might be a good reason to optimise latency. We can only guess why this wasn't a priority for WiiM up to now.
 
Technically, HDMI is supposed to be able to provide latency information over the link, at least about the video processing delay; in practice, most implementations don't provide the information correctly or act upon it correctly.
IIRC that's only in eARC.
 
You might not be sensitive to lip sync errors but you cannot unilaterally declare it’s not a problem for people who are such as me and the original person complaining.
I'm extremely sensitive to lip sync errors, and when I had the WiiM Amp Ultra in my main living room setup there were absolutely zero issues, at least with a Fire TV Stick 4K Max.

I'm also not into gaming. The Xbox is for playing Blu-rays.
 
@harkpabst it's quite possible you haven't experienced noticeable lip-sync issues because your TV has 50ms+ of video processing. This would minimize the difference between the audio and video.

If you have a modern OLED TV, the gaming mode can have under 10ms video processing... and you're therefore closer to bothersome de-sync with the audio. Just because you don't game doesn't mean this problem doesn't exist. If this could be optimized in firmware then everybody would be happier.
 
I've been using this amplifier for a few weeks and I completely agree: technically it's flawless, the UI is very functional and frequently updated, and RoomFit corrects bass modes well enough. If only it weren't for that sound... it's just another of countless amplifiers based on the TPA3255 chip: cheap, easy to implement, and excellent in measurements, always praised by Amir. But to me, they sound uninteresting. Dry, clinical, unmusical. I have transparent Fyne Audio F501SP speakers with quite high sensitivity, but this WiiM seemed a bit too weak for them; the sound was quite flat, and the 'holography' was moderate. Vocals and acoustic instruments lacked saturation and subtlety.

With relief, I returned to SMSL amplifiers based on Infineon chips (like the SMSL RAW-HA1), and then to the Sabaj A30a, which, despite its slightly lower power output, drove the Fynes much better, and the sound was much more musical (although this is my subjective opinion, friends who listened to both systems agreed with it). I connected a WiiM Pro to the Sabaj, which gives me the same functionality, and the sound is much more satisfying.

I hope a more musical competitor to the TPA3255 will eventually emerge, breaking the chain of similarly mass-produced amplifiers based on this chip. It's a shame so few use Infineon or (I know, quite expensive) Purifi modules.
I find it surprising the Wiim Amp Ultra did not properly drive your Fynes. I am using one to drive Fyne F303i speakers very well. My Fynes have the same sensitivity as your Fynes. Curious...
 
IIRC that's only in eARC.
HDMI has had an optional static 'Latency Indication Protocol' (LIP) all the way back to 1.3. The issue is that most TVs that supported it just reported zero; those that did report a number frequently had different processing delays for different video types, and there was no way of renegotiating the latency, so the feature was essentially useless.

HDMI 2.0 updated LIP to be dynamic and renegotiated when video formats change; however, for standard data flows it's still optional.

The difference between ARC and eARC is that implementing dynamic lip-sync correction is mandatory in eARC.
 
I'm extremely sensitive to lip sync errors, and when I had the WiiM Amp Ultra in my main living room setup there were absolutely zero issues, at least with a Fire TV Stick 4K Max.

I'm also not into gaming. The Xbox is for playing Blu-rays.
Good for you. I can detect sync delays under 20ms.

The reason WiiM thinks this delay is acceptable is that ATSC standards allow the audio to lag the video by up to 125ms. This is partially based on research showing that many people won't even notice that delay, and partially because historically achieving good sync was hard. Interestingly, people are much less tolerant of audio leading the video; the ATSC spec there is 45ms.

These days there is absolutely no excuse not to have sub-frame synchronisation between audio and video.

It’s as bad as PAL movies that were just sped up by 4% from 24 to 25fps including speeding up the audio causing a pitch shift for those of us with perfect pitch these discs are unwatchable.
 
@harkpabst it's quite possible you haven't experienced noticeable lip-sync issues because your TV has 50ms+ of video processing. This would minimize the difference between the audio and video.

If you have a modern OLED TV, the gaming mode can have under 10ms video processing... and you're therefore closer to bothersome de-sync with the audio. Just because you don't game doesn't mean this problem doesn't exist. If this could be optimized in firmware then everybody would be happier.
That's perfectly possible.

Good for you. I can detect sync delays under 20ms.
You don't know the actual delay in my setup. See above. Don't twist the facts.
 