
miniDSP Tide16 - Holy Grail with 16 Channel Atmos/DTS:X, high SINAD

Does it support 4k120FPS?
No, but you already know that, don't you? You haven’t answered yet, so I’ll ask again: Do you need 4K at 120 FPS?
Edit: Also, if Denon is able to squeeze HDMI 2.1 into a 500 € AVR, miniDSP should be able to do so too. HDMI 2.1 has been out for what, 3 years now?
You’re ignoring the business realities. Denon (which is part of Harman, a subsidiary of Samsung) is in a completely different position to bring HDMI 2.1 even to its entry-level products. miniDSP’s position is explained here: https://support.minidsp.com/support...84706-understanding-hdmi-2-1-vs-2-0-on-tide16
 
It's not a bug, but it's a "feature" that mainly benefits the manufacturer, since it saves them money on hardware.
No, keeping everything at 48kHz greatly improves system stability by eliminating many of the troublesome edge cases that tend to cause major uproars on forums like this one.
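To make the fixed-rate idea concrete, here is a minimal pure-Python sketch (my own illustration, not any vendor's actual implementation): a fixed 48kHz pipeline converts every source once at the input, so the DSP chain never has to renegotiate clocks when the incoming format changes.

```python
# Sketch of a fixed-rate front end: every source is converted to one
# pipeline rate at the input, so downstream DSP never re-clocks when the
# incoming format changes. (Illustration only.)
from fractions import Fraction

PIPELINE_RATE = 48_000  # the single rate the whole pipeline runs at

def resampling_ratio(source_rate: int) -> Fraction:
    """Rational up/down ratio a polyphase resampler would use."""
    return Fraction(PIPELINE_RATE, source_rate)

for rate in (44_100, 48_000, 88_200, 96_000, 192_000):
    r = resampling_ratio(rate)
    print(f"{rate:7d} Hz -> 48 kHz: up by {r.numerator}, down by {r.denominator}")
```

Everything after that single conversion step sees one rate, which is exactly the "fewer edge cases" argument.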
 
- Startup time - 10 seconds or less is fine. Close to a minute is ANNOYING.
About 30s.
- Upgradability - An HDMI upgrade option would be nice.
The design is quite modular, but whether there will be a hardware upgrade depends on many other factors. The current implementation isn’t even on sale yet.
- How does it sound? - Sounds like a question for Spock. :)
As good as the recording, the room, and your other equipment allow.
 
About 30s.

The design is quite modular, but whether there will be a hardware upgrade depends on many other factors. The current implementation isn’t even on sale yet.

As good as the recording, the room, and your other equipment allow.
Right - with so many unknowns probably best to wait and see what this actually can and can't do?
 
I do agree that if you are paying the prices asked for some of these processors, you would think they could process at any sample rate, which leads me to believe the issue may be latency. For some types of filters, latency is inherent and cannot be reduced even with infinite processing power, but it can be reduced by using lower sampling rates.
I believe that the latency created by the filters is a function of the frequency resolution of the filter (10Hz, for example).
The latency is intrinsic to that resolution, rather than to the sample rate. If the sample rate is higher, the samples simply have to be processed proportionately faster.
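A quick back-of-the-envelope check of this, as a sketch assuming a linear-phase FIR filter (length ≈ sample rate ÷ resolution, group delay of half the length):

```python
# Latency of a linear-phase FIR filter versus sample rate, for a fixed
# frequency resolution. Assumes taps ~= fs / resolution and a group
# delay of (taps - 1) / 2 samples, i.e. a symmetric FIR.

def fir_latency_seconds(sample_rate_hz: float, resolution_hz: float) -> float:
    taps = sample_rate_hz / resolution_hz        # filter length required
    group_delay_samples = (taps - 1) / 2         # linear-phase group delay
    return group_delay_samples / sample_rate_hz  # delay in seconds

for fs in (48_000, 96_000):
    ms = fir_latency_seconds(fs, 10.0) * 1000    # 10 Hz, as in the post
    print(f"{fs:6d} Hz sample rate -> ~{ms:.1f} ms latency")
# Both rates land at ~50 ms = 1 / (2 * 10 Hz): the delay tracks the
# resolution, not the sample rate.
```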
 
No, but you already know that, don't you? You haven’t answered yet, so I’ll ask again: Do you need 4K at 120 FPS?

You’re ignoring the business realities. Denon (which is part of Harman, a subsidiary of Samsung) is in a completely different position to bring HDMI 2.1 even to its entry-level products. miniDSP’s position is explained here: https://support.minidsp.com/support...84706-understanding-hdmi-2-1-vs-2-0-on-tide16
SoundUnited didn’t get it properly right until 2024 after 3 years of issues. Nuvoton were meant to ship a working 48Gbit/s switch chip in 2025 but I don’t know if they did or not.

TVs are different as they use SoCs like

 
No, keeping everything at 48kHz greatly improves system stability by eliminating many of the troublesome edge cases that tend to cause major uproars on forums like this one.
That's the best excuse Storm can offer. The hard truth is that it's a hardware limitation. 48kHz is not a high sample rate for 21st century electronics. Video processing???

Digging deeper, I think Storm quite likely designed their hardware to run at 96kHz. However, when they started developing the software and the filters, I expect they found they could get better room correction by using longer filters that consumed more DSP resource, which constrained them to a lower sample rate. They probably judged that the benefits of the improved digital room correction outweighed the benefits of the higher sample rate (not that there aren't any benefits to the latter). 48kHz was probably adopted because it's commonly adopted for most immersive material, and it's probably good enough for most people.
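The trade-off being suggested here is easy to sketch with arithmetic. A DSP has a fixed multiply-accumulate budget per second, and an FIR channel costs taps × sample rate MACs/s, so halving the rate doubles the affordable filter length and, since resolution ≈ rate ÷ taps, gives four times finer frequency resolution. The numbers below are purely hypothetical, not Storm's actual specs:

```python
# Illustrative DSP budget arithmetic (hypothetical numbers, not Storm's
# actual specs): an FIR filter costs taps * sample_rate multiply-
# accumulates per second, so a fixed budget buys twice the taps, and
# four times finer frequency resolution, at half the sample rate.

MAC_BUDGET_PER_CHANNEL = 500_000_000  # hypothetical MACs/s per channel

def affordable_taps(sample_rate_hz: int) -> int:
    return MAC_BUDGET_PER_CHANNEL // sample_rate_hz

for fs in (48_000, 96_000):
    taps = affordable_taps(fs)
    resolution = fs / taps  # approximate frequency resolution in Hz
    print(f"{fs:6d} Hz: {taps} taps, ~{resolution:.1f} Hz resolution")
```

Finer low-frequency resolution is exactly where room correction earns its keep, which would make the choice of 48kHz over 96kHz a rational one.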
 
About 30s.

The design is quite modular, but whether there will be a hardware upgrade depends on many other factors. The current implementation isn’t even on sale yet.

As good as the recording, the room, and your other equipment allow.
Is there a fast start option? I've been accepting the RMC-1 ersatz off for FAF (Family Approval Factor). :p

- Rich
 
SoundUnited didn’t get it properly right until 2024 after 3 years of issues. Nuvoton were meant to ship a working 48Gbit/s switch chip in 2025 but I don’t know if they did or not.

TVs are different as they use SoCs like


The AV20 may be using older chips, or they may have mixed parts out there, because QMS-VRR is not working.

- Rich
 
That's the best excuse Storm can offer. The hard truth is that it's a hardware limitation. 48kHz is not a high sample rate for 21st century electronics. Video processing???

Digging deeper, I think Storm quite likely designed their hardware to run at 96kHz. However, when they started developing the software and the filters, I expect they found they could get better room correction by using longer filters that consumed more DSP resource, which constrained them to a lower sample rate. They probably judged that the benefits of the improved digital room correction outweighed the benefits of the higher sample rate (not that there aren't any benefits to the latter). 48kHz was probably adopted because it's commonly adopted for most immersive material, and it's probably good enough for most people.
Speculating about what Storm did, and why, is just that: speculation. They had to choose a way to implement more ART filters as they have more channels, but on a per-channel basis there has been no report that their filters are "better". Latency in audio processing is a hard wall, so we will need smarter chips and algorithms to go beyond it.

BTW, I am firmly in the 48kHz camp. It's good enough, and I don't suspect there is actually meaningful demand for 96kHz, so I doubt it would ever be developed by the major brands. For God's sake, most people don't even understand what a sampling rate is.
 
No, but you already know that, don't you? You haven’t answered yet, so I’ll ask again: Do you need 4K at 120 FPS?

You’re ignoring the business realities. Denon (which is part of Harman, a subsidiary of Samsung) is in a completely different position to bring HDMI 2.1 even to its entry-level products. miniDSP’s position is explained here: https://support.minidsp.com/support...84706-understanding-hdmi-2-1-vs-2-0-on-tide16

I am guessing you don’t play video games? 4K 120Hz is absolutely a minimum requirement if you have a beefy gaming PC or a PS5 Pro. There is nothing controversial about wanting 4K 120Hz when it has been industry standard for many years now.
 
I am guessing you don’t play video games? 4K 120Hz is absolutely a minimum requirement if you have a beefy gaming PC or a PS5 Pro. There is nothing controversial about wanting 4K 120Hz when it has been industry standard for many years now.
Agree. 4K at 144 or 165Hz is what TVs are now moving towards. Unfortunately, eARC will be the way to go, especially because things like G-Sync are never going to work through a processor.
 
That's the best excuse Storm can offer. The hard truth is that it's a hardware limitation. 48kHz is not a high sample rate for 21st century electronics. Video processing???

Digging deeper, I think Storm quite likely designed their hardware to run at 96kHz. However, when they started developing the software and the filters, I expect they found they could get better room correction by using longer filters that consumed more DSP resource, which constrained them to a lower sample rate. They probably judged that the benefits of the improved digital room correction outweighed the benefits of the higher sample rate (not that there aren't any benefits to the latter). 48kHz was probably adopted because it's commonly adopted for most immersive material, and it's probably good enough for most people.
It’s not an excuse, it’s actually a real advantage in terms of how reliably and error-free the system operates. In practice, all movie audio is delivered at 48kHz anyway. The system could just as well run at 96kHz, provided the entire pipeline is fixed at that rate. However, 96kHz offers no real benefit; it simply generates a much larger amount of data, which demands more powerful hardware without improving performance. If anything, it’s more beneficial to increase the precision of the system’s calculations (i.e. bit depth) rather than raising the sample rate.
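The data-rate part of this argument is easy to quantify. A sketch with assumed figures (16 channels, 32-bit internal samples), not any specific processor's internals:

```python
# Raw audio data rate of a multichannel pipeline: channels * rate * bytes
# per sample. 16 channels and 32-bit samples are assumptions chosen for
# illustration, not any specific processor's internals.

def pipeline_mb_per_second(channels: int, sample_rate_hz: int,
                           bits_per_sample: int) -> float:
    return channels * sample_rate_hz * (bits_per_sample / 8) / 1e6

mb_48k = pipeline_mb_per_second(16, 48_000, 32)
mb_96k = pipeline_mb_per_second(16, 96_000, 32)
print(f"48 kHz: {mb_48k:.2f} MB/s, 96 kHz: {mb_96k:.2f} MB/s")
# Doubling the sample rate doubles the data moved every second (and
# roughly doubles the FIR work), with no new audible content in typical
# 48 kHz movie sources to show for it.
```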
 
Is there a fast start option? I've been accepting the RMC-1 ersatz off for FAF (Family Approval Factor). :p

- Rich
Yes but the firmware is evolving rapidly, so it’s difficult to make any definitive statements at this stage.
 
I am guessing you don’t play video games? 4K 120Hz is absolutely a minimum requirement if you have a beefy gaming PC or a PS5 Pro. There is nothing controversial about wanting 4K 120Hz when it has been industry standard for many years now.
My gaming days are behind me. With all the AV processing in a home theater system, the added latency wouldn’t make it ideal for gaming anyway. In any case, you can simply connect your console directly to the TV and send the audio back via eARC.
 
My gaming days are behind me. With all the AV processing in a home theater system, the added latency wouldn’t make it ideal for gaming anyway. In any case, you can simply connect your console directly to the TV and send the audio back via eARC.
yup that’s my plan now, very excited for the release.
 
No, but you already know that, don't you? You haven’t answered yet, so I’ll ask again: Do you need 4K at 120 FPS?

You’re ignoring the business realities. Denon (which is part of Harman, a subsidiary of Samsung) is in a completely different position to bring HDMI 2.1 even to its entry-level products. miniDSP’s position is explained here: https://support.minidsp.com/support...84706-understanding-hdmi-2-1-vs-2-0-on-tide16
Anyone who owns a PC, PS5, or Xbox needs 4K 120FPS.

Edit: Oh, I see this was already discussed above; yes, that's the point: gaming. I think a LOT of people just wire their TVs to the PC and play with wireless controllers nowadays.

BUT I just read miniDSP's statement (which is nice, that they address it), and the HDFury Arcana2 thingy seems to solve many problems, if not all.
 
It’s not an excuse, it’s actually a real advantage in terms of how reliably and error-free the system operates. In practice, all movie audio is delivered at 48kHz anyway. The system could just as well run at 96kHz, provided the entire pipeline is fixed at that rate.
Of course it's just spin. Their only rival can run at the native rate, and they can't; how else do you justify your position? I've been the person on the inside of high-technology electronics R&D programmes many times. When everything isn't going right, you have one story on the inside and one story on the outside. This is what that looks like.
 
The AV20 may be using older chips, or they may have mixed parts out there, because QMS-VRR is not working.

- Rich
That’s because the HDMI Forum changed the spec for QMS through switches in a backwards-incompatible way between 2.1 and 2.1a, and Nuvoton, at least, have either not yet updated their switch chips or have only just done so.

Thus for any vendor that uses Nuvoton it will be 2027 products before this works properly.
 
That’s because the HDMI Forum changed the spec for QMS through switches in a backwards-incompatible way between 2.1 and 2.1a, and Nuvoton, at least, have either not yet updated their switch chips or have only just done so.

Thus for any vendor that uses Nuvoton it will be 2027 products before this works properly.

The Emotiva RMC-1+, updated in 2025, uses newer Nuvoton chips and QMS-VRR is working with my LG G5 and Apple TV4K.

Marantz has access to chips that work. It could well be that they will not enable it until inventory is exhausted and it can appear in the top-end products and/or a new model where all shipping versions can support it.

- Rich
 