
miniDSP Tide16 - Holy Grail with 16 Channel Atmos/DTS:X, high SINAD

Throwing my hat in the ring to describe my use case and ramble. My current setup uses the Flex HTx, which is wonderful, but I have no surround and too few channels even for stereo: it's a stereo 4-way plus rumblers on the seat, with their own compression/filters daisy-chained off the Flex's L/R woofer outputs.

I foresee the Tide, as a pre/pro/XO doing a few things for me:
  • Eliminating the rumbler daisy-chain and the associated secondary DSP box
  • Center channel, prob 2 way active
  • L/R Surrounds
  • Adding 2x spatial channels or going active 2-way with the surrounds
My experience with the Flex has been massively positive and I'm very excited at the prospect of reincorporating surround in my space. Even some simple DSP-derived surround at a low level will help my asymmetrical room sound more consistent for stereo. That's the biggest failing in my space: you can "hear" that one side is open. Horns and multisub mitigate it, but ultimately it's wall vs. no wall, and having more system-sourced filler will reduce that effect when managed/EQ'd correctly.

Most users here seem to prefer speaker systems with integral XOs, so my massively multiway active use case isn't the norm around here. But I think that's what makes this unit so special: a broad toolkit with very transparent electronics (the low noise is absolutely necessary for my setup) facilitates a setup like mine, where low noise and flexible channel routing with modern codecs are the value add. 5.2.2 is about as much as this space could tolerate.
Off subject: I got a Flex HTx a couple of months ago. I am really impressed with what it can do. BUT it was a steep learning curve... which I am probably halfway up! :)
 
"Atmos" in the context of a bitstream refers to TrueHD or DD+ with a substream containing the spatial data needed to derive the height channels. This is not what the Apple TV outputs: the Apple TV decodes the bitstream to PCM, retaining the spatial information via metadata.

This is done so that the Apple TV can insert its own audio effects and the like. What gets passed on to the Tide16 is not the bitstream but the Apple TV's derived PCM signal.

Personally, I see nothing wrong with this; I have no issues with the output of my Apple TV to my AVR, and I don't think @-Matt- needs to be concerned.
I think you’re both saying the same thing. Dolby Atmos uses spatial encoding to “hide” audio objects in a limited number of channels, while accompanying metadata tells the decoder (the AVR) how to render those objects during playback.

Apple TV converts Dolby Digital Plus into PCM, then sends the PCM along with the metadata to the AVR using Dolby MAT. In other words, the AVR receives a “pre-digested” Atmos signal, allowing it to skip the Dolby Digital Plus decoding step entirely.

Dolby intentionally abstracts all of this from the user, presenting it simply as “Atmos.” That’s not especially informative, but it’s clearly a marketing decision.

Source:
https://professional.dolby.com/site...on/dolby-atmos/dolby_atmos_renderer_guide.pdf
 
Ok, LPCM in a MAT container is not bitstream.

I have an Apple TV hooked up to a Denon 4800h. Atmos on Apple Music sounds great and it’s decoded perfectly by the Denon.
As I thought, you can get some sound out (and maybe it even sounds great) but the AppleTV is doing the decoding (at least into the bed channels). It is this lack of clarity about the AppleTV's processing chain that really annoys me.

Personally, I see nothing wrong with this; I have no issues with the output of my Apple TV to my AVR, and I don't think @-Matt- needs to be concerned.
Good for you. It probably works fine for most people but, personally, I just will never accept this approach of decoding the bed channels to LPCM before sending to the AVR. (Apple lost my trust when they messed up the channel orders output from the first gen AppleTV).

I have concerns about how the decoding is done when your speaker config doesn't exactly match the source material. For example if...

Source is 7.1
Speakers are 5.1
AppleTV decodes to 7.1 LPCM

The problem is that because the AVR is receiving the audio as LPCM, it does not attempt to downmix and just discards the surround-back channels.

If the AVR instead received the original Dolby bitstream, it could process it properly (because it knows that there are only 5.1 speakers) and downmix the surround-back signals into the surrounds.

Note, I don't know for sure that today AppleTV handles this wrong; I would just prefer to have the option of bitstream to avoid all doubt.
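To make the concern concrete, here's a minimal sketch of a 7.1-to-5.1 fold-down that sums the surround-back channels into the side surrounds. The -3 dB coefficient and the channel order are illustrative assumptions; a real decoder takes the downmix gains from the stream's metadata:

```python
def downmix_71_to_51(frame, back_gain_db=-3.0):
    """Fold one 7.1 sample frame [L, R, C, LFE, Ls, Rs, Lb, Rb]
    down to 5.1 [L, R, C, LFE, Ls, Rs] by mixing the back
    surrounds into the side surrounds at back_gain_db."""
    g = 10 ** (back_gain_db / 20)  # dB -> linear gain
    L, R, C, LFE, Ls, Rs, Lb, Rb = frame
    return [L, R, C, LFE, Ls + g * Lb, Rs + g * Rb]

# Content that exists only in the back surrounds survives the
# downmix instead of being discarded along with the channels:
print(downmix_71_to_51([0, 0, 0, 0, 0.5, 0.5, 1.0, 1.0]))
```

An AVR that simply drops Lb/Rb would leave the side surrounds at 0.5 here; a proper downmix keeps the back-surround content audible in them.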
 
The DD+ core mix is either 5.1 or 7.1. If it’s 7.1, either the Apple TV or the AVR will downmix it as needed, with no loss of information.
 
How does the AppleTV know that you only have 5.1 speakers and not 7.1? This is why I asked above what settings there are in the AppleTV to define the speaker setup.

If there are no such settings, then the AppleTV likely just decodes to the number of channels in the source, which might prevent some AVRs from doing the downmix correctly.
 
HDMI EDID
...which is buggy as hell; in many cases it can be blocked or spoofed by other devices in the HDMI path, and it can even be manually edited. (The fix for the incorrect channel order on the first-gen AppleTV was to manually edit the EDID.) I'm still scarred from the experience, and it's why I won't accept anything but proper bitstream.
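For anyone curious what actually gets spoofed: the sink's audio capabilities live in 3-byte Short Audio Descriptors (SADs) in the EDID's CEA-861 extension block, and these are exactly the bytes a device in the HDMI path can block or rewrite. A minimal decoder, with the format table abbreviated, looks roughly like this:

```python
# Decode a CEA-861 Short Audio Descriptor (SAD) from an EDID
# audio data block; each SAD is 3 bytes.
FORMATS = {1: "LPCM", 2: "AC-3", 7: "DTS", 10: "E-AC-3", 12: "MLP/TrueHD"}
RATES_KHZ = [32, 44.1, 48, 88.2, 96, 176.4, 192]  # bits 0..6 of byte 1

def decode_sad(sad):
    fmt = (sad[0] >> 3) & 0x0F          # bits 6..3: audio format code
    info = {
        "format": FORMATS.get(fmt, f"code {fmt}"),
        "max_channels": (sad[0] & 0x07) + 1,  # bits 2..0: channels - 1
        "sample_rates_khz": [r for i, r in enumerate(RATES_KHZ)
                             if sad[1] & (1 << i)],
    }
    if fmt == 1:  # for LPCM, byte 2 lists supported bit depths
        info["bit_depths"] = [d for i, d in enumerate([16, 20, 24])
                              if sad[2] & (1 << i)]
    return info

# Example SAD: 8-channel LPCM, all rates up to 192 kHz, 16/24-bit
print(decode_sad(bytes([0x0F, 0x7F, 0x05])))
```

Spoofing or editing an EDID means changing bytes like these, which is why the source can silently end up with a wrong picture of what the sink supports.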

Anyway sorry for being so off topic of the MiniDSP Tide 16. I will stop now.
 
Connect your ATV to your AVR and you’ll quickly see how well it performs, though some AVRs handle it better than others.

The desire for bitstream audio is as old as the ATV itself, and there have been plenty of rumors over the years. In classic Apple fashion, you’ll know bitstream audio has arrived only when it actually does. Until then, the ATV doesn’t support bitstream audio, but in my experience the Atmos MAT approach works quite well.
 
When your AVR receives a native 5.1 or 7.1 bitstream, it unpacks it to PCM and still has to decide how to route the channels based on your customized AVR settings. So it really makes no difference whether the Apple TV does the unpacking or the AVR (the AVR can only manipulate raw PCM before the DACs anyway).

Back in the day, the PS3 was the fastest BD player on the market when it came out but it only decoded codecs to PCM like the Apple TV is doing. I've compared PCM vs bitstream and it really makes no difference as long as the audio codec is supported and volume is level-matched.
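On level matching: a fair PCM-vs-bitstream comparison needs the two paths within a fraction of a dB of each other, which is easy to check with an RMS measurement of a captured test tone. A quick sketch (the 997 Hz tone and the 3 dB offset are just example values):

```python
import math

def rms_dbfs(samples):
    """RMS level of float samples (full scale = 1.0) in dBFS."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms) if rms > 0 else float("-inf")

# Same tone rendered twice, the second path running 3 dB hot:
tone = [0.5 * math.sin(2 * math.pi * 997 * i / 48000) for i in range(48000)]
hot = [s * 10 ** (3 / 20) for s in tone]
print(f"level offset: {rms_dbfs(hot) - rms_dbfs(tone):+.2f} dB")
```

Here the computed offset comes out at +3.00 dB, flagging the mismatch before any listening comparison is attempted.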

One would hope that the Apple TV, as one of the most expensive and most UI-responsive streaming devices on the market, has the proper chipsets and audio codec support to do the decoding properly. I can understand if you still have reservations, but this is the "science" portion speaking and not the heart.
 
Thanks for the ELI5; that was educational for me and helpful as I contemplate a multichannel setup using the Tide16 and an ATV.
 
No problem! I've been dabbling w/home theater gear for well over 20 years now (selling it as a part-timer for some of that) and just paying it forward.

Before I forget: I have been streaming on the ATV 4K for a number of years now, enjoying both Atmos movies and music. As someone who already subscribes to Apple Music and is very entrenched in the Apple ecosystem, you could say I'm a captive audience. LOL!
 
Just a point: how do you know the AVR is not discarding the other two channels when receiving bitstream? I can't think of a way to be sure; am I missing something?

As a new miniDSP user I can downmix the other two channels, and I know that it's happening.
 
Unfortunately, some older AVRs that receive PCM input will completely bypass any further processing. In those cases it absolutely will make a difference if the AppleTV has already done the unpacking, as it disables the AVR's processing.

The really bad part is that in some cases people get sound and assume everything is working correctly, when they may not actually be getting the optimally processed signal. Anything a bit non-standard in your setup (e.g. a video processor in the signal chain) can easily mess things up, and it won't necessarily be immediately obvious that anything is wrong.

This means there are endless examples of Apple fanboys (using their hearts) saying that everything is fine, "it just works". Science could show that in some cases (for specific combinations of equipment) it isn't working properly. But how many actually take the time to check that the surround channels have been assigned correctly, or that downmixing has been done with the right channel ratios?
 
How old are the AVRs that have this "deficiency"? I was working w/stuff from 2006-2008 back in the PS3 days!

Couldn't you take a full Atmos demo track, play it normally vs. downmixed to a 5.1 system (for example), and compare?
 
Have you done this checking? I'm curious about the statistics. How many instances have you counted where there were errors?

This is something that “most” people leave up to processors so my expectation is zero.
 
Not sure about the ATV, and it will of course depend a bit on your AVR, but for example when I send a 7.1 PCM signal from my MacBook to the Cinema 50, which has 4.3 speakers configured (L, R, SL, SR, 3 subs), I can use the test tones in the Mac's Audio MIDI Setup to send a signal to each channel individually. That way I can hear clearly whether downmixing is done, and it is.

Btw, for Marantz/Denon receivers, downmixing is not done when selecting "Direct" or "Pure Direct", which is of course intentional and correct.
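If you don't have per-channel test tones handy, the same check can be scripted: a WAV that plays a tone on one channel at a time makes misrouted or discarded channels immediately audible. A stdlib-only sketch (the channel count and file name are just examples; note that a plain WAV carries no channel-mask metadata, so the playback device's channel order applies):

```python
import math
import struct
import wave

def write_channel_id_wav(path, n_channels=6, rate=48000,
                         tone_hz=440.0, seconds=1.0):
    """Write a WAV with a tone cycling through each channel in turn."""
    n_frames = int(rate * seconds)
    with wave.open(path, "wb") as w:
        w.setnchannels(n_channels)
        w.setsampwidth(2)      # 16-bit PCM
        w.setframerate(rate)
        for active in range(n_channels):   # one channel at a time
            for i in range(n_frames):
                s = int(12000 * math.sin(2 * math.pi * tone_hz * i / rate))
                frame = [s if ch == active else 0
                         for ch in range(n_channels)]
                w.writeframes(struct.pack(f"<{n_channels}h", *frame))

write_channel_id_wav("channel_id_5_1.wav")
```

Play the file through the chain under test and note which speakers sound, and in what order.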
 
I am only relating to you my specific case from way back when the first gen AppleTV was released. So statistically 100% of the cases I've tested!

At about that time AVRs were going from 5.1 to 7.1 (I think I actually had one that did 6.1 Dolby Digital EX and DTS-ES Discrete). So it was a confusing time and things were evolving (to add further complexity I had a Lumagen video processor)!

In my setup, when the AppleTV output PCM, it did so with the incorrect channel order for my AVR (verified using calibration test files). The only way to fix this was to manually edit the EDID, which was done with the help of one of the lead developers of Kodi (XBMC at the time).

Following all the messing around with that original AppleTV, I just can't trust that Apple does the decoding properly. I'm only happy if the AVR can show me the input channels and signal type of the original bitstream. I don't ever want to see it say PCM or LPCM or Dolby MAT (or for the AVR to lie and say it is Atmos when it is really MAT).

The lack of passthrough audio eventually caused me to jump from first gen AppleTV hardware to Kodi running on an Intel NUC (and subsequently onto various android boxes).

When @staticV3 said that AppleTV now does bitstream it got me excited. I'd go back to AppleTV hardware if it really did bitstream, but I don't think that will ever happen.
 
While the conversion to LPCM is a non issue for many concerning the Apple TV 4K and some other devices, it does still pose a problem to those with systems supporting ARC but not eARC. Those multichannel PCM signals need more bandwidth than ARC can support.

Dolby MAT 2.0 works well enough, but many AVPs/AVRs will simply report the signal as "Atmos" when streaming Dolby Atmos from an Apple TV 4K, while some will more accurately display it as "Atmos/PCM."

Apple added a "Continuous Audio" setting to the Apple TV 4K in an update; it is not a bitstream/passthrough setting, and it is not supported on the Gen 1. As a result, Dolby Atmos playback on the Gen 1 is broken, and it must be disabled in settings or material streamed in Dolby Atmos will not play.

While I prefer the GUI of the Apple TV, I've been using the Nvidia Shield TV Pro more than anything else lately and will continue to do so until an update fixes my Apple TV 4K Gen 1. I do prefer bitstreaming to an LPCM conversion for a few reasons myself, and the Shield is a far more customizable device.

It is silly that Apple cannot get macOS and tvOS on the same page. My Mac mini M1 can bitstream/pass through Dolby Atmos, though only from the Apple TV and Apple Music apps.
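The ARC limitation above is easy to sanity-check with arithmetic. Legacy ARC is roughly an S/PDIF-class link (about 3 Mbit/s of audio payload), while eARC raises that to roughly 37 Mbit/s; both figures are approximate. Uncompressed multichannel PCM blows well past the former:

```python
def pcm_bitrate_mbps(channels, sample_rate_hz, bit_depth):
    """Raw PCM payload bit rate in Mbit/s, ignoring framing overhead."""
    return channels * sample_rate_hz * bit_depth / 1e6

# Stereo PCM fits within legacy ARC; 7.1 PCM (e.g. a MAT-decoded
# Atmos stream at 48 kHz / 24-bit) needs eARC:
print(pcm_bitrate_mbps(2, 48000, 24))   # 2.304 Mbit/s
print(pcm_bitrate_mbps(8, 48000, 24))   # 9.216 Mbit/s
```

This is why a MAT/LPCM signal that works fine over eARC or a direct HDMI input cannot be carried over plain ARC, whereas a compressed DD+ bitstream can.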
 
Sorry for the noob question, but with your Mac mini setup, do you use an external Apple TV box or the Apple TV app installed on the mini?
 
I have a 2nd gen ATV 4K purchased in 2021 and would be frustrated by that limitation in 1st gen.
 