
New IOTAVX 17-Channel Home-Cinema AV Processor

Technomania

Member
Forum Donor
Joined
Feb 26, 2021
Messages
60
Likes
56
Location
UK
PCs and laptops have fewer and fewer DisplayPort outputs and increasingly either HDMI or just USB-C. A DisplayPort-to-HDMI adapter costs a few dozen dollars, same for USB-C to HDMI, so this is not a big issue.
Adapters can be used, of course, but an adapter is yet another cable, more cost and more waste, if you will, and adapters often lag behind new standards in reliability and signal integrity. Signal conversion in adapters can lose you VRR, for example, as many adapters are not designed to convert and pass through the entire feature set of the original signal. It's a pain. One adapter will work with G-Sync, another will not. HDMI 2.1 adapters have not seen any success so far, to the best of my knowledge. If adapters are to be used, they need to be designed properly, without skimping on features. I have not seen any DP-to-HDMI 2.1 adapter that successfully handles FRL signalling and the full package of video features going through the pipeline.

The information about DisplayPort is not accurate. DisplayPort is the primary video pipeline in all CPUs and GPUs, and there will never be less of it; quite the opposite. Every modern PC and laptop has DisplayPort signalling in one of four forms: a traditional DP port, a DisplayPort signal tunneled through USB-C, DisplayPort tunneled through USB-C Thunderbolt, and eDP (embedded DisplayPort) for the flat panels of laptops and other mobile devices. I have all of them at home, which gives me the flexibility to connect to literally any monitor. DP is here to stay, not going anywhere. The fact that it can run over several interfaces is brilliant. HDMI ports also feature on many motherboards and GPUs, which is great, as several interfaces simply give us more choice to connect to whatever display devices we wish, including TVs and projectors. It's about versatility. Just because a PC has an HDMI port as another option alongside DP does not mean that AVRs must stay so conservative about interfaces and support nothing apart from HDMI.

AVR companies could really try harder to finally catch up with the modern, diverse connectivity of home media entertainment devices and start installing at least one or two USB-C in/out ports with tunneled DP (40 Gbps), to give people more choices to connect whatever they please. It's ludicrous to stay an HDMI-only device in 2021 and claim "it's a hub". Silly and short-sighted.
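
To put a rough number on why that 40 Gbps tunnel is attractive, here is a back-of-the-envelope sketch in Python. All figures are approximate: the blanking totals are assumed CVT-R2-style values, and the tunnel's payload efficiency is an assumption, not a measurement.

# Back-of-the-envelope link budget (all figures approximate).
# Target: 4K @ 120 Hz, 10-bit RGB, reduced blanking.
h_total, v_total, refresh, bpc = 4000, 2222, 120, 10   # rough CVT-R2 totals
pixel_clock = h_total * v_total * refresh              # ~1.07 GHz
video_gbps = pixel_clock * 3 * bpc / 1e9               # ~32 Gbit/s of pixels

links = {
    # name: (raw line rate in Gbit/s, payload efficiency of line coding)
    "DP 1.4 HBR3 x4":        (32.4, 0.80),     # 8b/10b
    "HDMI 2.1 FRL (48G)":    (48.0, 16 / 18),  # 16b/18b
    "USB-C DP tunnel (40G)": (40.0, 0.80),     # assumed 8b/10b-class
}

print(f"4K120 10-bit RGB needs ~{video_gbps:.1f} Gbit/s")
for name, (raw, eff) in links.items():
    payload = raw * eff
    verdict = "fits" if payload >= video_gbps else "needs DSC"
    print(f"{name}: {payload:.1f} Gbit/s payload -> {verdict}")

On these rough numbers, plain DP 1.4 needs DSC for 4K120 10-bit, HDMI 2.1 FRL fits comfortably, and a 40 Gbps tunnel fits, just barely.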

This just protects you from having to throw your AVR in the bin when a new HDMI standard comes. The world would have significantly less trash if, instead of AVRs (i.e. managing audio + video), we just had multi-channel processors/amps, i.e. just managing audio, i.e. "AR".
I agree with you that trash is bad enough, hence my proposition to integrate everything well in one box, without the need for splitters, adapters, transmitters, etc. An "AR" is a good solution too, I support it, and there are plenty of DACs that can do audio only.

AVRs are not going to disappear, but they need to become much better by being modernised. Companies need to change the way they design AVRs' functionality, enrich the I/O ports and speed up the systems. They got the audio part mostly right and it is pretty mature; video support and interfaces are lagging behind. Many mainstream AVRs are, by decision and design, "trashable" after 5-6 years so that business can continue. Many AVRs have USB 2.0 only, which is totally bizarre...

This does not have to be the case, and Trinnov and madVR are proof of that: one box that does it all for AV for more than a decade, with free software upgrades and exchangeable parts, just like a PC. Modularity is the key. This modularity mindset is also arriving in the laptop sector, little by little. AV processors are modular enough, but still very high-end, niche and expensive. This will need to change if we are to produce less e-waste.

As long as there are consumers willing to buy trashable devices, companies will sell them, so education and an eco-mindset need to come from both sides. Consumers should expect integrated and modular AV features from companies, without the need to buy extra devices for conversion, correction and/or processing. The only things you would upgrade after 5-8 years are individual parts, such as a board, capacitors, graphics card, CPU, DAC chip, case or similar. That's why Sound United should have recalled all new AVRs to change the HDMI 2.1 chip rather than issuing that horrible adapter box, which is meant to correct the signal from only one source on only one port. Ridiculous. As long as they think "what is cheaper", those are the kinds of solutions we will see on the market, no matter how embarrassing it looks for the AVR sector.
 

Vincentponcet

Active Member
Forum Donor
Joined
Mar 13, 2020
Messages
248
Likes
106

But you cannot have an official 4K HDR source, like Netflix or Disney+, displayed through DisplayPort.
The content owners require HDCP 2.2 for that, and that is supported only on HDMI 2.0+.
VRR is an HDMI standard, not a DisplayPort one.
HDMI 2.1 VRR is still a buggy standard; there are incompatibilities between chips.
The equivalents on DisplayPort are NVIDIA G-Sync and AMD FreeSync.
Any GPU has had HDMI output for many years now.

A Trinnov costs $15K, the madVR box is about $10K; this is far from mainstream.

There are so many video standards, always changing, that dreaming of a universal box is just a dream, unless you go for the $15K systems with a $2K upgrade each time a new video standard is released.
For mainstream people, it would have been much simpler to have just multi-channel processors/amps plus a standalone, cheap switching box that you could replace easily, for less than $100, when a new standard is released, rather than today's situation, where a lot of people trashed their nice amps just because they didn't support 4K video. I prefer to have cheap adapters rather than trashing a nice amp just because a new video standard prevents me from using it.
 

Technomania

Member
Forum Donor
Joined
Feb 26, 2021
Messages
60
Likes
56
Location
UK
But you cannot have an official 4K HDR source, like Netflix or Disney+, displayed through DisplayPort. The content owners require HDCP 2.2 for that, and that is supported only on HDMI 2.0+.

Of course you can. HDCP 2.2 has been supported on DisplayPort since version 1.3 (2014). All new graphics cards and monitors support it too. Have a look: DisplayPort standard

VRR is an HDMI standard, not a DisplayPort one.

True. It was officially adopted in HDMI 2.1. However, it is based on, and originates from, VESA's Adaptive-Sync for DisplayPort from 2013.
HDMI 2.0 did not have its own sync feature. It was AMD that developed FreeSync, first for DP monitors and then for HDMI ports on monitors.
FreeSync and VRR over HDMI work in a similar way. One difference is how the two sync features are advertised in a device's EDID. They are so similar that LG TVs just needed to add a VRR range to their EDID in order to support both VRR and FreeSync.
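
For the curious, here is a minimal Python sketch of where the DP-side range lives: the Display Range Limits descriptor (tag 0xFD) of the base EDID, which is what FreeSync-over-DP keys on. HDMI Forum VRR is advertised elsewhere (in the HF-VSDB of the CTA extension block), which is exactly the difference mentioned above. The descriptor layout follows EDID 1.4; the sample bytes below are fabricated for illustration.

def refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the Display Range Limits descriptor."""
    assert edid[0:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID"
    for off in (54, 72, 90, 108):              # the four 18-byte descriptors
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]                  # min/max vertical rate, Hz
    return None

# Fabricated example: header + one range descriptor claiming 48-120 Hz.
fake = bytearray(128)
fake[0:8] = b"\x00\xff\xff\xff\xff\xff\xff\x00"
fake[54:61] = bytes([0, 0, 0, 0xFD, 0, 48, 120])
print(refresh_range(bytes(fake)))              # -> (48, 120)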

HDMI 2.1 VRR is still a buggy standard; there are incompatibilities between chips.

Sadly, this is true, as it has not universally matured. Chip and consumer-electronics manufacturers are to blame for the lack of sufficient interoperability testing. VRR mostly messes with HDR; most TVs cannot get the two right together and show gamma-curve issues.

Any GPU has had HDMI output for many years now.
Exactly. AVRs could finally start adopting the best practice of other companies that have been happy to provide several interfaces to their consumers, without any fuss, for years. I am glad you made this observation. It shows how inflexible AVRs' I/O boards are in comparison to GPUs. It's mind-boggling in this day and age to offer consumers only one type of port, even if there are 8-10 of the same port. Absurd.

A Trinnov costs $15K, the madVR box is about $10K; this is far from mainstream.
You are right. I made that point too. Those companies are showing future trends, where the mainstream might move in 5-10 years. Once this happens, devices are going to become cheaper, just like OLED TVs. It's about creating a critical mass and triggering a snowball effect in mass production. It will take time. AVR engineers and designers will need to adopt a host of features from AV processors for this to happen, including installing a full OS and faster CPUs. AVRs are already analog-digital hybrids; their digital side needs rapid modernisation. When that happens, you will not need to trash an AVR to gain access to new audio and video standards. Access will come via OS updates in Linux and perhaps a board change once every 5-7 years. This is why Trinnov is able to rapidly deploy new Atmos configurations, DTS:X Pro or Audio over IP (Ravenna). They receive the code from Dolby or DTS, embed it in their software, test it for 6 months and roll it out several years before any next generation of AVRs is ready for market. It's much faster. You can ask whether AVR companies are deliberately slow in rolling out new standards in order to force consumers to buy a new receiver every time a major change happens, like light-bulb manufacturers in the 20th century deliberately decreasing the lifespan of bulbs so that they could sell more.

There are so many video standards, always changing, that dreaming of a universal box is just a dream, unless you go for the $15K systems with a $2K upgrade each time a new video standard is released.
All software upgrades from Trinnov are free during the device's life cycle. This is their promise to owners. You pay a one-off huge fee and you are calm for 10-15 years. It's not about a dream, it's about the decisions companies make to earn money. A "universal box" can generate profit in a different way. Nobody forces them to produce a new line of receivers every 2-3 years; it's their own decision. They could change the way they operate and focus on charging for upgraded components rather than entire devices. There are ways to do this and we see companies trying. If more of them jump onto a similar path, devices will become more eco-friendly and can stay in our houses for longer.

For mainstream people, it would have been much simpler to have just multi-channel processors/amps plus a standalone, cheap switching box that you could replace easily, for less than $100, when a new standard is released, rather than today's situation, where a lot of people trashed their nice amps just because they didn't support 4K video. I prefer to have cheap adapters rather than trashing a nice amp just because a new video standard prevents me from using it.

Your preference is your freedom of choice. There are different configurations and there is no one-size-fits-all. Some home-theatre systems are based on audio amps only, some add video, depending on needs and devices. A growing number of households have speedy Wi-Fi mobile devices, and those consumers would like to see significant improvements in AVRs' network capabilities, to be able to send audio and video wirelessly to the AVR and then via cables to its speakers and any display. This means AVRs need to install Wi-Fi 6E chips, already available in PCs, phones, laptops and tablets. There is a huge change underway in the nature of home entertainment in millions of households. What is a simple solution for you is a hassle for others who appreciate versatility, diversity and mobility. It's not easy to be an AVR box in an environment so engulfed by flux.
 

Vincentponcet

Active Member
Forum Donor
Joined
Mar 13, 2020
Messages
248
Likes
106
I was talking about hardware upgrades. A new version of HDMI requires a new HDMI input/output board.
 

Vincentponcet

Active Member
Forum Donor
Joined
Mar 13, 2020
Messages
248
Likes
106
Standards for wireless multi-channel audio are very recent. Dante is one standard, used in the professional world. Only the expensive JBL processor uses it.
 

Technomania

Member
Forum Donor
Joined
Feb 26, 2021
Messages
60
Likes
56
Location
UK
I was talking about hardware upgrades
Fair enough, I understand now. You mentioned a "new video standard", which can be addressed through software and/or hardware. Dolby Vision was a new video format, loosely understood as a 'standard'. DV also has several flavours. It did not require a new board on all devices; it can be rolled out via a software update to AV processors. HDMI 2.1 does require a new board. You are right, but HDMI is also not really a 'standard'. It is a specification of a bunch of features that can be arbitrarily interpreted and adopted by companies, which is why we have such a mess now with HDMI 2.1. To be a standard, a technology must be understood in the same way by all parties. Examples of standards are IEEE Ethernet, PCIe 4.0 for PCs, DisplayPort 1.4, etc. Those must be adopted in the same way by all parties to work consistently. HDMI 2.1 is not that. It's a loose bunch of features, deliberately designed to be loose and watered down so that companies could introduce it in a variety of ways. One exception is Ultra High Speed cables.
 

rccarguy

Senior Member
Joined
May 9, 2020
Messages
373
Likes
133
That's why I think an AV pre should be audio-only. Like with the older ones: you used coaxial from the DVD player to the AV pre, then S-video or component from the DVD player to the TV... so if you upgraded the TV to component, what the AV pre had or hadn't got for video didn't make a difference.
 

ZeDestructor

Active Member
Joined
Jul 28, 2019
Messages
119
Likes
68

This is looking very interesting for my needs, which are (at the moment) 7.1ch from my PC, eventually going up to Atmos/DTS:X when I upgrade to a lot of small Genelecs. I do wonder how high the latency is, since I plan on using it with games...
 

mdsimon2

Major Contributor
Forum Donor
Joined
Oct 20, 2020
Messages
2,478
Likes
3,316
Location
Detroit, MI

linger63

Active Member
Joined
Jan 27, 2021
Messages
103
Likes
73
Location
Australia
For the measurement-obsessed, those specs are very unimpressive. Although I do like the XLR outputs, and considering the state of most AVRs and processors, one might as well not pay tons of money if they all measure rather poorly.

Michael

XLR out for ONE Sub...........pathetic
 

linger63

Active Member
Joined
Jan 27, 2021
Messages
103
Likes
73
Location
Australia
A simple passive splitter should fix that just fine, since LFE is considered directionless

Nope......
Multiple subs should ideally be treated separately, with independent controls, by the AVR or pre/pro.
Even now most don't, but my ancient Integra DHC-80.3 does.
 

mhardy6647

Grand Contributor
Joined
Dec 12, 2019
Messages
11,217
Likes
24,183
Warning: random, cantankerous musing ahead...

You know... the proliferation of channels in home theater puts me in mind of the calculation of "pi" to outrageous numbers of decimal places*. At some point, it's just noise, at the expense of bandwidth.

Since I now "know" (from the same reference, see footnote) that "39 digits" of pi will provide sufficient resolution to assess "the circumference of the observable universe to within the diameter of a single atom", and applying similar logic... how many channels are required to reproduce the vibrational spectrum of every atom of every molecule in the observable universe? :cool:
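
(That footnoted claim is easy to sanity-check, by the way. A two-line Python estimate, using rough sizes for the observable universe and a hydrogen atom:)

import math

# C = pi * d, so truncating pi after n significant digits perturbs the
# circumference by roughly d * 10**(1 - n).
d_universe = 8.8e26   # observable-universe diameter, metres (rough)
d_atom = 1.0e-10      # hydrogen-atom diameter, metres (rough)
digits = math.ceil(math.log10(d_universe / d_atom)) + 1
print(digits)         # 38 -- the same ballpark as the quoted "39 digits"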

OK, enough musing for now. More coffee is required...
____________
* https://www.audiosciencereview.com/forum/index.php?threads/news-you-cant-use.25873/#post-884043
 

ZeDestructor

Active Member
Joined
Jul 28, 2019
Messages
119
Likes
68
Nope......
Multiple subs should ideally be treated separately, with independent controls, by the AVR or pre/pro.
Even now most don't, but my ancient Integra DHC-80.3 does.

1. They are typically treated as a single channel *because* low sound frequencies are directionless. Why "waste" a channel generating the same signal output another channel is already generating? (Note: this is only true for low frequencies, under about 120 Hz or so.)
2. The Tonewinner AT-300 has 1 balanced sub out and 2 more single-ended outputs (three subwoofer outputs in total).

In general (though seemingly not on the AT-300), you can freely reassign any output channel to any location to suit your needs on AVRs and AVPs with more than a 7.1 output configuration. They usually provide multiple subwoofer outputs for volume-control reasons.
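
To make the distinction concrete, here is a minimal bass-management sketch in Python with numpy/scipy. The crossover, gains and delays are made-up example values, not any particular AVR's DSP: a passive splitter can only duplicate the summed LFE feed, while independent outputs let you trim each sub's level and delay separately.

import numpy as np
from scipy.signal import butter, sosfilt

FS = 48_000  # sample rate, Hz

def bass_manage(channels: np.ndarray, crossover_hz: float = 120.0):
    """Sum everything below the crossover into one mono LFE feed."""
    sos = butter(4, crossover_hz, btype="low", fs=FS, output="sos")
    return sosfilt(sos, channels.sum(axis=0))

def sub_feed(lfe: np.ndarray, gain_db: float, delay_ms: float):
    """Per-subwoofer trim -- the part a passive splitter cannot do."""
    pad = np.zeros(int(FS * delay_ms / 1000))
    return np.concatenate([pad, lfe]) * 10 ** (gain_db / 20)

# Example: five main channels of noise -> one LFE feed -> two sub
# outputs with independent (made-up) level and distance corrections.
mains = np.random.randn(5, FS)
lfe = bass_manage(mains)
sub_front = sub_feed(lfe, gain_db=-3.0, delay_ms=5.2)
sub_rear = sub_feed(lfe, gain_db=-6.0, delay_ms=11.8)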
 

linger63

Active Member
Joined
Jan 27, 2021
Messages
103
Likes
73
Location
Australia
1. They are typically treated as a single channel *because* low sound frequencies are directionless. Why "waste" a channel generating the same signal output another channel is already generating? (Note: this is only true for low frequencies, under about 120 Hz or so.)
2. The Tonewinner AT-300 has 1 balanced sub out and 2 more single-ended outputs (three subwoofer outputs in total).

In general (though seemingly not on the AT-300), you can freely reassign any output channel to any location to suit your needs on AVRs and AVPs with more than a 7.1 output configuration. They usually provide multiple subwoofer outputs for volume-control reasons.

There are plenty of articles you should read, including this one: https://www.reddit.com/r/hometheater/comments/6ixtjz
 

Everett T

Addicted to Fun and Learning
Forum Donor
Joined
Aug 2, 2020
Messages
648
Likes
486
1. They are typically treated as a single channel *because* low sound frequencies are directionless. Why "waste" a channel generating the same signal output another channel is already generating? (Note: this is only true for low frequencies, under about 120 Hz or so.)
2. The Tonewinner AT-300 has 1 balanced sub out and 2 more single-ended outputs (three subwoofer outputs in total).

In general (though seemingly not on the AT-300), you can freely reassign any output channel to any location to suit your needs on AVRs and AVPs with more than a 7.1 output configuration. They usually provide multiple subwoofer outputs for volume-control reasons.
If you're using the internal EQ, that would make the settings global. That isn't what you want when doing bass management.
 

hemiutut

Member
Joined
Aug 15, 2020
Messages
97
Likes
87
Location
España
It would be good to see reviews with objective data, even though this seems to be a very good proposal for those looking for a processor at a very attractive price.
Written with Google Translate.
Greetings
 