
Marantz AV10 AV Processor Review

Rate This AV Processor:

  • 1. Poor (headless panther)

    Votes: 5 1.6%
  • 2. Not terrible (postman panther)

    Votes: 21 6.8%
  • 3. Fine (happy panther)

    Votes: 83 27.0%
  • 4. Great (golfing panther)

    Votes: 198 64.5%

  • Total voters
    307
If eARC works for you and you don't need many inputs, then that's good. I think it's just as simple the other way around - in fact in my experience it's less prone to problems.



I don't know if this is in relation to my comment on standards. There is a standard for HD, Rec. 709, and one for Ultra HD, Rec. 2020; the TV switches automatically between the two.
To each their own, I guess. If one likes the AVR as a hub and it works, it works. TVs have been alternative hubs for more than 10 years, and many use them that way to connect higher-bandwidth streams directly.

An HD signal needs more post-processing than a 4K signal. It will be much softer and noisier, especially if it comes from an HD master (like an original Blu-ray). Depending on the processing capabilities of the display, one might want to adjust processing differently than for 4K. Then, in many cases, there is an even lower-quality signal for a lot of TV broadcast material, whether through a cable or internet box. This will also require some adjustments.

Finally, one would not want to over-process a high-quality 4K signal, but there might still be some relevant settings to adjust depending on the bit rate (stream vs. 4K disc) or frame rate.

Displays generally offer one or two film-mode or equivalent profiles, so having additional per-input and per-profile capabilities might be relevant to those who feel finer adjustments are needed.
 
Modern AVRs should support all contemporary video formats, resolutions, and refresh rates.
Some also have video overlays to display status information, a feature that I would miss.

The AV10 does all of this.
At one point, AVRs were doing video processing, but that was ill-conceived. I don't know of a single instance of video calibration using an AVR.

- Rich
 
An HD signal needs more post-processing than a 4K signal.

No, it doesn't; a 4K TV automatically upscales HD content to maximise the image quality of whatever incoming resolution it receives.

It will be much softer and noisier, especially if it comes from an HD master (like an original Blu-ray). Depending on the processing capabilities of the display, one might want to adjust processing differently than for 4K.

You don't need to adjust processing for either 4K or HD, certainly that would not be the purist approach. Of course it's your TV and you can do with it what you want to.

Depending on the processing capabilities of the display, one might want to adjust processing differently than for 4K. Then, in many cases, there is an even lower-quality signal for a lot of TV broadcast material, whether through a cable or internet box. This will also require some adjustments.

Finally, one would not want to over-process a high-quality 4K signal, but there might still be some relevant settings to adjust depending on the bit rate (stream vs. 4K disc) or frame rate.

None of these instances require specialised adjustment, but if you want to deviate from the image as presented, the adjustments you'd make would either be a matter of aesthetic preference, in which case you'd want them applied universally, or content-specific, in which case you'd apply them only to that content.

Displays generally offer one or two film-mode or equivalent profiles, so having additional per-input and per-profile capabilities might be relevant to those who feel finer adjustments are needed.

Each input has setting subsets that are specific to the incoming format. You can also store multiple profiles on the TV and swap between them as needed, if this is really a concern of yours.

You talk about simplicity, but it sounds like you prefer making things more complicated than they need or should be.
 
No eARC problems in any of my setups. And who really “intended” to route video through an audio processor when new displays are perfectly capable of doing their job as hubs, within the limits of their input capabilities? Actually, adding another component to the mix is when the mess really occurs, A/V sync being the worst of it.

And why would there be no need to adjust video per input, in your view? Like a streamer, 4K player, or BR player? If you expect these to perform well on common display video settings, then it looks like you might have an extraordinarily smart display.

Lucky you, the forums are riddled with complaints about eARC. These devices have literally been called Audio Video Receivers/Processors since the 1980s when they were invented and TVs didn't have multiple inputs, so yes, AVRs were always intended to serve as the audio and video hub. A/V sync is far better going through an AVR than through eARC from a display.

Why would one use separate 4k and BD players? Any modern HDMI device working properly should output essentially perfect video. Go ahead and explain what adjustments you are making between your devices.

Arguing that something simple should be made complicated is always a bad argument.

If there are reasons to make it complicated, I am all for it and do it all the time. But IME, in this case, unless you have input limitations or significant A/V sync issues with eARC, it is obviously the shortest path and theoretically the one prone to the fewest problems.

You are making zero sense. The AVR strips the audio from the video on the HDMI signal and is programmed to deal with whatever internal delays there are. In your eARC scenario, the TV strips the audio output instead and then has to sync it and send to the AVR. No delays/lags are accounted for. And how is that a shorter or simpler path?

It is actually very interesting that someone thinks HD and 4K need the same kind or level of video processing. I would be very interested to learn more about why that would be the case.

What processing are you referring to, exactly? Upscaling HD to 4k is done automatically by any 4k display...what difference would it make if you fed the devices to separate inputs on the TV?

To each their own, I guess. If one likes the AVR as a hub and it works, it works. TVs have been alternative hubs for more than 10 years, and many use them that way to connect higher-bandwidth streams directly.

Yeah, many people choose to use a more error-prone setup instead of using an AVR as it was intended... For the record, that practice mostly started because people were still using HD AVRs and had added 4k sources, not because there is some magical advantage to using the TV as a hub.

An HD signal needs more post-processing than a 4K signal. It will be much softer and noisier, especially if it comes from an HD master (like an original Blu-ray). Depending on the processing capabilities of the display, one might want to adjust processing differently than for 4K. Then, in many cases, there is an even lower-quality signal for a lot of TV broadcast material, whether through a cable or internet box. This will also require some adjustments.

Yeah, HD or 480p content needs to be upscaled, which any 4k display does automatically, without any effect whatsoever on 4k content. Again, how does feeding multiple TV inputs change the processing chain? You mention adjustments for specific content: what exact adjustments are you talking about, and wouldn't you have to do them regardless of whether you were using separate inputs on the display or not?


Finally, one would not want to over-process a high-quality 4K signal, but there might still be some relevant settings to adjust depending on the bit rate (stream vs. 4K disc) or frame rate.

Explain how a 4k display over-processes a 4k signal just because that input may also get a 1080p signal. Every 4k display has completely different settings/picture modes for 4k vs non-4k content. Again, you mention relevant settings for bit rate or frame rate -- what specific settings are you talking about and, again, why wouldn't you have to do them regardless of whether you are using separate inputs or not?


(ninja-ed by @KenMasters )
 
I will pass on any explanations as I was relatively clear in my previous posts, and honestly don't have much time right now. You are free to connect everything the way your heart desires, and do or skip any video processing on your display. It's a free world out there...

Best of luck with your audio/video endeavours.
 
I'm currently looking to get either the Marantz Cinema 30 or the AV10.

I have a 4.2.4 setup... and in my current AVR, I use pre-outs for my front L&R speakers.

Ideally I'd prefer the Cinema 30... but I'm not sure if I'd be missing out on the better DACs in the AV10.

Let me know your opinion.
Thanks.
 
No audible difference in DACs between the two. Nor any difference in software. Of course, with the AV10, you'll need to procure an additional 6 channels of amplification.
 
No audible difference in DACs between the two. Nor any difference in software. Of course, with the AV10, you'll need to procure an additional 6 channels of amplification.
I would have thought better sound from the AV10 due to better DACs? Yes, I would need an additional 6 channels of amplification.
 
I would have thought better sound from the AV10 due to better DACs? Yes, I would need an additional 6 channels of amplification.
No. Better-rated DACs just mean less THD+N. 96 dB SINAD is already CD quality, i.e. exactly as good as the lossless Dolby TrueHD/Atmos found on Blu-ray. Beyond that, it's just a measuring contest. In a blind ABX test, you couldn't distinguish between 70 dB and 100 dB SINAD, I assure you.
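For anyone who wants to check the arithmetic behind "CD quality", here is a quick sketch using the standard ideal-dynamic-range formula for PCM (illustrative only, not a measurement of either unit):

```python
def dynamic_range_db(bits: int) -> float:
    """Ideal dynamic range of an N-bit PCM system: 6.02 * N + 1.76 dB."""
    return 6.02 * bits + 1.76

def effective_bits(sinad_db: float) -> float:
    """Effective number of bits (ENOB) implied by a SINAD figure."""
    return (sinad_db - 1.76) / 6.02

print(f"16-bit PCM (CD): {dynamic_range_db(16):.1f} dB")             # ~98.1 dB
print(f"96 dB SINAD  -> {effective_bits(96):.1f} effective bits")    # ~15.7 bits
print(f"107 dB SINAD -> {effective_bits(107):.1f} effective bits")   # ~17.5 bits
```

In other words, anything in the mid-90s dB and up already covers the full resolution of 16-bit source material; higher numbers mostly win the measuring contest.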
 
No audible difference in DACs between the two. Nor any difference in software.
I expect you are right. I would be surprised if the same isn’t true of the Denon X6800H compared to the Cinema 30, and at a lower price.

I get the impression that Denon-Marantz are positioning Marantz as the target product for the music-oriented audiophile who is sucked in by things that they presume to be audible but actually aren’t, like toroidal transformers, HDAM modules, expensive DACs and slow DAC filters, and a ‘musical sound tuning guru’.
Of course, with the AV10, you'll need to procure an additional 6 channels of amplification.
Yes, if one doesn’t actually need its functionality, one is paying for look and feel and premiumosity.
 
No. Better-rated DACs just mean less THD+N. 96 dB SINAD is already CD quality, i.e. exactly as good as the lossless Dolby TrueHD/Atmos found on Blu-ray. Beyond that, it's just a measuring contest. In a blind ABX test, you couldn't distinguish between 70 dB and 100 dB SINAD, I assure you.
OK... Got it... Thanks for your reply. :)
 
Objectively, audio-quality-wise, I would consider those two equal if people listen in Direct/Pure Direct mode without DSP. The AV10's 2-3 dB better SINAD looks better on paper, but for real-world use, 103 or 107 dB SINAD = transparency anyway; no one can hear that level of noise/distortion in probably 99% of HT rooms. The other 1% might hear the noise in their exceptionally quiet rooms, though it's probably more like 0.01% lol.. So again, the two are practically equal in terms of bench test performance; the AV10's phono input did score a little better in frequency response, based on Amir's comments.
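As a rough illustration of why 103 vs. 107 dB SINAD is moot in a real room, here is a back-of-the-envelope calculation (the 105 dB peak level and 25 dB room noise floor are assumed for illustration, not measured values):

```python
# Rough illustration: where does the residual noise/distortion sit at a realistic playback level?
peak_spl = 105.0          # assumed reference-level peaks at the listening position, dB SPL
room_noise_floor = 25.0   # assumed noise floor of a quiet home theater, dB SPL

for sinad_db in (103.0, 107.0):
    residual_spl = peak_spl - sinad_db  # noise + distortion relative to the loudest peaks
    print(f"SINAD {sinad_db:.0f} dB -> residual around {residual_spl:+.0f} dB SPL, "
          f"about {room_noise_floor - residual_spl:.0f} dB below the room's own noise")
```

Either way the residual sits some 20+ dB below the noise the room itself makes, which is the point about transparency.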

If I remember right, @GXAlan might have spotted that the AVM90 did a little better at low output level, but also IIRC I pointed out to him that he was not comparing the same graphs; otherwise the two are, again, equal in that measurement.

If listening with DSP, such as when watching movies with room correction engaged, then all bets are off, because it would depend on one's preference for the so-called sound signature of each, and on one's skill/knowledge of room correction and how to use Audyssey/Dirac/ARCG effectively to one's liking.

In the feature department, though, the AV10 might be better for those who would like to use two different RC software packages, directional bass, tactile subwoofers, XLR inputs with their external DACs, more on-screen info display, diagnostics, etc., among other gadgets. For me, I would take either, but the AV10 would be slightly better because I happen to like DL, which can get me much prettier graphs (though that does not always equal audibly better sound).

If you read between the lines, I think Gene seems to like the AV10 a little more, and Theo clearly prefers the AVM90, but neither one seems to have an issue with their sound quality, obviously.
How do you like Auro 3D up-mixing?
 
Seldom discussed is the AV10's video processing hardware (which by all accounts is stellar); reviews from both Audioholics and Marantz discuss the extra video engineering built into the AV10. That stellar audio and video combination makes the AV10 a great deal for the combined audio/videophile crowd. Expensive, yes. A great deal compared to the market for these processors? Also a resounding yes.

IMHO, if you remove the video section for an "AV20", the only people who would buy the hypothetical unit would be audiophiles for the XLR outputs. If the AV10 is a niche market to Marantz, then an "AV20" would be even more niche.
I don’t think Anthem has any video processing? Personally, I prefer the source to do the processing, such as my Panasonic UBP9000, OPPO UHD-205, or Apple TV.
 
Yeah, after paying $15k or $50k for a video processor (which I don't have, BTW), I guess you need to start "seeing things" :facepalm:

Sony truly has the best video processing, but has been behind on the hardware. Their new mini-LEDs seem to address that, but then they are limited to 85", which is really small in the recent mini-LED space. At that size, I would still choose an 83" OLED as the reference display.
We need 85” or better yet 97” QD-OLED
 
Curious to hear from all of you who may understand these things better than I do.

I'm picking up an AV10, and for amps I have two NAD M23s for my main speakers and a NAD M28 for my Atmos speakers.

Based upon the measurements, would you use XLR cables or RCA cables to the amps?
 
You should always use XLR where possible. It will help prevent ground loops, particularly when you are running lots of external amps on different circuits. It'll also help reduce noise, though that is a secondary benefit.
 
Curious to hear from all of you who may understand these things better than I do.

I'm picking up an AV10, and for amps I have two NAD M23s for my main speakers and a NAD M28 for my Atmos speakers.

Based upon the measurements, would you use XLR cables or RCA cables to the amps?
It doesn't matter, but if you can hear ground-loop-related hum (unlikely), then use XLRs.
On paper, for the M23, I would use XLR because then you can set the amp to a lower gain setting (mid gain looks good). For the M28, it makes no difference because it has a fixed gain of 29.5 dB, so the higher voltage of the XLR output won't make a practical difference, though why not just use XLR if you already have the cables. The small differences in the bench measurements are too little to make any audible difference.
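If it helps, here is a rough sketch of the arithmetic behind that (the ~200 W full-power point and the ~2 V / ~4 V output levels are assumed typical values for illustration, not figures pulled from the NAD or Marantz spec sheets):

```python
import math

def db_to_ratio(db: float) -> float:
    """Convert a gain in dB to a voltage ratio."""
    return 10 ** (db / 20)

amp_gain_db = 29.5                 # fixed voltage gain mentioned above for the M28
rated_power_w, load_ohms = 200, 8  # assumed full-power point, for illustration only

full_power_v = math.sqrt(rated_power_w * load_ohms)       # ~40 V RMS at the speaker terminals
input_needed_v = full_power_v / db_to_ratio(amp_gain_db)  # preamp voltage needed to get there

print(f"29.5 dB gain = x{db_to_ratio(amp_gain_db):.1f} voltage gain")
print(f"Input for ~{rated_power_w} W into {load_ohms} ohms: {input_needed_v:.2f} V RMS")
# Typical processor outputs (~2 V RCA, ~4 V XLR) both exceed this comfortably,
# which is why the connector choice makes little practical difference into a fixed-gain amp.
```

With a variable-gain amp like the M23, the same math is why a lower gain setting plus the hotter XLR feed keeps you in a more comfortable part of the volume range.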
 
Thanks for the information.

On the NAD M28 (using XLRs) with my current Marantz 8805A, I'm getting a much-elevated sound level on the center channel when calibrating with MultiEQ-X. It was trying to reduce it down to -12 dB, which is its lowest limit. It shows the level as around -12.38 dB but can only go to -12 dB.

I know there can be several factors behind this. It is an extremely short speaker cable run of about 3-4 feet. But I wanted to get it down to a mid-negative range, and ended up using an RCA cable to bring it from the -12.38 dB range to about the -6 dB range.

I guess the 29.5 dB fixed gain along with the XLR was too much.

Thoughts?
 
What's the sensitivity of the center speaker? With the M28, you do have the option to use rca outputs for the center channel, if necessary.
 
What's the sensitivity of the center speaker? With the M28, you do have the option to use rca outputs for the center channel, if necessary.

From Paradigm's site

Sensitivity (Room / Anechoic): 94 dB / 91 dB



That's what I did: use an RCA cable for the center channel only. But before its level is set, it's clearly louder than any other speaker in my setup.
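For what it's worth, the numbers you describe line up with the usual ~6 dB difference between balanced and unbalanced outputs. A rough sanity check, assuming the XLR output runs at roughly twice the RCA voltage (illustrative, not measured):

```python
import math

# Rough sanity check, assuming the XLR output runs at roughly twice the voltage of the RCA output.
xlr_vs_rca_db = 20 * math.log10(2.0)   # ~6.0 dB hotter on the balanced output
trim_wanted_on_xlr = -12.38            # level the calibration wanted (below the -12 dB floor)
expected_trim_on_rca = trim_wanted_on_xlr + xlr_vs_rca_db

print(f"XLR is about {xlr_vs_rca_db:.1f} dB hotter than RCA")
print(f"Expected trim on RCA: about {expected_trim_on_rca:.1f} dB")  # ~ -6.4 dB, close to what you saw
# A 94 dB-sensitive center plus 29.5 dB of fixed gain simply leaves little trim headroom on the XLR feed.
```

So nothing looks broken; the high-sensitivity center plus the fixed-gain amp just runs out of trim range on the hotter XLR output.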
 