
Marantz AV10 AV Processor Review

Rate This AV Processor:

  • 1. Poor (headless panther): 5 votes (1.6%)
  • 2. Not terrible (postman panther): 21 votes (6.8%)
  • 3. Fine (happy panther): 83 votes (27.0%)
  • 4. Great (golfing panther): 198 votes (64.5%)
  • Total voters: 307
Everyone uses whatever they feel is the best solution for their system. Some of the people with the most expensive systems (systems next to which the AV-10 is cheap by comparison) don't use Dirac, as they don't think it is up to the task at that level.

One can only tell by actually trying it and deciding what is best for one's own room and system.
 
Has anyone tried OCA's (Obsessive Compulsive Audiophile's) Audyssey One with the REW calibration process?

I hear the results are much improved over the typical Audyssey sweep built into the receiver/processor.
Agreed with others that you need to try it yourself, because sound quality is subjective, unless you are okay relying on measurements. Audyssey One, based on OCA's videos, seems easy and quick enough to try. Interestingly, it actually got noticed by D+M, but I have no idea why they would include some sort of warning; I would think all they needed to do was emphasize that Audyssey One is not one of their products. As for the warning about "damage", that seems far-fetched to me. If I were still using Denon or Marantz for my HT, I would have tried it for sure, but I am only using my little Denon for 2-channel use with a tiny sub, so I would just use Dirac (PC version), unless someone tells me Audyssey One can still be beneficial vs. DLBC even for 2.1 use.
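For anyone curious what the REW-based approach boils down to, here is a minimal sketch (this is not OCA's actual pipeline; the file name, target slope, and boost cap are illustrative assumptions) of deriving a correction curve from a REW text export:

```python
import numpy as np

# Minimal sketch of REW-measurement-based correction. Assumes a REW
# "Export measurement as text" file: header lines start with '*',
# data lines are "freq_Hz  SPL_dB  phase_deg".
freqs, spl = np.loadtxt("measurement.txt", usecols=(0, 1), unpack=True, comments="*")

# Hypothetical house curve: flat to 200 Hz, then gently sloping down,
# level-aligned to the measured midrange.
target = np.where(freqs <= 200, 0.0, -0.5 * np.log2(freqs / 200))
target += np.mean(spl[(freqs > 400) & (freqs < 4000)])

# Correction is target minus measurement, with boost capped at 6 dB
# (trying to fill deep nulls with boost is counterproductive).
correction = np.clip(target - spl, -12.0, 6.0)

for f in (30, 60, 120, 250, 500, 1000):
    i = np.argmin(np.abs(freqs - f))
    print(f"{f:>5} Hz: {correction[i]:+5.1f} dB")
```

The real tools add smoothing, windowing, and phase handling; the point is only that the correction comes from an in-room measurement compared against an explicit target curve.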

 
Seldom discussed is the AV10's video processing hardware (which by all accounts is stellar); reviews from both Audioholics and Marantz discuss the extra video engineering built into the AV10. That stellar audio and video combination makes the AV10 a great deal for the combined audio/videophile crowd. Expensive? Yes. A great deal compared to the rest of the market for these processors? Also a resounding yes.

IMHO, if you removed the video section to make an "AV20", the only people who would buy the hypothetical unit would be audiophiles after the XLR outputs. If the AV10 is a niche market for Marantz, then an "AV20" would be even more niche.
 
It would be interesting to know if someone has plugged in a PC and played something from the GPU with all the graphics options on, to check whether the signal gets altered.
 
Video is not my forte, but it is an interest, and there is something that confuses me.

If your video player has all the right features (e.g. 4K or higher, Dolby Vision, HDR10), and your TV or projector has them too, then what use are the same features in the AVR/AVP?

Does the AVR/AVP video circuitry add any goodness? Can an AVR/AVP with extra-special video circuitry do better than one with less fancy video circuitry, as long as both tick the boxes on the features list?

I ask because my AV gear is outdated, and I am unsure where money is well spent on video, beyond simply having the right features list.
 
Does the AVR/AVP video circuitry add any goodness? Can an AVR/AVP with extra-special video circuitry do better than one with less fancy video circuitry, as long as both tick the boxes on the features list?
No, you don't want the AVR/processor to do anything to the video; it only needs to transparently pass whatever the source content is to the display. If you want the volume control overlaid on the display, then the AVR/processor needs the right hardware/software to do that without impacting picture quality, but nowadays this is pretty standard even for entry-level AVRs.

If you don't do high-end gaming, pretty much any AVR/processor on the market will do in terms of video. This assumes you only have modern HDMI sources, of course.
 
If you have an outdated AVR between more modern source and display components (i.e. source-AVR-display connections), you will be limited to the features that AVR supports for both audio and video. The AVR should still support video pass-through, but check whether it also passes advanced video formats like DV or HDR and a 4K signal.

As noted, if your display and AVR have ARC or, even better, eARC support, then you can connect the source directly to the display and send the audio back to the AVR via the HDMI ARC or eARC ports.

For enhancing video you would need a separate video processor, but those tend to be expensive. They are commonly used with projectors, but not so commonly with TVs, which already have all the features you need (i.e. 4K, DV, HDR, their own video processing). Depending on the brand and model, TVs tend to do a relatively decent job of video processing. Some brands have better processing than others, and Sony is usually considered to do the best job.
 
Video is not my forte, but it is an interest, and there is something that confuses me.

If your video player has all the right features (e.g. 4K or higher, Dolby Vision, HDR10), and your TV or projector has them too, then what use are the same features in the AVR/AVP?

Does the AVR/AVP video circuitry add any goodness? Can an AVR/AVP with extra-special video circuitry do better than one with less fancy video circuitry, as long as both tick the boxes on the features list?

I ask because my AV gear is outdated, and I am unsure where money is well spent on video, beyond simply having the right features list.
The processor is part of the video chain, and you either want it to do no harm or to improve quality.

If you have an AVR that can process video (i.e. upscaling, deinterlacing, sharpness, etc.), perhaps just see which looks better: pass-through or AVR processing. It's probably different for everyone depending on the quality of the video chain, i.e. some TVs are really bad at some of this stuff.
 
The processor is part of the video chain, and you either want it to do no harm or to improve quality.

If you have an AVR that can process video (i.e. upscaling, deinterlacing, sharpness, etc.), perhaps just see which looks better: pass-through or AVR processing. It's probably different for everyone depending on the quality of the video chain, i.e. some TVs are really bad at some of this stuff.
I would say if that is the case, then it's time for a new TV. New mini-LEDs are amazing even in the lower price range. AVRs were never really meant to do any serious video processing, and they don't do it well, as the companies that produce them don't have expertise in video processing. These are just convenience features so that they can add the "V" to "AVR". An AVR, even the best one, should really be treated as a video switcher at best.

Given that this is a purist forum, the cleaner approach is to connect video sources directly to an eARC display and loop the audio back to the AVR. This way the higher-bandwidth signal (video) takes the shortest path and the lower-bandwidth signal (audio) goes through a hub. You also get the benefit of display-level video adjustments/profiles per input, which is lost if you go through the AVR (although there will still be adjustment by video signal type, i.e. HD, HDR, DV). Modern TVs with multiple inputs are actually made to accommodate that approach, within the limits of their input count. Some people run out of inputs, so yes, an AVR with 8 HDMI inputs will serve as a primary or secondary video switch in that respect.
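To put rough numbers on that bandwidth asymmetry, a back-of-the-envelope sketch (raw payload rates only; HDMI FRL/TMDS encoding overhead and blanking intervals are ignored):

```python
# Rough payload data rates for the video leg vs. the audio return leg.
def video_gbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

print(f"4K120 10-bit 4:4:4 video: ~{video_gbps(3840, 2160, 120, 10):.0f} Gbps")  # ~30 Gbps

# Dolby TrueHD (the core of lossless Atmos tracks) peaks around 18 Mbps,
# and eARC's audio channel carries up to roughly 37 Mbps.
print("Lossless Atmos audio: <= ~0.018 Gbps; eARC capacity: ~0.037 Gbps")
```

So the video leg carries roughly a thousand times more data than the audio leg, which is the basis of the shortest-path argument.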
 
I would say if that is the case, then it's time for a new TV. New mini-LEDs are amazing even in the lower price range. AVRs were never really meant to do any serious video processing, and they don't do it well, as the companies that produce them don't have expertise in video processing. These are just convenience features so that they can add the "V" to "AVR". An AVR, even the best one, should really be treated as a video switcher at best.

Given that this is a purist forum, the cleaner approach is to connect video sources directly to an eARC display and loop the audio back to the AVR. This way the higher-bandwidth signal (video) takes the shortest path and the lower-bandwidth signal (audio) goes through a hub. You also get the benefit of display-level video adjustments/profiles per input, which is lost if you go through the AVR (although there will still be adjustment by video signal type, i.e. HD, HDR, DV). Modern TVs with multiple inputs are actually made to accommodate that approach, within the limits of their input count. Some people run out of inputs, so yes, an AVR with 8 HDMI inputs will serve as a primary or secondary video switch in that respect.
I have a couple of friends whose video arguments are equivalent to audiophile arguments.

They just see things, and I have no idea what they are seeing. They can see dead people :)
 
Yeah, after paying $15k or $50k for a video processor (which I don't have, BTW), I guess you need to start "seeing things" :facepalm:

Sony truly has the best video processing but has been behind on the hardware. Their new mini-LEDs seem to address that, but they are limited to 85", which is really small in the current mini-LED space. At that size, I would still choose an 83" OLED as a reference display.
 
I am still confused. I asked my questions above because of the quote below:
Seldom discussed is the AV10's video processing hardware (which by all accounts is stellar); reviews from both Audioholics and Marantz discuss the extra video engineering built into the AV10.
Are we actually saying that an AV10 will give superior video to a lower-range AVR/AVP that has the same list of video features, like Dolby Vision, HDR10, etc.?

On what basis would that be true, given that it isn't actually a video processor unit?
 
I am still confused. I asked my questions above because of the quote below:

Are we actually saying that an AV10 will give superior video to a lower-range AVR/AVP that has the same list of video features, like Dolby Vision, HDR10, etc.?

On what basis would that be true, given that it isn't actually a video processor unit?
I have never tested it, but I doubt it does any additional magic to the video. I just use pass-through, and it does that as well as expected. A newer-model TV will have much more sophisticated video processing.

One could potentially argue that it will do it better due to an overall lower level of noise and interference in the box, but I really can't see a difference between my noisy, hot Denon 6700H and the quiet, cool AV-10 in this respect.
 
I am still confused. I asked my questions above because of the quote below:

Are we actually saying that an AV10 will give superior video to a lower-range AVR/AVP that has the same list of video features, like Dolby Vision, HDR10, etc.?

On what basis would that be true, given that it isn't actually a video processor unit?
It won't have superior video over any properly functioning receiver. Virtually any modern receiver is completely transparent for video (hell, the vast majority of them use the exact same HDMI sub-board).
 
Given that this is a purist forum, the cleaner approach is to connect video sources directly to an eARC display and loop the audio back to the AVR. This way the higher-bandwidth signal (video) takes the shortest path and the lower-bandwidth signal (audio) goes through a hub. You also get the benefit of display-level video adjustments/profiles per input, which is lost if you go through the AVR (although there will still be adjustment by video signal type, i.e. HD, HDR, DV). Modern TVs with multiple inputs are actually made to accommodate that approach, within the limits of their input count. Some people run out of inputs, so yes, an AVR with 8 HDMI inputs will serve as a primary or secondary video switch in that respect.
I'm sorry, but this is not good advice, IMO. eARC has all kinds of problems that are 100% eliminated by routing through the AVR/processor as intended. Any modern receiver passes through video transparently, including for gaming (from a console, anyway). There should be zero need to adjust video settings by source with modern HDMI components, and even if there were, IMO it wouldn't offset the hassle and errors that eARC has a strong potential to create.
 
I'm sorry, but this is not good advice, IMO. eARC has all kinds of problems that are 100% eliminated by routing through the AVR/processor as intended. Any modern receiver passes through video transparently, including for gaming (from a console, anyway). There should be zero need to adjust video settings by source with modern HDMI components, and even if there were, IMO it wouldn't offset the hassle and errors that eARC has a strong potential to create.
No eARC problems in any of my setups. And who really "intended" for video to be routed through an audio processor, when new displays are perfectly capable of doing their job as hubs, within the limits of their input capabilities? Actually, adding another component to the mix is when the mess really occurs, A/V sync being the worst of it.

And why would there be no need to adjust video by input, in your view? Like a streamer, a 4K player, a BR player? If you expect these to perform well on common display video settings, then it looks like you might have an extraordinarily smart display.
 
No eARC problems in any of my setups. And who really "intended" for video to be routed through an audio processor, when new displays are perfectly capable of doing their job as hubs, within the limits of their input capabilities?

That's it right there: they are limited to 2 or 4 HDMI 2.1 inputs, whereas an AVR or processor lets you connect and switch conveniently between a lot more devices.

Actually, adding another component to the mix is when the mess really occurs, A/V sync being the worst of it.

Find out what the display lag is (it can be eyeballed with a test pattern, measured with a lag tester, or simply lifted from a review), enter it per input into the processor or AVR, done; it's not complicated. See the sketch below.
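A numeric illustration with hypothetical values (the latency figures and setting names are assumptions, not measurements; they vary by brand):

```python
# A/V sync sketch: if the display shows a frame later than the audio chain
# plays the matching sound, delay the audio by the difference.
display_lag_ms = 45   # e.g. measured with a lag tester or lifted from a review
audio_chain_ms = 5    # assumed DSP latency of the AVR/processor audio path

audio_delay_ms = max(0, display_lag_ms - audio_chain_ms)
print(f"Set the per-input audio delay (lip sync) to ~{audio_delay_ms} ms")
```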

And why would there be no need to adjust video by input, in your view? Like a streamer, a 4K player, a BR player? If you expect these to perform well on common display video settings, then it looks like you might have an extraordinarily smart display.

They all adhere to the same standards; if your source deviates significantly (unlikely), it's not fit for purpose. TVs all come with an accurate preset, and Filmmaker Mode is common these days and very accurate across a number of brands.
 
Arguing that something simple should be made complicated is always a bad argument.

If there are reasons to make it complicated, I am all for it, and I do it all the time. But IME in this case, unless you have input limitations or significant A/V sync issues with eARC, it is obviously the shortest path and theoretically the one prone to the fewest problems.

It is actually very interesting that someone thinks HD and 4K need the same kind or level of video processing. I would be very interested to learn the reasons why that would be the case.
 
Arguing that something simple should be made complicated is always a bad argument.

If there are reasons to make it complicated, I am all for it, and I do it all the time. But IME in this case, unless you have input limitations or significant A/V sync issues with eARC, it is obviously the shortest path and theoretically the one prone to the fewest problems.

If eARC works for you and you don't need many inputs, then that's good. I think it's just as simple the other way around; in fact, in my experience it's less prone to problems.

It is actually very interesting that someone thinks HD and 4K need the same kind or level of video processing. I would be very interested to learn the reasons why that would be the case.

I don't know if this is in relation to my comment on standards. There is a standard for HD, Rec. 709, and one for Ultra HD, Rec. 2020; the TV switches automatically between the two.
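For a sense of what separates the two, here is a minimal sketch of mapping linear Rec. 709 RGB into the Rec. 2020 container using the conversion matrix from ITU-R BT.2087 (transfer functions and tone mapping omitted):

```python
import numpy as np

# Linear-light RGB, Rec. 709 primaries -> Rec. 2020 primaries (ITU-R BT.2087).
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

rgb709 = np.array([1.0, 0.0, 0.0])   # pure Rec. 709 red, linear light
print(M_709_TO_2020 @ rgb709)        # ~[0.627, 0.069, 0.016]
```

The output shows that saturated 709 red sits well inside the 2020 gamut, which is why the display has to know which standard the incoming signal uses.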
 