I've got a 2-month-old cheap Marantz with powered studio monitors for my desktop 5.1 system. I just ran a highly instrumented "set it to Eco On, Pure Direct, play music, feel the top of the receiver" test. Cold, as if it weren't on. Ditto in stereo mode feeding the sub. In surround it's also driving the center and rears, so it gets a little warm, but not much.
Eco mode is fine if you only want to listen at low levels. I ran sweeps in-room with and without Eco mode; with Eco on there was horrendous clipping, distortion, and compression at higher MV. If I recall correctly, Audioholics did some tests with Eco mode, and all it really did was limit power to something very low, like 20-30 watts/ch. Since discovering with my own testing how badly Eco mode affects performance, and after seeing AH's results, I've left it off ever since. I *think* my 3300 is around 2 years old with zero issues. Despite the 3600's better bench tests, I don't see a real-world benefit for me personally that would make it worth spending $1100 right now to upgrade.
Edit: found the AH article:
"With Eco mode engaged, it limited power on the bench to 20 watts/ch no matter how many channels were driven. This is worse than the dreaded 4-ohm impedance switch many receiver companies are using today. Make triple sure you NEVER set this receiver to Eco if you plan on using the internal amplifiers. I'd go so far as to omit the button from the remote control in case someone accidentally hits it and engages that mode."
Loudest I listen is around -10 MV with 91 dB speakers at about 11'. The power from the internal amps is more than enough to support this level (rough sanity check below), with SINAD around -84 dB vs. -88 dB for the 3600. Both are inaudible, so the difference wouldn't change anything I actually hear. The pre-outs measure better on the 3600 with the amps disabled, but even if I were to add an external amp for my LCR, there are a couple of reasons the 3600 may not offer a real, audible benefit.
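Here's the sanity check, as a minimal sketch. The 91 dB sensitivity, ~11 ft distance, and the ~20 W Eco cap are from this thread; the ~100 W full-power figure, the free-field inverse-square falloff, and the single-speaker simplification are my assumptions, so treat the numbers as ballpark only:

```python
import math

def spl_at_seat(sensitivity_db, watts, distance_m):
    """Rough free-field SPL for one speaker: sensitivity in dB @ 1 W / 1 m,
    inverse-square loss only (ignores room gain and summing of speakers)."""
    return sensitivity_db + 10 * math.log10(watts) - 20 * math.log10(distance_m)

seat = 11 * 0.3048          # 11 ft in meters (~3.35 m)
for watts in (1, 20, 100):  # 1 W reference, Eco's ~20 W cap, ~full rated power (assumed)
    print(f"{watts:>3} W -> {spl_at_seat(91, watts, seat):.1f} dB SPL")
# ~80.5 / 93.5 / 100.5 dB SPL: Eco's cap covers moderate levels but leaves
# little headroom for peaks; full power only buys about 7 dB more.
```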
First, the 3500 (and hence the 3300) has a very clean pre-out signal up to 1.5 volts, at around -95 dB; that's plenty to drive most external amps and is indistinguishable from a -99 dB signal from the 3600. Second, even if I were to play at such a high volume that I needed up to 2 volts from the 3300's pre-outs, I don't think anyone would hear a difference between a -73 dB signal and a -99 dB signal at rock-concert levels. Both are still far, far below the distortion from the speakers themselves.
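To put those figures on one scale, here's a quick conversion from dB (relative to the signal) to percent distortion. The pre-out dB figures are from above; the ~1% (-40 dB) speaker distortion at spirited levels is my own ballpark assumption, not a measurement:

```python
def db_to_percent(db):
    """Convert a distortion/noise figure in dB (re: signal) to a percentage."""
    return 100 * 10 ** (db / 20)

for label, db in [("3300 pre-out at 2 V (worst case)", -73),
                  ("3300 pre-out at 1.5 V", -95),
                  ("3600 pre-out", -99),
                  ("typical speaker distortion*", -40)]:
    print(f"{label:33s} {db:+5d} dB  ->  {db_to_percent(db):.4f} %")
# *assumed ~1% at spirited levels -- my estimate, not a measurement.
# Even the "bad" -73 dB case is only ~0.02%, roughly 30 dB below the speakers.
```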
I'm glad to see that disabling the amps results in better *measured* performance; it seems like a cheap and easy way to ensure higher *measured* fidelity, and the 3600 is likely what I would buy if my 3300 bit the dust. But I question whether anyone could hear a difference in blind testing under almost any circumstance between the 3300 and the 3600, even using an external amp for the L/R or, more appropriately, for the LCR.
And if the external amp only covers the L/R, the 3600's measurement advantage would seem to disappear anyway: the center channel, which carries the heaviest demand in home theater and probably in multichannel music, would still be on an internal amp, and loading it would likely drag the pre-out signal back down to the same level.
Not trying to crap on the great measurements put up for this AVR; I'm happy to see them. I just wonder whether, in the real world under properly controlled test conditions, anyone could tell the difference between the 3500 and the 3600.
Having said all that, if I didn't own an AVR yet (or just needed one) and had to buy a 3500 or a 3600, I'd certainly get the 3600. But I'd probably wait for the 3600 to become last year's model at half price.