
Emotiva RMC-1 AV Processor Review

Casey Leedom

Member
Joined
Dec 7, 2019
Messages
69
Likes
31
Location
Palo Alto, CA, USA
The one thing the Monoprice HTP-1 is missing is an analog XLR input and a pass-through mode for it. That would have allowed you to hook something like a Topping D90 to it and use the HTP-1 as a simple preamplifier. Note that I’ve been told that all of the HTP-1’s analog inputs go through an analog-to-digital conversion.
Casey
 

thxultra

Member
Joined
Feb 26, 2020
Messages
40
Likes
26
The one thing the Monoprice HTP-1 is missing is an analog XLR input and a pass-through mode for it. That would have allowed you to hook something like a Topping D90 to it and use the HTP-1 as a simple preamplifier. Note that I’ve been told that all of the HTP-1’s analog inputs go through an analog-to-digital conversion.
Casey
No on screen display either. That is a deal breaker for me.
 

Casey Leedom

Member
Joined
Dec 7, 2019
Messages
69
Likes
31
Location
Palo Alto, CA, USA
To be honest, I don’t care about On Screen Displays (though I do understand that others do care). I can say that the On Screen Display was a source of a lot of agony for Emotiva because of the need to handle it overlaid on all sorts of different resolutions and formats. At one point Lonnie asked folks if the continued efforts were worth the investment. So I definitely understand Monoprice/ATI’s decision not to do one.
Casey
 

thxultra

Member
Joined
Feb 26, 2020
Messages
40
Likes
26
To be honest, I don’t care about On Screen Displays (though I do understand that others do care). I can say that the On Screen Display was a source of a lot of agony for Emotiva because of the need to handle it overlaid on all sorts of different resolutions and formats. At one point Lonnie asked folks if the continued efforts were worth the investment. So I definitely understand Monoprice/ATI’s decision not to do one.
Casey

At this price point, and even though I think people put them in racks where they aren't visible, I don't know how you can leave an OSD out. At $4k it should be there; even a cheap AV receiver has one.
 

Magnus

Member
Joined
Feb 25, 2020
Messages
88
Likes
67
Any $5,000 amp that requires a cheat code to play it (every time) is a joke

My god.... THAT is your conclusion for a quick and dirty way to get around the bug (an INAUDIBLE bug at that for most people) until the firmware is publicly released? Did you ever have the slightest intention of considering this unit or do you just enjoy reading test results? I get the feeling most on here don't give two bits about any of the items being reviewed. They have 5.1 (as one person even admitted) or whatever and just like reading technical data and looking at graphs.

As I've said before, D&M products get poor reviews here too for measurements. I haven't found a single user of them yet that says they sound awful. That was my entire point from the start. Measurements are nice to look at, but concluding a product is worthless by them when the results are likely not audible is absurd to me. If my machine sorts candy bars just fine within specs do I really care if the triggering signal is a perfect square wave on the machine? No, not until it's a problem. I'd be looking for things that are actually affecting production, not harping about things that aren't. In Emotiva's case, I think they have far bigger problems to worry about than improving an already perfectly usable measurement (like major bugs). But no, they've been diverted to chasing minor bugs instead because of people making a loud stink about something that has no audible impact.... Amazing. It's a bit like complaining the right footwell light isn't quite as bright as the left one and demanding the designers fix it RIGHT NOW when the ABS brakes still don't work.... :rolleyes:
 
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,757
Likes
242,215
Location
Seattle Area
My god.... THAT is your conclusion for a quick and dirty way to get around the bug (an INAUDIBLE bug at that for most people) until the firmware is publicly released? Did you ever have the slightest intention of considering this unit or do you just enjoy reading test results? I get the feeling most on here don't give two bits about any of the items being reviewed. They have 5.1 (as one person even admitted) or whatever and just like reading technical data and looking at graphs.
The membership here didn't design the product. And they didn't test or review it. Emotiva did the former. I did the latter.

Their role is to put a spotlight on the measured performance of this product so that the next version is better.

The only reason to get upset at them is a lack of desire for improvement in these AV products. If that is your motivation, then you don't belong here.
 
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,757
Likes
242,215
Location
Seattle Area
As I've said before, D&M products get poor reviews here too for measurements. I haven't found a single user of them yet that says they sound awful.
Not sounding awful is not our criteria here. We are used to superb measured performance, fantastic design and great usability in audio products. We won't stop until we identify such jewels and shower them with our praise.

No AV product tested so far meets the performance bar that is advertised (implicitly or explicitly). We are going to continue to test them. We don't care about your criteria because we are advocates for consumers. Your criteria are met by any audio product, regardless of the competence of its design and ultimate fidelity.
 
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,757
Likes
242,215
Location
Seattle Area
Measurements are nice to look at, but concluding a product is worthless by them when the results are likely not audible is absurd to me.
The product is not said to be useless. It decodes audio and sends it to many speakers. Should it one day get room EQ, that would improve the sound too. Our objection is to the product saying "Redefine High-end" on its LED display when you power it on. It doesn't even meet the bar of high-end, let alone redefine it.

This is what the top level description says about the product:
[Attached screenshot: Emotiva's top-level product description]

Plenty of people are buying it based on what I have highlighted. Repeatable and objective measurements show that statement to be not remotely true.

Your angst and frustration should be aimed at Emotiva if you care about us as consumers. They should not advertise state-of-the-art performance and then deliver something that is not even as good as mass-market, dirt-cheap stereo audio products.

Statements like the above convinced me to go into this review expecting superb performance. Even I was not immune to the power of their promotion.

Decide what your purpose in this domain is. Papering over faults in audio products is not something anyone here subscribes to. Be part of a change for the better.
 

Doodski

Grand Contributor
Forum Donor
Joined
Dec 9, 2019
Messages
21,669
Likes
21,953
Location
Canada
If my machine sorts candy bars

like complaining the right footwell light isn't quite as bright as the left one and demanding the designers fix it RIGHT NOW when the ABS brakes still don't work

I would be displeased if this were a $1000.00 AV processor with bugZ, never mind this Emotiva RMC-1 for big kilobucks.
It reminds me of the Sony STR-AVxxxx receiver series of the mid-'90s.
They would turn on in the middle of the night, go to full volume and then start changing FM radio stations randomly.
Sony denied the issue for several months until it was proven with a storage oscilloscope that the grounds were intermittently floating and the processor was going nutterZ.
People continued purchasing them, Sony continued selling them in large numbers, and techs globally had to fix every single one of them.
It's no way to do business, and this Emotiva RMC-1 bugZ stuff is no way to do business.
 
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,757
Likes
242,215
Location
Seattle Area
In Emotiva's case, I think they have far bigger problems to worry about than improving an already perfectly usable measurement (like major bugs). But no, they've been diverted to chasing minor bugs instead because of people making a loud stink about something that has no audible impact.... Amazing.
What is more amazing is someone thinking performance improvements, when they come as easily as a tiny change in a setting, should be left on the table. Be thankful that the work we have done for free, at our expense to ship this unit back and forth, is helping them debug the product and make it better.

The disservice is yours: by clouding the issue with these word arguments, you slow down the pace of getting issues resolved. They don't need a consumer acting as their PR person. Let them hear the message loud and clear so they resolve the problems, in this unit or future ones.
 

Dj7675

Major Contributor
Forum Donor
Joined
Jan 12, 2019
Messages
2,145
Likes
2,821
My god.... THAT is your conclusion for a quick and dirty way to get around the bug (an INAUDIBLE bug at that for most people) until the firmware is publicly released? Did you ever have the slightest intention of considering this unit or do you just enjoy reading test results? I get the feeling most on here don't give two bits about any of the items being reviewed. They have 5.1 (as one person even admitted) or whatever and just like reading technical data and looking at graphs.

As I've said before, D&M products get poor reviews here too for measurements. I haven't found a single user of them yet that says they sound awful. That was my entire point from the start. Measurements are nice to look at, but concluding a product is worthless by them when the results are likely not audible is absurd to me. If my machine sorts candy bars just fine within specs do I really care if the triggering signal is a perfect square wave on the machine? No, not until it's a problem. I'd be looking for things that are actually affecting production, not harping about things that aren't. In Emotiva's case, I think they have far bigger problems to worry about than improving an already perfectly usable measurement (like major bugs). But no, they've been diverted to chasing minor bugs instead because of people making a loud stink about something that has no audible impact.... Amazing. It's a bit like complaining the right footwell light isn't quite as bright as the left one and demanding the designers fix it RIGHT NOW when the ABS brakes still don't work.... :rolleyes:
The importance of sites like this, Audioholics, etc. that do measurements is that manufacturers like Emotiva are not publishing any specs or measurements. Other manufacturers are vague or misleading. How do you even know if basic engineering performance is met without trusted published specs or measurements? In previous posts you talk about how these issues are inaudible. There are many published scientific studies that show differently than you believe. Right now we are looking at the RMC-1 with a SINAD (signal vs. noise + distortion) of 75, 85, or 95 dB depending on the firmware, the mode it is in, or who knows what at this point. My point is this: at what point do you care... 70, 60, 50? How do you know how well the unit is engineered if there are no specs or measurements, or the measurements are false? You know by testing it.
This is something that many here find important. You don't have to. But I would submit you don't need to get mad because Amir does testing, even if in general the numbers don't matter to you. The membership here does not even agree on the audibility thresholds a lot of the time. If you think a SINAD of 75 dB is good enough, that is fine... but we wouldn't know the SINAD of a unit without it being tested.
Basically, how much noise and distortion is OK? My point is, nobody is telling you what level of SINAD should be meaningful to you, if any. That is up to you. But calling into question the value of tests like this is not, in my opinion, a good position. With careful engineering I want to push manufacturers to eliminate as much of it as possible. Cheers
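To put those SINAD figures in perspective, here is a minimal Python sketch (my own illustration, not from any post above; the function name is made up) that converts a SINAD value in dB to the residual noise-plus-distortion level as a percentage of the signal, using the standard definition SINAD = 20·log10(signal / (noise + distortion)):

```python
def sinad_to_residual_percent(sinad_db: float) -> float:
    """Given SINAD in dB, where SINAD = 20*log10(signal / (noise + distortion)),
    return the combined noise+distortion residual as a percentage of the signal."""
    return 10 ** (-sinad_db / 20) * 100

# The three SINAD values mentioned for the RMC-1, depending on firmware/mode:
for db in (75, 85, 95):
    print(f"SINAD {db} dB -> {sinad_to_residual_percent(db):.4f}% noise+distortion")
```

A SINAD of 75 dB corresponds to roughly 0.018% residual noise and distortion, and every additional 20 dB cuts the residual by another factor of ten, which is why the gap between 75 and 95 dB looks small on paper but is an order of magnitude in linear terms.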
 

BDWoody

Chief Cat Herder
Moderator
Forum Donor
Joined
Jan 9, 2019
Messages
7,101
Likes
23,662
Location
Mid-Atlantic, USA. (Maryland)
... I haven't found a single user of them yet that says they sound awful. That was my entire point from the start.

Right.

We got that early on...

Not 'sounding awful' really isn't the goal for most of us here. When I buy one of these, it will be one that doesn't need to make excuses for itself. You think that's absurd.

Got it.

You make a lot of sweeping generalities about who reads these reviews in your repetitive defenses of this product, most of which are very dismissive of anyone who thinks engineering competence and elegance matters.

I'll just wait until someone gets it right.

I hope I don't get a lecture about how I'm one of the poor misled masses...
 

RayDunzl

Grand Contributor
Central Scrutinizer
Joined
Mar 9, 2016
Messages
13,251
Likes
17,222
Location
Riverview FL
Do folks object to third party tests of

Cars
Bicycles
Toaster Ovens
(pick anything except Audio)

?
 

Kishore

Member
Forum Donor
Joined
Nov 3, 2019
Messages
26
Likes
35
Location
San Diego, CA
Again, we still don't know what problem they have fixed. I identified a number of them. Not a single issue.

Emotiva has closed the thread on this (started by @SOWK in the Emotiva forum), so it seems like they took one step forward and a dozen back, and are going back into their shell after addressing only the one issue Amir found. What a pity! I thought there would be a point-by-point rebuttal (or a disagreement with the test criteria). To date I have not seen any such response from AV companies.. ((sad))
 
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,757
Likes
242,215
Location
Seattle Area
There seem to be continued remarks that I did not give Emotiva enough time to answer.

Let's start with the fact that they knew the review was coming the week before. They reached out to one of their customers who is a member here, asking him to pass on a message that I should contact Lonnie before the review was published. I did that Friday evening. I then sat on a fully completed review.

The member gave me a link to 1.8 firmware. So I tested that over the weekend. Still no response from Lonnie.

I then wanted a full working day to hear back. I still got no response. Nothing. Not even, "thanks for the message but will get back to you later." Nothing.

Thinking it was not a priority for them, I posted the review late Monday:

[Attached screenshot: review publication time]


So posts like this on their forum are simply not correct:
[Attached screenshot: post from the Emotiva forum]


As members know in this forum, unless a manufacturer has sent me something to test, I don't chase them to approve reviews. They are welcome to comment after the data comes out and I am happy to revise results if there are mistakes. My service is to the community to test their gear much like a mechanic comes to check out a car with you before you buy it. He is not expected to call your car manufacturer to get their input before giving you his opinion. Same here.

The fact that I did not get an answer then, or now, shows that my policy is the right one anyway. Contacting the company did no good, and now I am even getting complaints that I did not wait long enough.

Companies should publish their own measurements anyway. If they had done so then any discrepancy with mine would have been investigated.
 

Doodski

Grand Contributor
Forum Donor
Joined
Dec 9, 2019
Messages
21,669
Likes
21,953
Location
Canada
There seem to be continued remarks that I did not give Emotiva enough time to answer.

Let's start with the fact that they knew the review was coming the week before. They reached out to one of their customers who is a member to pass on a message to contact Lonnie before the review is published. I did that Friday evening. I then sat on a fully completed review.

The member gave me a link to 1.8 firmware. So I tested that over the weekend. Still no response from Lonnie.

I then wanted a full working day to hear back. I still got no response. Nothing. Not even, "thanks for the message but will get back to you later." Nothing.

Thinking it was not a priority for them, I posted the review late Monday:

[Attached screenshot: review publication time]

So posts like this on their forum are simply not correct:
[Attached screenshot: post from the Emotiva forum]

As members know in this forum, unless a manufacturer has sent me something to test, I don't chase them to approve reviews. They are welcome to comment after the data comes out and I am happy to revise results if there are mistakes. My service is to the community to test their gear much like a mechanic comes to check out a car with you before you buy it. He is not expected to call your car manufacturer to get their input before giving you his opinion. Same here.

The fact that I did not get an answer then, or now, shows that my policy is the right one anyway. Contacting the company did no good, and now I am even getting complaints that I did not wait long enough.

Companies should publish their own measurements anyway. If they had done so then any discrepancy with mine would have been investigated.
[GIF: "Who's awesome? You are."]
 
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,757
Likes
242,215
Location
Seattle Area
I should say that in my previous roles as a technology developer dealing with reviewers, I would hardly ever get to see a review before it came out. Just the mere mention of what they thought prior to publication would get an angry response. Reviewers also have strict deadlines and couldn't care less if we were too busy to get back to them. I would be on call any minute, night and day, weekend or holiday, with our marketing and PR folks to deal with upcoming reviews. There is no way I would not respond to emails for three days because I had other things to do. It was critical that we did everything we could to get the best possible outcome in a review.
 