
Fixing the Motu M4 Audio Interface

ssashton (Active Member, joined Sep 22, 2022)
Now this is a story all about how,
My M4 got flipped turned upside down.
I'd like to take a minute, just sit right there,
I'll tell you how this Motu has inputs that aren't quite fair.

I picked up a Motu M4 with ESS ADC and DAC. It’s a great unit overall, but since I use it for measuring other gear, I wanted to see if I could improve the ADC side.

Most of my experiments didn’t lead anywhere, so let’s skip straight to the main issue: the DC bias on the line inputs is set incorrectly for the ESS ADC. This makes the inputs clip a few dB before reaching full scale.

The fix is simple if you’re comfortable soldering. The ADC actually requires a 1.95 V DC bias, but the Motu only applies 1.65 V. I understand why: the ESS datasheet says the bias should be half of AVCC (3.3 V / 2 = 1.65 V), which is common for many ADCs—but not for this one. ESS chips need 1.95 V.

What’s funny is that part of the M4 actually gets this right. The front XLR inputs on channels 1 & 2 are biased correctly at 1.95 V, but the rear TRS line inputs on channels 3 & 4 are at 1.65 V. It looks like two engineers designed different parts of the circuit, and only one of them knew the correct value.

Motu clearly noticed the clipping but didn’t know the cause, so instead of fixing the bias they just reduced the line input sensitivity. That way, the DAC outputs (about 4 V rms) can’t quite drive the inputs to full scale in loopback tests. I didn’t like that solution, so I first adjusted the line input gain to match the line output level, giving me a clean loopback without level changes. That only requires swapping four resistors per channel.
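As a rough sanity check on why the stock loopback can't reach full scale, here's a quick sketch. The resistor values (Rin = 3.24 kΩ, Rfb = 1.24 kΩ) are the ones quoted later in this thread, and the 4 Vrms output / 2 Vrms full-scale figures are nominal, not measured here:

```python
import math

# Stock line-input scaling, using values quoted later in the thread.
rin, rfb = 3.24e3, 1.24e3
gain = rfb / rin                       # differential input stage gain
dac_out = 4.0                          # approx. line output level, Vrms
adc_in = dac_out * gain                # level the ADC sees in loopback
dbfs = 20 * math.log10(adc_in / 2.0)   # relative to 2 Vrms full scale
print(f"loopback peaks around {dbfs:.1f} dBFS")
```

That lands a couple of dB short of full scale, which matches the behavior described above.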


Sensitivity Adjust.png


Now we can measure a loopback with the input at -0.5 dBFS and -1 dBFS for comparison.

ADC Bias 1pt65V.png


Now let’s look at the ADC bias. It’s generated by IC10, a dual op-amp. One half of the op-amp takes a divided-down voltage from the 3.3 V AVCC rail (the ADC’s clean analog supply) to produce the bias for the line inputs.
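The divider ratios involved are easy to check. A minimal sketch (the actual resistor designators on the board aren't identified here, so only the ratio is computed):

```python
# Divider ratio Rbottom / (Rtop + Rbottom) needed from the 3.3 V AVCC
# rail for each bias target; actual board resistor values are unknown.
avcc = 3.3
for target in (1.65, 1.95):
    ratio = target / avcc
    print(f"{target:.2f} V bias needs a divider ratio of {ratio:.3f}")
```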

For reference, the AVCC rail itself comes from IC16, which I believe is an ADP151 regulator: a small 3-pin SOT-23 device with about 9 µVrms of output noise. The package markings are unclear, but the pinout and performance match.

Here’s the important part: the other half of this same dual op-amp already generates the correct 1.95 V bias for the XLR inputs (channels 1 & 2). So the fix is simple—disconnect the divider feeding the line-input side, and instead link it to the correct 1.95 V bias from the XLR side.

(oops, that's not pin 8, it's pin 5)
Bias Adjust.png


Now let's run the tests again.
ADC Bias 1pt95V.png


Nice :)

Now for some of the other experiments that didn’t work out:

I tried swapping the ES9840 ADC for the supposedly pin- and register-compatible ES9842PRO. In practice, it didn’t work. Most of the channels produced no signal at all, and channel 1 only gave a distorted, low-level output.

At first I thought I’d found the issue: the ES9842PRO datasheet lists AVCC as 4.5 V instead of 3.3 V. I tried supplying 4.5 V, but the result was the same—no usable signal.

It’s easy to assume I messed up the rework since the chip is tiny, but I checked carefully. The soldering was clean, all traces intact, and I even removed the chip, inspected everything, and tried again with a fresh IC. Same behavior. Swapping back to the original ES9840 restored normal operation immediately.

So, despite ESS calling them compatible, the ES9842PRO is not a drop-in replacement for the ES9840.

20250814_210202.jpg
20250815_131547.jpg


Next, I looked at the analog input op-amps. Motu use the THS2145, a differential op-amp designed for very low power and rail-to-rail operation—not exactly optimized for audio performance.

I swapped it for the OPA1632, which offers lower noise and distortion. At a -6 dB input level, the results were encouraging: I measured a noticeable reduction in distortion.

OPA1632 Compare.png


However, once I pushed the level higher, the OPA1632 swap revealed a problem. At about –3 dBFS the input started clipping. The issue seems to be that Motu run these input op-amps from a single +5 V rail (ground-referenced, no –5 V). With the input bias at only 1.95 V, the negative swing of the signal comes too close to ground, which most op-amps can’t handle gracefully.
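To see why clipping would start around -3 dBFS, here's a rough estimate. The ~1 V output-to-rail margin for a non-rail-to-rail part like the OPA1632 is my assumption for illustration, not a datasheet figure:

```python
import math

# Rough estimate of where a non-rail-to-rail FDA clips on a single 5 V
# rail with the input biased at 1.95 V. The 1 V minimum output-to-rail
# margin is an assumption; full scale is taken as 2 Vrms.
bias, v_margin, fullscale = 1.95, 1.0, 2.0
vp = bias - v_margin                   # usable negative swing, peak
vrms = 2 * vp / math.sqrt(2)           # FD stage doubles the swing
dbfs = 20 * math.log10(vrms / fullscale)
print(f"clipping begins near {dbfs:.1f} dBFS")
```

That comes out close to the observed -3 dBFS, so the single-rail headroom explanation is at least plausible.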

Interestingly, the M4 does have a –5 V rail, and it’s used for the other op-amps in the circuit. I figured it would be straightforward to power the OPA1632 with that rail. Unfortunately, when I tied it in, the –5 V supply sagged to about –4 V and rose very slowly at startup. I guess that's why Motu stuck with a single-rail, low-power IC in the first place. A missed opportunity.

As a side note, I also replaced the AVCC regulator with an LT3045 (because I lost a leg on the original regulator). That did give me a measurable drop in mid- and low-frequency noise from the ADC, though I don’t have before-and-after plots to show.

20250615_224134.jpg
 
Indeed.

My 2 cents:
1. It is actually common for interfaces to handle greater input than output levels, just in case. On the Focusrite side you'll find a 6 dB difference, and the Behringer ADA8200 comes to mind for its particularly absurd line level handling (something like +31 dBu if I am not mistaken, which is in no way backed up by ADC performance). What are the original part values? That should tell us what maximum input level would be with just the bias fix.
2. Bus-powered interfaces tend to be severely power-constrained, so finding that they went all single-supply on the input side is not entirely surprising. That the -5V side could not handle a few mA more does strike me as surprising, usually the dual rail boost converters used will do at least 100 mA each on +/-5V and I can't imagine that -5V would be that heavily loaded to begin with. But maybe they only used a more stout boost converter for the plus side (which then feeds the AVCC regulator) and there's just a fleapower inverter for -5V with a minimum of output capacitance.

As a side note, I also replaced the AVCC regulator with an LT3045 (because I lost a leg on the original regulator). That did give me a measurable drop in mid- and low-frequency noise from the ADC, though I don’t have before-and-after plots to show.
That's good to know. Whenever I see rising noise levels down there I tend to blame it on consumer-level CMOS input opamps. It's good to know that you could also whack a bigger capacitor on the 1.95 V reference R/RC divider. (I guess that would be in parallel to C74 then, or would it be C75?) The parts being as tiny as seen would explain why noise suppression is a bit meh stock - I can't imagine the caps are more than a few hundred nF, 1 µF tops.
 
Ah, well — maybe the bias isn’t the reason the line inputs were originally not matched to the line output level. The original diff-input resistor values are Rin = 3.24 kΩ and Rfb = 1.24 kΩ.

The AVCC regulator I mentioned powers the AVCC supply of the ADC, which is the analog audio section of the ADC. That’s separate from the bias op-amp, though the divider that sets its input voltage is derived from AVCC. So increasing the capacitance on that divider would only reduce bias noise; it wouldn’t improve the AVCC supply that feeds the ADC’s analog section. Those caps are 0402, so probably 100nF.
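For scale, here's the corner frequency such a filter would give. The divider's Thevenin resistance isn't known, so the 5 kΩ below is purely a placeholder:

```python
import math

# Corner frequency of the bias divider's RC filter. The 5 kOhm Thevenin
# resistance is a placeholder; 100 nF follows the 0402 guess above.
r_thev, c = 5e3, 100e-9
fc = 1 / (2 * math.pi * r_thev * c)
print(f"bias filter corner ~ {fc:.0f} Hz")
```

If those assumptions are in the right ballpark, the corner sits well inside the audio band, and a 10 µF cap in the same spot would push it down to about 3 Hz, which is the point of the suggested upgrade.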

A final note — I noticed the balanced TRS outputs on the DAC side are not summed, which explains why Amir’s testing showed some “ESS IMD hump.” Only the RCA outputs use summing. This isn’t ideal practice: differential DAC outputs should always be summed, even when driving a differential output, since that summing is essential for reducing common-mode noise and distortion.
 
Ah, well — maybe the bias isn’t the reason the line inputs were originally not matched to the line output level. The original diff-input resistor values are Rin = 3.24 kΩ and Rfb = 1.24 kΩ.
So let's do the math...

For starters, it looks like the FD opamp in question is actually THS4521, not 2145?

That can typically swing down to about 0.08 V, so from the 1.65 V bias that leaves 1.65 - 0.08 = 1.57 Vp, or 3.14 Vpp. Being a FD part, this is effectively doubled to 6.28 Vpp, i.e. 2.22 Vrms. So given that the input gain is 1.24/3.24, it can handle 5.82 Vrms or +17.5 dBu stock. (Even under ideal circumstances it would only ever be +17.9 dBu, so just barely reaching the +18 dBu spec.)
Modifying just the bias, this rises to 6.93 Vrms or +19.0 dBu. An extra 1.5 dB basically for free is not bad. (Has crosstalk between things that used to be on different bias networks worsened in any way?)
With a modified bias network as well, we are looking at 5.50 Vrms or +17.0 dBu, not even that big of a difference really.
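The arithmetic above can be bundled into a small helper; this is just a restatement of the same numbers (0.08 V swing floor, stock resistor values, dBu referenced to 0.7746 Vrms):

```python
import math

# Maximum input level of the line-input stage, as derived above:
# swing is limited by the bias point and the 0.08 V output floor,
# doubled by the fully differential stage, referred back through
# the Rfb/Rin attenuation.
def max_input_dbu(bias, rin=3.24e3, rfb=1.24e3, vmin=0.08, vcc=5.0):
    vp = min(bias - vmin, vcc - bias)   # single-ended peak swing
    vrms = 2 * vp / math.sqrt(2)        # FD stage doubles the swing
    v_in = vrms * rin / rfb             # refer back to the input
    return 20 * math.log10(v_in / 0.7746)

print(f"stock (1.65 V bias): {max_input_dbu(1.65):+.1f} dBu")
print(f"bias fix (1.95 V):   {max_input_dbu(1.95):+.1f} dBu")
```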

Since distortion apparently hasn't suffered, I assume you used thin film resistors.
 
Yes, good catch on the op-amp part number mix-up!

The resistors are just typical thick-film SMD parts with 1% tolerance. That’s probably what was used originally, given this is consumer-grade gear — I doubt you’d find 0.1% thin-film parts here.

Regarding op-amp distortion near the rails: even though most datasheets claim “swing to within x mV of the rail,” distortion usually starts to creep in well before that point. Still, I don’t think the gain level was a real performance issue — I just didn’t like it not matching the output. FYI the ADC wants 2V rms for full-scale.

I didn’t check crosstalk. The bias is still separately buffered by the two halves of op-amp IC10.

It does keep crossing my mind to make a little PCB with a dual-rail generator, OPA1632 (or similar) for the inputs, and a set of super-bal configured OPA1612s on the DAC outputs to sum the differential signal properly… but I really must stop tinkering and creating more work for myself, haha!
 
The resistors are just typical thick-film SMD parts with 1% tolerance. That’s probably what was used originally, given this is consumer-grade gear — I doubt you’d find 0.1% thin-film parts here.
It doesn't have to be 0.1%, but thin-film is the equivalent to traditional metal film types, and those are what I'd stick with in the audio path for excess noise and tempco (distortion) concerns. Obviously H3 may be dominated by the input opamp, which would make distortion concerns a bit moot.

Watching input noise while applying a variable common-mode DC voltage to the input (i.e. (hot = cold) vs. shield) may prove interesting. Associated excess noise in the resistors would be independent and as such add up. Now your mod should make it less critical as values are higher than they were, but still.

FYI the ADC wants 2V rms for full-scale.
It has to be a bit more than that, or else maximum input would only be +16.5 dBu stock / +14.5 dBu modded.
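Checking that claim against the stock resistor values, assuming full scale really were exactly 2.0 Vrms:

```python
import math

# If ADC full scale were exactly 2.0 Vrms, the stock input network
# (Rin = 3.24 kOhm, Rfb = 1.24 kOhm) would cap the maximum input at:
fs, rin, rfb = 2.0, 3.24e3, 1.24e3
v_in = fs * rin / rfb                  # full scale referred to the input
dbu = 20 * math.log10(v_in / 0.7746)
print(f"max input ~ {dbu:+.1f} dBu")
```

That lands around +16.5 dBu, short of the +18 dBu spec, so full scale must indeed be somewhat above 2 Vrms.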
 