I've been checking out some of the 3.5mm adapters that provide resistance since I've gotten into low-ohm IEMs lately. I couldn't find specific information on what they change the ohm rating to, though. Some amps like the SMSL SP200 supposedly have a 12 ohm output impedance, so I'm trying to determine whether these adapters would prevent the amp from changing the sound of something like 10 ohm IEMs. I can see how they could help bring the volume up on an amp for better channel matching, but have any measurements of the output impedance after the adapter been done?
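For a rough first-order answer (all figures below are illustrative, not measurements): these adapters are just a series resistor, so they don't lower the output impedance the IEM sees, they add to it. A quick sketch:

```python
# Assumed, illustrative values: a series-resistor adapter adds its
# resistance to the amp's own output impedance; it never lowers it.
amp_output_r = 12.0   # ohms, amp's output impedance (illustrative)
adapter_r = 75.0      # ohms, a common adapter value (illustrative)
iem_z = 10.0          # ohms, nominal IEM impedance

# Total source impedance 'seen' by the IEM is simply the sum.
source_r_seen_by_iem = amp_output_r + adapter_r

print(f"Source impedance seen by IEM: {source_r_seen_by_iem:.0f} ohm")
print(f"Impedance ratio (IEM/source): {iem_z / source_r_seen_by_iem:.2f}")
```

So with these assumed numbers the 10 ohm IEM would see 87 ohms, far from the "8x rule of thumb" ratio, which is why the replies below focus on what that does to the frequency response.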
Some IEMs actually sound better with some specific series resistance. This is a technical thing: the ratio between the IEM's impedance and the source's output resistance changes the frequency response, in those cases toward a 'nicer sounding' one.
This certainly isn't the case for most IEMs. They are designed to be driven from low output R sources.
These series resistors will also attenuate the signal.
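Both effects, the attenuation and the frequency response change, fall out of the same voltage divider. A sketch with assumed numbers (a multi-driver IEM whose impedance swings from 10 ohms at 1 kHz to 40 ohms at a crossover peak, behind a 75 ohm adapter):

```python
import math

# Illustrative values, not measurements of any specific IEM/adapter.
series_r = 75.0     # ohms, adapter resistance (assumed)
z_at_1khz = 10.0    # ohms, IEM impedance at 1 kHz (assumed)
z_at_3khz = 40.0    # ohms, IEM impedance at a crossover peak (assumed)

def divider_db(z_load, r_series):
    """Level at the driver relative to no series resistor, in dB."""
    return 20 * math.log10(z_load / (z_load + r_series))

drop_1k = divider_db(z_at_1khz, series_r)   # about -18.6 dB
drop_3k = divider_db(z_at_3khz, series_r)   # about -9.2 dB

print(f"1 kHz: {drop_1k:.1f} dB, 3 kHz: {drop_3k:.1f} dB")
print(f"Response tilt introduced: {drop_3k - drop_1k:.1f} dB")
```

Where the impedance is higher the signal is attenuated less, so with these numbers the adapter would lift that region by roughly 9 dB relative to 1 kHz. Only an IEM with a flat impedance curve comes through with its tonal balance intact.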
That could be beneficial when an amp is relatively noisy and the IEM is very sensitive. This way you can lower the audible noise level (from the amp), and you will have to set the volume knob a bit higher in that case.
That could also be handy when you use the volume control in its lowest range, where you could experience channel imbalance (this varies with the volume control setting).
These simple series resistors can be used for that, BUT the high resistance value can alter the frequency response of said sensitive drivers, and can do so in a negative way.
When the goal is to counter self-noise of an amp/source and/or get a more usable volume control range, it would be better to use an attenuator (a resistor network with 2 resistors).
This can A) lower the signal far more significantly, and B) present a 'good load' to the amp while keeping the source resistance 'seen' by the driver low, ensuring the frequency response (tonal balance) is not changed.
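To make the two-resistor idea concrete, here is a sketch of an L-pad: a series resistor R1 from the amp, then a shunt resistor R2 across the IEM. The values are illustrative assumptions, not a recommendation for any specific IEM:

```python
import math

# Illustrative L-pad values (assumed, not a tested recipe).
r1 = 33.0     # ohms, series resistor from the amp
r2 = 3.3      # ohms, shunt resistor across the IEM
iem_z = 10.0  # ohms, nominal IEM impedance

# Source resistance the driver 'sees': R2 in parallel with R1
# (assuming the amp's own output impedance is near zero).
source_r = (r1 * r2) / (r1 + r2)

# Load the amp 'sees': R1 in series with (R2 parallel IEM).
z_par = (r2 * iem_z) / (r2 + iem_z)
load_r = r1 + z_par

# Attenuation at the driver: voltage divider of (R2 || IEM) against R1.
atten_db = 20 * math.log10(z_par / (z_par + r1))

print(f"Source R seen by driver: {source_r:.2f} ohm")
print(f"Load seen by amp: {load_r:.1f} ohm")
print(f"Attenuation: {atten_db:.1f} dB")
```

With these assumed values the driver sees about 3 ohms (so impedance swings barely affect the response), the amp still sees a comfortable ~35 ohm load, and the signal drops by roughly 23 dB, which is exactly the A) and B) described above.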