This is not a matter of hi-fi or not. Impedance matching is only useful for very long cable runs. You incur level loss and distortion if you load down your device's output with a 600 Ω termination.
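To put a number on that loss (a quick sketch of my own, not from the article quoted below): the level at the load is just a voltage divider, V_load/V_source = Z_in/(Z_out + Z_in). The impedance values here are the standard ones discussed below, not measurements:

```python
import math

def level_loss_db(z_out: float, z_in: float) -> float:
    """Voltage-divider loss in dB when a source with output impedance
    z_out drives a load with input impedance z_in (both in ohms)."""
    return 20 * math.log10(z_in / (z_out + z_in))

# Matched 600 ohm source and load: half the voltage reaches the load.
print(round(level_loss_db(600, 600), 2))        # -6.02 dB

# IEC voltage matching (<= 50 ohm out, >= 10 kohm in): negligible loss.
print(round(level_loss_db(50, 10_000), 2))      # -0.04 dB

# One low-impedance output bridging four 10 kohm inputs in parallel.
print(round(level_loss_db(50, 10_000 / 4), 2))  # -0.17 dB
```

Matched 600 Ω costs you 6 dB outright; voltage matching costs essentially nothing, even when one output feeds several inputs.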
To amplify:
"In professional audio, however, the goal of the signal
transmission system is to deliver maximum voltage, not
maximum power. To do this, devices need low differential
(signal) output impedances and high differential (signal)
input impedances. This practice is the subject of a
1987 IEC standard requiring output impedances to be
50 Ω or less and input impedances to be 10 kΩ or more
[5]. Sometimes called voltage matching, it minimizes
the effects of cable capacitances and also allows an output
to drive multiple inputs simultaneously with minimal
level losses . With rare exceptions, such as telephone
equipment interfaces, the use of matched 600 Ω sources
and loads in professional audio is generally unnecessary
and compromises performance."
– Bill Whitlock, "Balanced Lines in Audio Systems: Fact, Fiction, and Transformers," J. Audio Eng. Soc., Vol. 43, No. 6, June 1995
Reference [5] from above:
[5] IEC Pub. 268, "Sound System Equipment," pt. 15, "Preferred Matching Values," International Electrotechnical Commission, Geneva, Switzerland (1987), clause 13.2.
Since our systems are voltage driven, not power driven, impedance matching is neither necessary nor advisable.
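Whitlock's cable-capacitance point is easy to quantify as well: the source output impedance and the cable's shunt capacitance form an RC low-pass with corner frequency f = 1/(2πRC). A rough sketch, using assumed (hypothetical) cable figures rather than anything from the article:

```python
import math

def cable_corner_hz(z_out: float, c_total: float) -> float:
    """-3 dB corner of the RC low-pass formed by the source output
    impedance (ohms) and the total cable shunt capacitance (farads)."""
    return 1 / (2 * math.pi * z_out * c_total)

# Hypothetical long run: 100 m of ~150 pF/m cable -> 15 nF total.
c_total = 150e-12 * 100

print(f"{cable_corner_hz(600, c_total):,.0f} Hz")  # ~17,684 Hz: rolls off in-band
print(f"{cable_corner_hz(50, c_total):,.0f} Hz")   # ~212,207 Hz: well above audio
```

With a 600 Ω source the treble starts rolling off inside the audio band on a long run; with a 50 Ω source the corner sits more than a decade above it.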
Just for fun, the measurements below are for a forty-year-old Audio Development Co BUG6-9 1:1 isolation transformer (similar to a Triad SP-67), driven at 2 V RMS (cursor not shown @ 1 kHz) from the analog in/out of an old MacBook Pro:
[Plots: frequency response and distortion measurements]