If the volume control on the amp's input is a standard potentiometer (variable resistor) then it's a resistive voltage divider consisting of an Rseries and an Rshunt. The attenuated signal output is taken from the pot's wiper to ground ("Output" in the schematic below).
When you turn down the volume, you're increasing the value of the R in series with the signal and reducing the value of the R in parallel with (shunting) the signal.
Turning down the volume control reduces the level (amplitude) of signal voltage seen at the wiper of the pot.
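To make the divider action concrete, here's a minimal sketch of the standard voltage-divider formula applied to a pot's wiper position. The 10k pot value and the 9k/1k split are illustrative assumptions, not anything from your setup:

```python
# Sketch: a pot's wiper as the midpoint of a resistive voltage divider.
# Rseries is above the wiper, Rshunt is below it (wiper to ground).
def wiper_voltage(v_in, r_series, r_shunt):
    """Vout = Vin * Rshunt / (Rseries + Rshunt) -- ideal, unloaded wiper."""
    return v_in * r_shunt / (r_series + r_shunt)

# A 5 V source into an assumed 10k pot, turned down so 9k is in
# series and 1k shunts the signal:
print(wiper_voltage(5.0, 9_000, 1_000))  # 0.5 V at the wiper
```

Note this ignores loading by the amp's input impedance, which in practice pulls the wiper voltage a little lower.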
Long story short, I can't see any reason why turning down the volume control would not work to reduce your 5V source voltage to a suitable voltage coming from the volume control (pot's) wiper. That's what the volume control is there for.
One problem could be that the taper of the control won't allow you to make fine adjustments in volume, since you'll always be using it down near the start of its rotation (between 7 o'clock and 10 o'clock on the dial). The control might get 'too loud' too quickly. The solution would be to be careful not to turn it up too much too quickly.
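The crowding is easy to see with a little arithmetic. Assuming an ideal linear-taper pot with an unloaded wiper, the division ratio equals the rotation fraction, so the attenuation in dB is 20*log10(x) -- and deep cuts all pile up in the first few degrees of rotation:

```python
import math

# Sketch, assuming an ideal linear-taper pot, unloaded wiper:
# division ratio = rotation fraction x, so attenuation is 20*log10(x).
def attenuation_db(rotation_fraction):
    return 20 * math.log10(rotation_fraction)

for x in (0.5, 0.25, 0.1, 0.05):
    print(f"{x:>5.0%} rotation -> {attenuation_db(x):6.1f} dB")
```

Cutting 5 V down to 0.5 V is a 20 dB cut, which on this ideal linear pot lands at only 10% of rotation. An audio (log) taper spreads that range out more usefully, which is why it's the usual choice for volume controls.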
The problem I have is that I have several analog sources playing into my amplifier. I have the S/PDIF output from my TV going to a DAC with balanced outputs, then into a 1:1 transformer to prevent ground loop hum. (Any electrically grounded signal from my TV causes a bad ground loop in my listening room.) I also have the unbalanced output from my Raspberry Pi's DAC and the output from my phono preamp, all going to an input selector switch / 'passive preamp' (with autoformer volume control).

The levels from these three sources vary wildly. The TV's balanced DAC output is double the amplitude of the unbalanced DAC, and the phono preamp's output is a little higher in amplitude than the unbalanced DAC's. I had to put a 'trim pot' on the output of the TV's balanced DAC to bring it down to the level of the unbalanced DAC and the phono preamp. Hopefully you're using your DAC as the input selector switch, so you won't have that particular level-matching problem.
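Since the balanced output is double the amplitude, the trim needs to knock off about 6 dB. A quick sketch of the pad math (the resistor values here are illustrative assumptions, not the actual trim pot setting):

```python
import math

# Sketch: attenuation of a fixed resistive pad (series R, shunt R to
# ground), as a check on the "double the amplitude = 6 dB" trim.
def pad_db(r_series, r_shunt):
    ratio = r_shunt / (r_series + r_shunt)
    return 20 * math.log10(ratio)

# Equal series and shunt resistors halve the voltage:
print(round(pad_db(10_000, 10_000), 1))  # -6.0 dB
```

In practice a trim pot just lets you dial that ratio in by ear or by meter instead of computing it.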
Most professional amplifiers will have a 'sensitivity' rating, stating how many volts (or millivolts) of input signal it takes to drive the amp to clipping (max power) into a specified load (4 ohms, 8 ohms, etc.).
For example, the manual for the Hafler P1000:
[Attachment: Hafler P1000 manual specifications page]
The 'input sensitivity range' spec probably means that sensitivity varies with the setting of the input level controls (one per channel). So from the balanced input, 277mV would drive the amp to full power into a 4 ohm load. Would that mean twice that (554mV) to full power into 4 ohms from the unbalanced input? (I think so.)
Do these small class D amps not have that specification listed?