To perform the calculation, don't start from watts; start from the voltage you want the amplifier to hit. This is because amplifiers do not output wattage directly, they output voltage into an impedance, which results in wattage (amplifier wattage = voltage squared divided by resistance). For example, getting 50 watts into 8 ohms requires your amplifier to output 20 V; at 4 ohms it is 14.14 V, at 2 ohms it is 10 V, and at 16 ohms you need 28.28 V. Put another way, maintaining the same 28.28 V that gets you 50 watts at 16 ohms into a 4 ohm load requires 200 watts.
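To make the arithmetic concrete, here is a small sketch of the P = V²/R relationship (the function names are just for illustration):

```python
import math

def required_voltage(watts, ohms):
    """Voltage the amplifier must swing to deliver `watts` into `ohms` (V = sqrt(P * R))."""
    return math.sqrt(watts * ohms)

def power_at(volts, ohms):
    """Wattage delivered when `volts` is applied across `ohms` (P = V^2 / R)."""
    return volts ** 2 / ohms

# Voltage needed for 50 W at common speaker impedances
for ohms in (2, 4, 8, 16):
    print(f"{ohms:>2} ohms: {required_voltage(50, ohms):.2f} V")
# Same 28.28 V swing into a 4 ohm load
print(f"{power_at(28.28, 4):.0f} W")
```

Running this prints 10.00, 14.14, 20.00, and 28.28 V for the four loads, and about 200 W for 28.28 V into 4 ohms.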

So let's say you need to hit 28 V output on your amplifier, which has 10.5 dB of gain. The question then is what input voltage you need. The answer is about 8.4 volts, since 10.5 dB of voltage gain is a factor of 10^(10.5/20) ≈ 3.35, and 28 / 3.35 ≈ 8.4. This of course assumes that the signal being fed to your preamplifier is at 0 dBFS, with your preamplifier at maximum gain.
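The gain step above can be sketched the same way, using the standard voltage-dB relation (gain_db = 20 * log10(Vout / Vin)); the function name is illustrative:

```python
def input_voltage(target_out_v, gain_db):
    """Input voltage needed to reach `target_out_v` through `gain_db` of voltage gain."""
    return target_out_v / 10 ** (gain_db / 20)

# 28 V target through a 10.5 dB amplifier
print(f"{input_voltage(28, 10.5):.1f} V")
```

This prints 8.4 V, matching the figure above.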