I'm using a 3rd Gen Firestick into a Denon AVR-X4500H streaming Amazon HD with Ultra HD turned on. This really does sound good, significantly better than Spotify HD; however, it will only do 16 bit / 192 kHz, not 24 bit. Does anyone have any idea why?
As an aside, I really didn't expect the Ultra HD to make much improvement, but both my wife and I immediately noticed the difference between the Amazon 192/16 and the Spotify HD versions of the same album. We both feel the sound is "crisper" and can tell the instruments apart more clearly. Both streaming services are great, but the Amazon Ultra HD really sounds more like we are listening to live music. Having said that, individual albums are different. For some there doesn't seem to be much difference, for others it's quite pronounced.
Do you have the Amazon Fire TV Stick plugged directly into the Denon, or are you routing the audio to the Denon through another path?
I also have the Amazon Fire TV Stick - got it about a week ago - and have a similar experience with Amazon Music HD. The most I get is 16 bit / 192 kHz.
I'm routing the Amazon Music HD audio through my Samsung TV and using a digital audio optical cable to get the audio to my integrated amp. (The amp does not have an HDMI interface.) I believe the Samsung limits bit depth to 16 bits in my case. The other possibility is that the 10 ft TOSLINK cable is causing too much optical attenuation. My integrated amp has an internal 24 bit / 192 kHz DAC.
One weird thing was that Amazon Music HD initially reported my device capability as 16 bit / 48 kHz, and then, after several more hours of play, it updated the device capability to 16 bit / 192 kHz. Right now, if the recording is an Ultra HD source at 24 bit / 192 kHz, Amazon Music HD reports it is sending 16 bit / 192 kHz. If the recording is 24 bit / 48 kHz, it sends it at 16 bit / 48 kHz, even though the 24 bit / 48 kHz data rate would be less than 16 bit / 192 kHz.
I speculate that the Amazon Music HD method of calculating the data rate capability of the path it is streaming to does a lot of averaging (hours or even days) to avoid being fooled by short-term transients, and that it is also conservative, since it would rather deliver a continuous stream to the device at a lower data rate than suffer dropouts.
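For what it's worth, the data-rate comparison above checks out: raw stereo PCM bandwidth is just bit depth × sample rate × channels, and 24 bit / 48 kHz is indeed well under 16 bit / 192 kHz. A quick sketch (the function name is mine, just for illustration):

```python
def pcm_bitrate_kbps(bit_depth, sample_rate_hz, channels=2):
    """Raw (uncompressed) PCM data rate in kilobits per second."""
    return bit_depth * sample_rate_hz * channels / 1000

print(pcm_bitrate_kbps(24, 48_000))   # 24 bit / 48 kHz stereo  -> 2304.0 kbps
print(pcm_bitrate_kbps(16, 192_000))  # 16 bit / 192 kHz stereo -> 6144.0 kbps
print(pcm_bitrate_kbps(24, 192_000))  # 24 bit / 192 kHz stereo -> 9216.0 kbps
```

So if Amazon were only limiting by raw bandwidth, it could pass 24 bit / 48 kHz through a path it already trusts with 16 bit / 192 kHz; that it doesn't suggests the 16-bit cap comes from the reported device capability rather than the data rate.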
UPDATE: I was able to get full 24 bit depth...see my next post below.