
One amp hotter than the other, cause of concern?

olds1959special

Major Contributor
Joined
Apr 5, 2024
Messages
1,186
Likes
486
Location
Los Angeles, CA
My Kenwood L-05M's that power the mids/highs of my speakers generate different heat levels. One amp is 3-5 degrees F hotter than the other. My tech says the heat level is within normal range and there is no reason to be concerned about overheating. But why would one amp be hotter? Is it the bias setting? The tech also said the bias settings were within range. My question is, is bias the reason for one amp being hotter, and does this affect the sound? Should I have both amps set to the same setting for consistency? Or is this a total non-issue?
 
Come back when one amp is 30 to 50 degrees hotter than the other.
Thanks for clarifying that this is a non-issue. One amp is 5 degrees F hotter but everyone tells me not to worry about it.
 
Should I have both amps set to the same setting for consistency?
You can check the DC offset on both amplifiers and see if there is a difference, then adjust (if you know how) so that the two offsets match as closely as possible.
Another thing that can be adjusted is the quiescent current in the output stage, which directly affects how warm the output-transistor heatsink runs.
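To see why quiescent current maps so directly onto heatsink temperature, here is a rough back-of-the-envelope sketch. The rail voltage and bias currents below are illustrative assumptions, not Kenwood L-05M service-manual figures; the point is only that idle dissipation scales linearly with bias current:

```python
# Rough idle-dissipation estimate for a class-AB output stage.
# All numbers are illustrative assumptions, not L-05M service values.

def idle_dissipation_w(rail_voltage_v: float, quiescent_current_a: float) -> float:
    """Approximate power burned in the output stage at idle:
    both supply rails feed the quiescent current, so P ~ 2 * Vrail * Iq."""
    return 2.0 * rail_voltage_v * quiescent_current_a

# Two amps, both "in spec" but biased slightly differently (assumed numbers):
p_amp_a = idle_dissipation_w(rail_voltage_v=55.0, quiescent_current_a=0.030)  # 3.3 W
p_amp_b = idle_dissipation_w(rail_voltage_v=55.0, quiescent_current_a=0.035)  # 3.85 W

print(f"Amp A idles at roughly {p_amp_a} W, Amp B at roughly {p_amp_b} W")
```

A ~15% difference in bias current gives a ~15% difference in idle heat, which can easily account for a few degrees at the heatsink even when both amps are within the service-manual range.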
 
My question is, is bias the reason for one amp being hotter, and does this affect the sound? Should I have both amps set to the same setting for consistency? Or is this a total non-issue?
Could be bias (quiescent/idle current) of the output stage, could be air flow around the heat sinks being different.
Could even be one of the amps transformers running a little warmer.

You would need to bring both to your tech.
Have them warmed up side by side at idle for an hour or so, until the thermals have settled.
It makes a difference if the bias current is set several minutes after being switched on or after an hour or so.

Then he would have to adjust the bias on both to exactly the same value and check again after another hour.
After that they should be at nearly the same temperature.

Only do that when your OCD is triggered; otherwise leave it alone. Amplifiers usually have a small range in which the idle current is 'optimal'.
So both amps could be within that optimal range and still differ a little in current (and thus temperature).

If both amps have always run at about the same temperature and suddenly there is a substantial difference between them, that is cause for concern.
 
As this is a 'science based forum' you really should be quoting temperatures in Centigrade - then you'd see that the difference is too small to worry about :)
 
Bias should make the biggest difference at idle. ...Maybe not the absolute degree difference, but the percentage difference...

As this is a 'science based forum' you really should be quoting temperatures in Centigrade - then you'd see that the difference is too small to worry about.
Let's not be picky. The physical temperature difference is of course the same regardless of the units of measure; only the number changes (a 5 °F gap is about a 2.8 °C gap).

There's nothing unscientific about Fahrenheit although scientists and science professors might be snobby about it. And he DID specify the scale he was using. :P
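For reference, a small sketch of the conversions. Note that an absolute temperature needs the 32-degree offset, but a temperature *difference* only needs the 5/9 scale factor, which is why the 5 °F gap between the two amps works out to about 2.8 °C:

```python
def f_to_c(temp_f: float) -> float:
    """Convert an absolute Fahrenheit temperature to Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

def f_delta_to_c(delta_f: float) -> float:
    """Convert a temperature *difference*: the 32-degree offset
    cancels out, so only the 5/9 scale factor applies."""
    return delta_f * 5.0 / 9.0

print(f_to_c(212.0))      # 100.0 -- water boils
print(f_delta_to_c(5.0))  # about 2.78 -- the gap between the two amps
```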
 
There's nothing unscientific about Fahrenheit although scientists and science professors might be snobby about it.
Scientists and science professors - or Europeans and most of the world except the USA whose current leadership seems opposed to science :-)
 
Not that C should necessarily be used in lieu of F... but on that point F is a little arbitrary, as in how it was arrived at;
The first modern thermometer, the mercury thermometer with a standardized scale, was invented by the Dutch instrument maker Daniel Fahrenheit in 1714. Temperature scales require points of reference. Fahrenheit built his scale on the work of Ole Rømer, whom he had met and with whom he had discussed concepts of scales. On Rømer's scale, salt brine freezes at zero, water freezes and melts at 7.5 degrees, body temperature is 22.5, and water boils at 60 degrees. Fahrenheit multiplied each value by four in order to eliminate fractions. Hence water freezes and melts at 32 degrees, body temperature is 90, and water boils at 240 degrees. Only 32 °F (0 °C) is accurate in a modern context.

Fahrenheit later chose the body temperature of a healthy person (Fahrenheit's healthy person happened to be his wife), which he measured in the armpit, as 96°. After Fahrenheit died, his successors used the boiling point of water to calibrate their thermometers, setting it at 212° so that the size of Fahrenheit's degree was retained. The zero point was determined by placing the thermometer in brine: he used a mixture of ice, water, and ammonium chloride (a salt) in a 1:1:1 ratio, which settles at 0 °F (−17.78 °C). The second point, 32 degrees, was a mixture of ice and water in a 1:1 ratio. The third point, 96 degrees, was approximately human body temperature, then called "blood-heat".

One thing it generally avoids is negative temps.

The Celsius/Centigrade and Kelvin scales seem a bit less convoluted: C is based on the freezing and boiling points of water, and K on absolute zero, the point of no molecular motion.


JSmith
 
I've long been enamored of this handy conversion tool.

[attached image: temperature conversion chart]
 