
This is absurd! The Roon Windows app draws about 80 to 90 W on my GPU.
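If you want to sanity-check the number on your own machine, the driver can report board power directly. A minimal way to watch it, assuming an NVIDIA card with a reasonably recent driver (works the same on Windows and Linux):

    nvidia-smi --query-gpu=power.draw --format=csv -l 1    # print board power draw once per second

Open and close Roon while this runs and watch the delta.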

Have you noticed high power usage (>50 W) from the Roon Windows app on your own PC?

  • Yes - Power Usage > 80W

    Votes: 0 0.0%
  • Yes - Power Usage > 50W

    Votes: 0 0.0%
  • Yes - Power Usage > 30W

    Votes: 0 0.0%

  • Total voters
    7

kacos

Active Member
Joined
Sep 19, 2023
Messages
102
Likes
56
That is really an excellent tip. This should solve the issue.

I tried it, but it does not seem to reduce the GPU power draw. My Nvidia control center looks a bit different and has different options. I will tinker with this a bit more.

[attachment: screenshot of Nvidia control panel settings]
FYI,
here's my total power consumption (from the UPS feeding them), including a desktop PC with Roon remote app running, an 8-disk NAS with Roon & Plex server running, 1 standby NAS, 2 Elac ARB51 powered speakers playing music, a streaming DAC (Roon ready), 2 managed network switches, 2 routers, 1 access point, 1 desk light, a 4K 27" monitor, and other small stuff:

[attachment: UPS power consumption readout]


So my guess is that you've got a VERY power-hungry GPU.
 

TulseLuper

Active Member
Forum Donor
Joined
Oct 1, 2019
Messages
281
Likes
467
Location
Illinois
FWIW I have an RTX 2080 and do not see this power issue with default settings. There's a quick spike when opening Roon, but within a few seconds the output wattage on the UPS goes back to near where it was before Roon opened.
 
OP
ironhorse128

Active Member
Forum Donor
Joined
Jan 9, 2020
Messages
169
Likes
176
Thanks for the feedback. It is strange that my system behaves this way.
 

Dunring

Major Contributor
Forum Donor
Joined
Feb 7, 2021
Messages
1,268
Likes
1,373
Location
Florida
The three things I'd change would be:

  • Open the Windows Game Bar and set Roon to the power-saving profile, rather than letting Windows decide.
  • Set the Nvidia driver's power management mode to "Optimal power" for that app's profile, which is a more efficient version of adaptive.
  • In the Windows power plan settings, the media playback options might help.

I had a water-cooled 3080 and it was amazing, but I turned off options like "use hardware acceleration when available" in 2D apps to keep it from kicking into graphics mode over a web browser. It's so fast you don't need it for non-games anyway.
 

thulle

Active Member
Joined
Aug 31, 2021
Messages
100
Likes
134
As people have figured out, this is due to high-core-count / high-memory GPUs drawing a good amount of power as soon as the compute cores see any real load.
Stats from my former 3080, with tons of stuff running plus one video stream decoded by the CPU and just rendered by the GPU (one row per second):
    # gpu   pwr gtemp mtemp  sm mem enc dec mclk pclk pviol tviol   fb bar1 sbecc dbecc  pci rxpci txpci
    # Idx     W     C     C   %   %   %   %  MHz  MHz     %  bool   MB   MB  errs  errs errs  MB/s  MB/s
        0    46    45     -  28  19   0   0  810  510     0     0 1178    8     -     -    0   318   154
        0    46    45     -  27  17   0   0  810  510     0     0 1178    8     -     -    0   170   144
        0    46    45     -  28  19   0   0  810  525     0     0 1178    8     -     -    0   170   152
        0    46    45     -  23  17   0   0  810  525     0     0 1178    8     -     -    0   169   142
        0    46    45     -  25  17   0   0  810  510     0     0 1178    8     -     -    0   169   144
        0    46    45     -  24  17   0   0  810  510     0     0 1178    8     -     -    0   178   155

46 W at an 810 MHz memory clock and a 510-525 MHz GPU clock. As soon as I put a compute load on it (in this case a custom video decoder for the video stream mentioned above):

    # gpu   pwr gtemp mtemp  sm mem enc dec mclk pclk pviol tviol   fb bar1 sbecc dbecc  pci rxpci txpci
    # Idx     W     C     C   %   %   %   %  MHz  MHz     %  bool   MB   MB  errs  errs errs  MB/s  MB/s
        0   111    48     -  11   3   0   5 9251 1785     0     0 1612   11     -     -    0   131   255
        0   111    48     -  10   2   0   5 9251 1785     0     0 1612   11     -     -    0   132   242
        0   110    48     -  11   3   0   5 9251 1785     0     0 1612   11     -     -    0    19   127
        0   111    48     -  11   3   0   5 9251 1785     0     0 1612   11     -     -    0     7   130
        0   109    47     -   6   1   0   5 9251 1785     0     0 1610   11     -     -    0     0   140
        0   108    47     -   5   1   0   5 9251 1785     0     0 1610   11     -     -    0     3   127
        0   108    47     -   5   1   0   5 9251 1785     0     0 1610   11     -     -    0    10   127

The memory clocks up to 9251 MHz and the GPU to 1785 MHz, resulting in ~110 W power consumption, even though the load is just 5-11%.
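(For anyone who wants to reproduce these tables: they come from the driver's built-in device monitor. Something close to the line below should do it; the exact column flags are my assumption, so adjust to taste.)

    # one row per second: power/temps, utilization, clocks, throttling, memory, ECC, PCIe throughput
    nvidia-smi dmon -s pucvmet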

Since I already had scripts running that change CPU behaviour depending on what I'm doing, I just figured out what frequencies were adequate to keep my desktop and video acceleration running and limited my GPU to those. If I start some machine-learning stuff or a game, the limits are removed. Numbers from my current 4090 (yeah, cost half a kidney):

    # gpu   pwr gtemp mtemp  sm mem enc dec jpg ofa mclk pclk pviol tviol   fb bar1 rxpci txpci
    # Idx     W     C     C   %   %   %   %   %   %  MHz  MHz     %  bool   MB   MB  MB/s  MB/s
        0    59    51     -   6   1   0   2   0   0 5001  810     0     0 1906   15   344    21
        0    59    51     -   6   1   0   2   0   0 5001  810     0     0 1906   15   191    34
        0    59    51     -   6   1   0   2   0   0 5001  810     0     0 1905   15   177    21
        0    59    51     -   6   1   0   2   0   0 5001  810     0     0 1905   15   340    21
        0    59    51     -   6   1   0   2   0   0 5001  810     0     0 1910   15   175    15
        0    59    51     -   6   1   0   2   0   0 5001  810     0     0 1910   15   172    35

Memory capped at 5001 MHz and the GPU at 810 MHz, resulting in 59 W power consumption. There aren't that many steps for the memory clock, so the step below is 810 MHz for memory too, which is too slow to handle two video streams and my desktop smoothly. I had 600 MHz there earlier; could probably lower it again after the GPU upgrade, now that I think about it. Note the memory usage and rxpci/txpci: I have more stuff running now than when I measured before, and I'm pretty sure these settings resulted in ~52-53 W usage on the 3080.

I guess the power profiles in nvidia-settings, or some other tool, can do the same. To optimize this you would set clocks manually, find the lowest setting where everything still works as intended, and set that as the max in the profile (a sketch follows below).
An unfortunate downside of us having 10k+ core GPUs is that they all have to run in sync regardless of how many of them we actually need for the current load. By default it probably runs at pretty high GPU clocks too, so as not to cause stutter at intermittent (game) loads, which would look really bad in benchmarks.
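For anyone who wants to try the same thing without writing scripts, here's a minimal sketch using plain nvidia-smi (needs admin/root; the clock values are just examples from my setup, and -lmc in particular requires a fairly recent driver and isn't supported on every card):

    # list the supported memory/GPU clock combinations, so you know which steps exist
    nvidia-smi -q -d SUPPORTED_CLOCKS

    # lock clocks into a low range for desktop/video use (example values, pick from the list above)
    nvidia-smi -lmc 405,810    # memory clock range in MHz
    nvidia-smi -lgc 210,810    # GPU clock range in MHz

    # remove the limits again before gaming or ML work
    nvidia-smi -rmc
    nvidia-smi -rgc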

edit:
Messing around a bit with the frequency limits, it seems an 810 MHz memclock is fine for video decoding after the GPU upgrade (and a few driver upgrades too, btw), so I was able to cut another 9 W.

    # gpu   pwr gtemp mtemp  sm mem enc dec jpg ofa mclk pclk pviol tviol   fb bar1 rxpci txpci
    # Idx     W     C     C   %   %   %   %   %   %  MHz  MHz     %  bool   MB   MB  MB/s  MB/s
        0    50    45     -  13  13   0  10   0   0  810  810     0     0 2161   15    42     9
        0    50    45     -  10  11   0  10   0   0  810  810     0     0 2176   15   285    20
        0    51    45     -  11  12   0  10   0   0  810  810     0     0 2187   15    15     8
        0    50    45     -  10  11   0  10   0   0  810  810     0     0 2187   15    25    15
        0    49    45     -  10  12   0  11   0   0  810  810     0     0 2185   15    34    10

Behaviour seems a bit random within a range of a few watts though: switching video streams on Twitch (all 1080p, 8000 Kbps), it's sometimes stable at 50±1 W and sometimes at 52±1 W. Maybe it depends on the encoder settings, and thus on how the video stream has to be decoded, even though it's the same codec and bitrate.

edit2: Nevermind, the extra power usage with some streams (52±1 W vs 50±1 W) was the stream-switching bumping the GPU temperature over 60 °C and thus starting the GPU fans:

[attachment: GPU temperature and fan speed graph]
 
Last edited:

antcollinet

Master Contributor
Forum Donor
Joined
Sep 4, 2021
Messages
7,742
Likes
13,065
Location
UK/Cheshire
I tried it, but it does not seem to reduce the GPU power draw.
Are you running any parametric EQ / room correction in Roon? If so, perhaps that is firing up the GPU. Try disabling it.

Any other optional functionality should be turned off as well (for diagnostics).
 
OP
ironhorse128

Active Member
Forum Donor
Joined
Jan 9, 2020
Messages
169
Likes
176
I have not really been able to bring the power consumption down. I stopped using the HDMI port of the GPU and use the HDMI port of my iGPU instead. This fixed the issue. I can still use the GPU for AI research, the main reason I have it.

Nevertheless, I think people should know about the potential insane power draw from such a simple app as Roon when it is badly programmed.

Not running any parametric EQ in Roon. Nevertheless, these parametric EQ settings should be computed on the Roon server, not the client.
 

formdissolve

Senior Member
Forum Donor
Joined
Jul 19, 2019
Messages
391
Likes
329
Location
USA
Did you (or they) end up fixing the insane power draw issue? I run Roon server on an M2 Mac Mini and while the memory usage is a LOT higher than I'd like, the CPU usage and overall power draw are fairly low.
 
OP
ironhorse128

Active Member
Forum Donor
Joined
Jan 9, 2020
Messages
169
Likes
176
Did you (or they) end up fixing the insane power draw issue? I run Roon server on an M2 Mac Mini and while the memory usage is a LOT higher than I'd like, the CPU usage and overall power draw are fairly low.
The insane power draw was on my client machine, not my server. No, I did not solve it; I had to switch from using the GPU to the iGPU.
 

thulle

Active Member
Joined
Aug 31, 2021
Messages
100
Likes
134
It's drifting a bit off-topic, but I might as well add some more observations:

Behaviour seems a bit random within a range of a few watts though: switching video streams on Twitch (all 1080p, 8000 Kbps), it's sometimes stable at 50±1 W and sometimes at 52±1 W. Maybe it depends on the encoder settings, and thus on how the video stream has to be decoded, even though it's the same codec and bitrate.

edit2: Nevermind, the extra power usage with some streams (52±1 W vs 50±1 W) was the stream-switching bumping the GPU temperature over 60 °C and thus starting the GPU fans:

[attachment: GPU temperature and fan speed graph]

Yesterday I realized that an occasional noise I had been hearing was the bottom case fans in my PC starting up from a standstill once in a while, so I adjusted their fan curves to let them stay turned off for longer.
Today I woke up to some interesting graphs:

[attachment: fan speed, GPU power, and total power consumption graphs]


The yellow curve is the fan speed: it has been turning on for short periods when the GPU temperature reaches 60 °C, then turning off around 42 °C.
Now check the orange curve, the GPU power consumption as reported by nvidia-smi. As the temperature swings ~18 °C, power consumption swings ~10 W?

Doing a 1 min rolling average to denoise the top green graph (total power consumption as measured by the PSU) indicates that it's more like a 5-6 W power swing. I guess most of the rest is measurement error, as the power measurement circuit(s) on the GPU shift with temperature.
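(Side note on the denoising: with one wattage sample per second in a plain text file, a 60-sample rolling mean is a one-liner; power.log is a hypothetical file name here.)

    # 60-sample (1 min) rolling average over per-second wattage samples, one value per line
    awk '{ buf[NR%60]=$1; n=(NR<60)?NR:60; s=0; for (i in buf) s+=buf[i]; print s/n }' power.log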

I guess it makes sense given how leakage losses grow with temperature on these sub-1 V GPUs, and this is something of a worst-case scenario with 3 monitors connected to a 4090 (the largest reason for 40+ W idle instead of half that), but I'm still surprised to see over a 10% power increase from an 18 °C temperature swing.

Time to experiment with lowering GPU power consumption by increasing fan power consumption :D
 

thulle

Active Member
Joined
Aug 31, 2021
Messages
100
Likes
134
Lifted the floor of the fan curves; still silent. CPU mean -8.8 °C, GPU mean -7.5 °C, mean power consumption as measured by the PSU: 121 W, i.e. -12 W, or -9.1%.
There might be some minor changes in load and/or room temperature, but I never used to see total consumption below 120 W; looking back, the minimum during idle over the last 2 months is usually 126 W or higher, while now it was 114 W.
 
Last edited: