
Background story: I recently bought a computer with an AMD 7000-series CPU and GPU.

amdgpu_top reports 15 to 20 watts under normal desktop usage, but as soon as I play a video in VLC it sits at a constant 45 watts, which is undesirable, especially in summer. (I hope that is just a reporting issue... but my computer does get hot.)

When I run `DRI_PRIME=1 vlc` and then play videos, amdgpu_top doesn't report the power surge. (I have the iGPU enabled.)

Is there anything more convenient than modifying individual .desktop files? KDE malfunctions when I put `export DRI_PRIME=1` in .xprofile, so that's a no-go.
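
For reference, the .desktop modification I mean looks roughly like this (a sketch; the launcher path is an assumption about a typical install):

```
# Copy the system launcher to a user-local override
cp /usr/share/applications/vlc.desktop ~/.local/share/applications/
# Prefix the Exec line so VLC always starts on the iGPU
sed -i 's|^Exec=|Exec=env DRI_PRIME=1 |' ~/.local/share/applications/vlc.desktop
```

Doing that for every application is exactly the tedium I'd like to avoid.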


Solved: removing the Mesa hardware video acceleration packages makes VLC fall back to libplacebo, which doesn't do these weird things.
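
On my Arch install that meant removing the Mesa VA-API/VDPAU drivers, roughly like this (a sketch; libva-mesa-driver and mesa-vdpau are my best guess at the package names, which may differ on other distros):

```
sudo pacman -Rns libva-mesa-driver mesa-vdpau
```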

[–] Sentau@feddit.de 1 points 1 year ago (4 children)

Do you have a dedicated GPU?

[–] axzxc1236@lemm.ee 1 points 1 year ago* (last edited 1 year ago) (3 children)

Yes, an RX 7800 XT. I can confirm `DRI_PRIME` does switch to the integrated GPU on demand:

```
$ DRI_PRIME=0 glxinfo | grep "OpenGL renderer"
OpenGL renderer string: AMD Radeon Graphics (gfx1101, LLVM 16.0.6, DRM 3.54, 6.5.5-arch1-1)
$ DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
OpenGL renderer string: AMD Radeon Graphics (raphael_mendocino, LLVM 16.0.6, DRM 3.54, 6.5.5-arch1-1)
```
[–] Sentau@feddit.de 1 points 1 year ago (2 children)

I am assuming you have the monitor connected directly to the 7800 XT, which is why it is the default GPU.

Is the decoding being done in hardware when watching the video? amdgpu_top shows whether the application (VLC in this case) is using the decoding hardware (the column named DEC).

Also, using the iGPU for video decoding should be more efficient, because the massive number of cores in the dGPU aren't needed for decoding, yet they are kept active as long as the dGPU is in use.
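
If you want to check what each GPU's decoder exposes, one way is vainfo from libva-utils (a sketch; render-node numbering varies from machine to machine):

```
# List the render nodes, then ask each one which VA-API profiles it offers
ls /dev/dri/renderD*
vainfo --display drm --device /dev/dri/renderD128
vainfo --display drm --device /dev/dri/renderD129
```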

[–] axzxc1236@lemm.ee 1 points 1 year ago (1 children)

The problem has been solved; it was caused by Mesa's video decoding package. But I will answer anyway.

Yes, the VCN (Video Core Next) column stays at a constant value while playing video (3% for VA-API with Mesa, 5% for VDPAU with Mesa, 0% for libplacebo), and GFX fluctuates between 0% and 1%.

Just playing a 1080p video (not even a high-bitrate one) is enough to make the GPU fan spin up, which is disappointing.

[–] Sentau@feddit.de 1 points 1 year ago

Hmm, it must be some bug in Mesa or in the way it interacts with VLC. I use VA-API with Mesa for decoding on a laptop with a Vega iGPU and an RDNA1 dGPU, and I don't see high energy usage. In fact, I get much better battery life with VA-API hardware decoding.