this post was submitted on 12 Aug 2025
263 points (96.5% liked)

Linux

[–] marcie@lemmy.ml 4 points 1 day ago* (last edited 1 day ago)

My phone uses like 30 on idle 🫠

[–] mitch@piefed.mitch.science 117 points 3 days ago* (last edited 3 days ago) (4 children)

Honestly it's a little staggering how much better web video got after the W3C got fed up with Flash and RealPlayer and finally standardized more efficient video formats and a native video player.

<video> was a revolution.

[–] FauxLiving@lemmy.world 29 points 3 days ago (2 children)

I remember, that was a dramatic change.

Also, most people now don't remember this, but YouTube was initially popular because their Flash video player was efficient, worked across many different system configurations and browsers, and dynamically changed resolution to match your connection.

At that point you had some people with broadband connections and a lot more with dial-up. So often dial-up users would not be able to watch videos because they were only available in one resolution.

YT had 144p (or less!) videos ready for dial-up users and higher resolution videos for broadband users and it automatically picked the appropriate video for the client. This made it so most people (dial-up users) would look to YT first, because you knew that YT would have a video that you could actually watch.

Then Google bought them.

[–] mitch@piefed.mitch.science 18 points 3 days ago (1 children)

YouTube blew up the year I went to college and got access to a T3 line. 🤤 My school had pretty robust security, but it was policy-based. Turns out, if you are on Linux and can't run the middleware, it would just go "oh you must be a printer, c'mon in!"

I crashed the entire network twice, so I fished a computer out of the trash in my parents' neighborhood, put Arch and rtorrent on it, and would just pipe my traffic via SSH to that machine. :p

Ah, and the short era of iTunes music sharing... Good memories.

[–] FauxLiving@lemmy.world 8 points 3 days ago

Yeah, my high school had a computer lab donated by Cisco to teach their CCNA course. There were like 2 students taking the class and 25 PCs, so we set up one to run WinMX, Kazaa and eDonkey.

They all had CD-RW drives. We were burning music and movie CDs (DivX-encoded SD movies were under 650MB so they would fit on a CD), and selling them on campus for $3-5. You could get 100 blank CD-Rs for around $40, so it was very profitable.

[–] lightrush@lemmy.ca 12 points 3 days ago (1 children)

Oh man, I was like a kid in a candy shop when I got my hands on Flash 4... built quite a few sites with it.

[–] mitch@piefed.mitch.science 16 points 3 days ago (1 children)

My unpopular opinion is that Flash was perhaps one of the greatest media standards of all time. Think about it: in 2002, people were packaging entire 15-minute animations with full audio and imagery, all encapsulated in a single file that could play in any browser, for under 10 MB each. Not to mention, it was one of the earliest formats to support streaming. It used vectors for art, which meant that a SWF file would look just as good today on a 4K screen as it did in 2002.

It only became awful once we started forcing it to be stuff it didn't need to be, like a Web design platform, or a common platform for applets. This introduced more and more advanced versions of scripting that continually introduced new vulnerabilities.

It was a beautiful way to spread culture back when the fastest Internet anyone could get was 1 MB/sec.

It worked well only on Windows PCs, though, back when PCs and Windows still weren't the definitive winners of the technology race and people were using all kinds of computers.

[–] fmstrat@lemmy.nowsci.com 50 points 3 days ago (4 children)

Obligatory: "Use Debian instead of Ubuntu. It's basically Ubuntu without Snap."

[–] pupbiru@aussie.zone 25 points 3 days ago* (last edited 3 days ago)

it was always wild to me back in the day when so many container images were based on ubuntu… it was like, PLEASE, debian is functionally identical here at like 1/10th the base container size!

[–] lightrush@lemmy.ca 17 points 3 days ago (1 children)

Mostly yes, but there are functional differences in convenience. For example, the standard upgrade process is completely manual. You have to disable third-party repos. You have to change the repos. You have to check if you have space. You have to remove obsolete packages. And more. On Ubuntu, the software update tool does all that, eliminating a lot of possibility for error. To an experienced user, the Debian process is fine. A novice would have plenty of opportunity for frustration and pain.

[–] fmstrat@lemmy.nowsci.com 5 points 2 days ago (1 children)

What? Software Center is GNOME, not Ubuntu. Discover is KDE, not Ubuntu. Debian updates can be done the same way? I don't do any of the things you mention. Using SC or just apt upgrade works just fine.

[–] ozymandias117@lemmy.world 9 points 2 days ago (1 children)

They're talking about a Debian 12 -> Debian 13 upgrade

On Debian, you get release notes on what commands to run.

Ubuntu has their own software update utility, separate from Software Center or Discover, that runs the commands for you

[–] fmstrat@lemmy.nowsci.com 3 points 2 days ago (1 children)

Ahhh OK. I've always gone fresh for a full upgrade. But does apt dist-upgrade not work? That's what the docs say to do.

[–] ozymandias117@lemmy.world 4 points 2 days ago* (last edited 2 days ago) (1 children)

You have to at least modify your sources.list.d manually first. For most people, updating sources.list.d and running full-upgrade will probably work fine...

The full instructions are

  1. run dist-upgrade
  2. remove back ports
  3. remove obsolete packages
  4. remove non-debian packages
  5. clean up old configuration files
  6. add non-free-firmware (this is a 12 -> 13 specific)
  7. remove proposed updates
  8. disable pinning
  9. update sources.list.d to point to the next release
  10. apt upgrade --without-new-pkgs
  11. apt full-upgrade

It takes like an hour? But it's still not "just press okay."

Ubuntu's upgrader has broken on some upgrades for friends and they had to do the whole Debian process manually, but it does try to automate the removals, disablements, and updating of sources

Edit: instructions taken from Trixie release. I skipped some that aren't really unique, like make a backup

https://www.debian.org/releases/trixie/release-notes/upgrading.en.html
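A minimal sketch of that procedure in shell, based on the Trixie release notes linked above. This is not a substitute for reading the notes: it skips backups, the backports/pinning cleanup, and assumes stock repo names and paths.

```shell
# Debian 12 -> 13 upgrade sketch, per the Trixie release notes.
# Read the notes first; this omits backups and edge cases.

sudo apt update && sudo apt full-upgrade   # fully update the current release

# Point every source at the new release (also check files in
# /etc/apt/sources.list.d/ for third-party or pinned entries)
sudo sed -i 's/bookworm/trixie/g' /etc/apt/sources.list

sudo apt update
sudo apt upgrade --without-new-pkgs        # minimal upgrade first
sudo apt full-upgrade                      # then the full upgrade
sudo apt autoremove                        # drop obsolete packages
```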

[–] fmstrat@lemmy.nowsci.com 2 points 1 day ago

Ahh yea, that's not too great

[–] Grass@sh.itjust.works 13 points 3 days ago

I prefer "ubuntu without the bullshit"

[–] deadcream@sopuli.xyz 8 points 3 days ago (9 children)

It has a much slower release cycle and an ancient kernel. It's not suitable for people with new hardware.

[–] Eggymatrix@sh.itjust.works 10 points 2 days ago

Unless you prototype in a CPU fab it does not matter; Debian 13 came out last week and its kernel is not that old

[–] ColeSloth@discuss.tchncs.de 23 points 3 days ago (1 children)

Your entire backlight is only 3w? I feel like my phone is over 3w.

[–] ChaoticNeutralCzech@feddit.org 9 points 2 days ago* (last edited 2 days ago)

You can use the Wattz app to monitor current/power flowing into/out of the battery on some Android phones. Yes, 3 W is about the average in normal use. Unfortunately you cannot gauge power consumption while charging unless you also have a USB wattmeter: the system only measures battery current, because that's what's required for battery capacity/percentage estimates.

[–] solrize@lemmy.ml 57 points 3 days ago (3 children)

Is that good or bad? What cpu? How big is the screen? What encoding?

[–] lightrush@lemmy.ca 48 points 3 days ago* (last edited 3 days ago) (1 children)

It's a Framework with an 11th gen Intel i5. I've never seen it below 11W while doing this. I don't recall the exact number I got in Debian 12 but I think it was in the 11-13W range. The numbers were similar with Ubuntu LTS, which I used till about a year ago. Now I see 9-10W. The screen is 3:2 13". Not sure about the encoding but I have GPU decoding working in Firefox.

[–] Kazumara@discuss.tchncs.de 7 points 3 days ago

Not sure about the encoding

Right click on video -> Stats for Nerds

It's a YouTube video, so whatever YouTube uses these days. I tested with this M1 MacBook Pro and it was using about 7 watts, so 3 watts more is pretty good for pretty much anything. I think my 12th gen laptop typically draws about 13-15 W doing the same thing, but with a much dimmer screen.
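For the curious: on Linux laptops, draw figures like these can be read straight from sysfs. A small sketch — the battery name (`BAT0` here) and which files exist vary by machine, so treat the path as an assumption:

```shell
# Read battery discharge rate from sysfs. power_now is in microwatts;
# some batteries expose current_now/voltage_now (uA/uV) instead.

uw_to_w() {
  # convert microwatts to watts with one decimal place
  awk -v uw="$1" 'BEGIN { printf "%.1f\n", uw / 1000000 }'
}

BAT=/sys/class/power_supply/BAT0   # name varies (BAT1, BATT, ...)
if [ -r "$BAT/power_now" ]; then
  echo "Battery draw: $(uw_to_w "$(cat "$BAT/power_now")") W"
fi
```

Tools like powertop report the same numbers via these interfaces, averaged over time.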

[–] llii@discuss.tchncs.de 19 points 3 days ago (3 children)

Me with an older notebook that doesn't support av1 decoding: 😭

[–] gnuhaut@lemmy.ml 8 points 3 days ago

There's a browser extension called "Your Codecs." which can prevent YouTube from serving you AV1-encoded videos.

[–] serenissi@lemmy.world 15 points 3 days ago (1 children)

I've seen 10-12W easily on 4K for SoCs without AV1. Your SoC (11th gen Intel) should support AV1. Try playing the video in mpv (with yt-dlp integration) with various hw acceleration options to see if it changes; your browser is probably software decoding.

Even on SoCs with hardware decoding support, I noticed 2-3W of extra power usage when playing YouTube from the website compared to mpv or FreeTube. The website seems to be doing inefficient JS stuff, but I haven't profiled it.
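A quick way to try those decode paths: mpv invokes yt-dlp automatically when given a URL. The URL below is a placeholder, and `vaapi` assumes an Intel/AMD GPU (NVIDIA would use `nvdec` instead).

```shell
# Compare software vs hardware decoding while watching power draw.
URL='https://www.youtube.com/watch?v=...'   # placeholder, substitute a real video

mpv --hwdec=no "$URL"          # force software decoding (baseline)
mpv --hwdec=vaapi "$URL"       # VA-API hardware decoding
mpv --hwdec=auto-safe "$URL"   # let mpv pick a safe hardware decoder

# See which codecs the GPU can decode at all (vainfo is in the vainfo package):
vainfo | grep -iE 'av1|hevc|h264'
```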

[–] Mubelotix@jlai.lu 1 points 2 days ago (1 children)

AV1 will probably increase power usage. It's made to reduce data consumption, not decoding effort

[–] serenissi@lemmy.world 1 points 1 day ago (1 children)

on mobile platforms nowadays power is more important than data. OTOH for servers bandwidth is more important.

[–] Mubelotix@jlai.lu 2 points 1 day ago (1 children)

Yeah. But who builds the apps? The guys running the servers or the final end users?

[–] lightrush@lemmy.ca 25 points 3 days ago

It fluctuated between 8.8W and 10.3W.

[–] Zykino@programming.dev 15 points 3 days ago (1 children)

What command do you use to see the Watt used?

[–] ominousdiffusion@lemmy.world 34 points 3 days ago (2 children)
[–] lefixxx@lemmy.world 46 points 3 days ago

( ͡° ͜ʖ ͡°)

[–] Mwa@thelemmy.club 9 points 3 days ago (6 children)

What cpu architecture is this?

[–] LaLuzDelSol@lemmy.world 1 points 2 days ago

That is very impressive! Although to be honest I question the accuracy of all those estimated power draws. I would be interested to see an endurance test of your battery: assuming your battery capacity is accurate, your runtime on a full charge should line up with your power draw.

[–] ryannathans@aussie.zone 12 points 3 days ago* (last edited 3 days ago) (6 children)

That's very good; audio could do with some work
