this post was submitted on 29 Oct 2023
214 points (95.3% liked)

Linux


Dust is a rewrite of du (in Rust, obviously) that visualizes your directory tree and what percentage each file takes up. But it only prints as many files as fit in your terminal height, so you see only the largest files. It's been a better experience than du, which isn't always easy to navigate to find big files (or at least I'm not good at it).

Anyway, found a log file at .local/state/nvim/log that was 70gb. I deleted it. Hope it doesn't bite me. Been pushing around 95% of disk space for a while so this was a huge win πŸ‘
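The hunt described above can be sketched as a short script. The dust flags here are taken from its --help and may differ by version, and dust may not be installed, so a coreutils-only fallback doing the same job is included; the starting path is just an illustration.

```shell
# Find the biggest space hogs under a directory (path is illustrative).
target="${1:-$HOME/.local/state}"
if command -v dust >/dev/null 2>&1; then
    dust -r -n 20 "$target"     # 20 largest entries; -r flips the ordering
else
    # Same idea with plain coreutils: every file and dir, largest first.
    du -ah "$target" 2>/dev/null | sort -hr | head -n 20
fi
```

The fallback loses dust's tree view and percentages, but it surfaces a runaway 70gb log file just as quickly.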

top 50 comments
[–] netchami@sh.itjust.works 97 points 2 years ago (3 children)

I think something might be wrong with your Neovim if it aggregated 70 gigs of log files.

[–] Aatube@kbin.social 50 points 2 years ago (1 children)

don't worry, they've just been using neovim for 700 years, it'll be alright

[–] netchami@sh.itjust.works 6 points 2 years ago (1 children)

Sure, that's also a possibility. I'd be interested in their time machine though.

[–] nik282000@lemmy.ca 26 points 2 years ago (1 children)

So I found out that qbittorrent generates errors in a log whenever it tries to write to a disk that is full...

Every time my disk was full I would clear out some old torrents, then all the pending log entries would write and the disk would be full again. The log was well over 50gb by the time I figured out that I'm an idiot. Hooray for having dedicated machines.

[–] netchami@sh.itjust.works 12 points 2 years ago

I once did something even dumber. When I was new to Linux and the CLI, I added a recursive line to my shell config that would add itself to the shell config. So I pretty much had exponential growth of my shell config, and my shell would take ~20 seconds to start up before I found the broken code snippet.
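The growth pattern described (every copy of the line appends one more copy on each shell start, so the file doubles per startup) can be simulated safely on a scratch file; the names and the "config line" here are illustrative, not the actual snippet.

```shell
# Safe simulation of a self-appending shell config: each "startup"
# doubles the file, so growth is exponential in the number of startups.
rc=$(mktemp)
echo 'some config line' > "$rc"
for startup in 1 2 3 4; do
    cat "$rc" "$rc" > "$rc.tmp" && mv "$rc.tmp" "$rc"
done
wc -l < "$rc"   # prints 16: 2^4 lines after four startups
rm -f "$rc"
```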

[–] rutrum@lm.paradisus.day 1 points 2 years ago (1 children)

If you have ideas please let me know. I'm preparing to hop distros so I'm very tempted to ignore the problem, blame the old distro, and hope it doesn't happen again :)

[–] netchami@sh.itjust.works 3 points 2 years ago

I would have to look at the log file. Some plugin probably has an issue and writes massive amounts of data to the log every time you use Neovim. Monitor the growth of the log file and contact me via DM if it goes crazy again, I'm gonna try to figure out what's going on.

[–] anagram3k@lemmy.ml 77 points 2 years ago (12 children)

ncdu is the best utility for this type of thing. I use it all the time.
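ncdu is interactive, so a scripted sketch mostly shows its scan flags; these are from ncdu(1), but verify against your version.

```shell
# -x keeps the scan on one filesystem; -o/-f save and reload a scan.
ncdu -x /               # interactive scan, don't cross mount points
ncdu -x -o scan.out /   # scan non-interactively, save the results
ncdu -f scan.out        # browse the saved scan later
```

The -o/-f pair is handy on servers: scan during quiet hours, browse whenever.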

[–] oldfart@lemm.ee 17 points 2 years ago

I install ncdu on any machine I set up, because installing it when it's needed may be tricky

[–] dan@upvote.au 15 points 2 years ago* (last edited 2 years ago)

Try dua. It's like ncdu but uses multiple threads, so it's a lot faster, especially on SSDs.

[–] bizdelnick@lemmy.ml 35 points 2 years ago (5 children)

I usually use something like du -sh * | sort -hr | less, so you don't need to install anything on your machine.

[–] mvirts@lemmy.world 8 points 2 years ago

Same, but when it's real bad, sort fails πŸ˜… For some reason my root is always hitting 100%.

I usually go for du -hx | sort -h and rely on my terminal scroll back.
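One plausible reason sort dies on a 100%-full root (an assumption, not something the comment confirms): sort(1) spills large inputs to temp files under $TMPDIR, which is on the full filesystem. The -T flag points the spill somewhere with free space.

```shell
# Spill sort's temp files to /dev/shm (a tmpfs on most Linux systems)
# instead of the full root filesystem.
du -hx / 2>/dev/null | sort -T /dev/shm -h | tail -n 20
```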

[–] meteokr@community.adiquaints.moe 5 points 2 years ago (1 children)

dust does more than what this script does; it's a whole new tool. I find dust more human-readable by default.

[–] bizdelnick@lemmy.ml 2 points 2 years ago (1 children)

Maybe, but I need it one time per year or so. It is not a task for which I want to install a separate tool.

[–] digdilem@lemmy.ml 4 points 2 years ago* (last edited 2 years ago) (2 children)

Almost the same here. Well, du -shc * | sort -h

I admin around three hundred Linux servers and this is one of my most common tasks. I use -shc as I like the total too, and don't bother with less as it's only the biggest files and dirs that I'm interested in, and with sort -h they show up last, so no need to scroll back.

When managing a lot of servers, the storage requirement of installing extra software is never trivial. (Although our storage does do very clever compression and it might recognise the duplication of the file even across many VM filesystems, I'm never quite sure that works as advertised on small files.)

[–] dan@upvote.au 3 points 2 years ago (1 children)

I admin around three hundred linux servers

What do you use for management? Ansible? Puppet? Chef? Something else entirely?

[–] digdilem@lemmy.ml 3 points 2 years ago (1 children)

Main tool is Uyuni, but we use Ansible and AWX for building new VMs, and ad-hoc Ansible for some changes.

[–] dan@upvote.au 3 points 2 years ago* (last edited 2 years ago) (1 children)

Interesting; I hadn't heard of Uyuni before. Thanks for the info!

[–] cobra89@beehaw.org 1 points 2 years ago (1 children)

Seems it just runs Salt/Saltstack?

[–] digdilem@lemmy.ml 2 points 2 years ago (1 children)

Suse forked Redhat's Spacewalk just before it turned into Foreman + Katello.

Then they put an absolute crapload of work into it to turn it into a modern orchestrator. Part of that was adopting Salt as the agent interface, gradually getting rid of the creaking traditional EL client.

To say "it just runs Salt" is to rather miss all the other stuff Uyuni does: full repo and patch management, remote control, config management, builds, Ansible playbook support, Salt support, and just about everything else you need to manage hundreds of machines. Oh, and it does that for Rocky, RHEL, Alma, SUSE, Ubuntu, Debian and probably a bunch more too, by now. It has a very rich web UI, a full API, and you can do a bunch more from the CLI as well. And if your estate gets too big to manage with one machine, there are proxy agents, as many as you want. I only run a couple of hundred VMs through it, but there are estates running thousands.

And it's free and FOSS.

Honestly, it's pretty awesome and I'm amazed it's not more widely known.

[–] cobra89@beehaw.org 2 points 2 years ago

Oh that's pretty nifty, thanks for the comment. Sorry wasn't trying to minimize the tool, I was simply referring to the orchestration/config management aspect of it when I looked it up real quick.

I used to be responsible for configurations of 40,000 (yes forty thousand) VMs for a large company using puppet and then later using Ansible and that was an interesting challenge. I've been out of the configuration management game for a few years now though so I'm pretty out of the loop. Was familiar with spacewalk back in the day too.

I'll have to check Uyuni out, thanks for sharing!

[–] pete_the_cat@lemmy.world 2 points 2 years ago (4 children)

We'd use du -xh --max-depth=1 | sort -hr

[–] caseyweederman@lemmy.ca 2 points 2 years ago

I'd say head -n25 instead of less since the offending files are probably near the top anyway

[–] lauha@lemmy.one 2 points 2 years ago (1 children)

Or head instead of less to get the top entries

[–] digdilem@lemmy.ml 1 points 2 years ago

With sort -h (no -r), the biggest ones end up at the bottom, which is often what most people care about.

[–] jcdenton@lemy.lol 21 points 2 years ago

So like filelight?

[–] badloop@lemmy.world 20 points 2 years ago (1 children)

Yeah I got turned onto ncdu recently and I’ve been installing it on every vm I work on now

[–] Rambi@lemm.ee 9 points 2 years ago (1 children)

A 70gb log file?? Am I misunderstanding something, or wouldn't that be hundreds of millions of lines?

[–] Mo5560@feddit.de 8 points 2 years ago

I've definitely had to handle 30gb plain-text files before, so I'm inclined to believe that twice as much is just as possible.

[–] corsicanguppy@lemmy.ca 7 points 2 years ago

You guys aren't using du -sh ./{dir1,dir2} | sort -h | head?

[–] donio@lemmy.world 6 points 2 years ago* (last edited 2 years ago)

Maybe other tools support this too, but one thing I like about xdiskusage is that you can pipe regular du output into it. That means I can run du on some remote host that doesn't have anything fancy installed, scp the output back to my desktop, and analyze it there. I can also pre-process the du output before feeding it into xdiskusage.

I also often work with textual du output directly; just sorting it by size is very often all I need.
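A sketch of that remote workflow; the host and paths are hypothetical, and the graphical step is commented out since it needs xdiskusage and an X display.

```shell
# On the remote host, plain du produces the "size<TAB>path" lines
# xdiskusage reads:
#   ssh user@remote 'du -k /srv' > remote-du.txt
du -k . > remote-du.txt            # same format, generated locally here
sort -nr remote-du.txt | head      # optional pre-processing: biggest first
# xdiskusage remote-du.txt         # then browse the saved scan graphically
```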

[–] mindbleach@sh.itjust.works 6 points 2 years ago (2 children)

I miss WinDirStat for seeing where all my hard drive space went. You can spot enormous files and folders full of ISOs at a glance.

For bit-for-bit duplicates (thanks, modern DownThemAll), use fdupes.

Filelight on linux

Squirreldisk on windows

Both libre
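The fdupes suggestion above (`fdupes -r DIR`) can be approximated with coreutils alone; this sketch hashes every file and prints the groups whose checksums collide. fdupes additionally does a byte-by-byte comparison, so prefer the real tool before deleting anything.

```shell
# Coreutils-only duplicate finder: md5 hashes are 32 hex chars, so
# uniq -w32 groups on the hash and -D prints every duplicated line.
dir="${1:-.}"
find "$dir" -type f -exec md5sum {} + | sort | uniq -w32 -D
```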

[–] lemmyingly@lemm.ee 4 points 2 years ago

If WizTree is available on Linux then I highly recommend it over all other alternatives.

It reads straight from the NTFS file table (the MFT) and is done within a couple of seconds.

[–] JetpackJackson@feddit.de 4 points 2 years ago

I use gdu and never had any issues like that with it

[–] Cysioland@lemmygrad.ml 2 points 2 years ago

Yeah, it helped me unblock my server where I ran out of space
