this post was submitted on 19 Feb 2026
31 points (87.8% liked)

Linux


Just wanted to share an alias I have in use and found useful again. It's a simple wrapper around xargs, which I always forget how to use properly, so I set up an alias for it. All it does is run a command once for each line piped into it.

The arguments are interpreted as the command to execute. The only thing to remember is to use {} as a placeholder for the input line. Look at the examples to understand how it's used.

# Pipe each line and execute a command. The "{}" will be replaced by the line.
#
# Example:
#   cat url.txt | foreach echo download {} to directory
#   ls -1 | foreach echo {}
#   find . -maxdepth 2 -type f -name 'M*' | foreach grep "USB" {}
alias foreach='xargs -d "\n" -I{}'

Useful for quickly operating on each line of a file (for example, to download from a list of URLs) or on any command's output line by line, without having to remember or type out a for loop in the terminal.
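For instance, the download case mentioned above might look like this (urls.txt and the use of curl are made up for illustration):

```shell
# Each line of urls.txt becomes one curl invocation; {} is the URL.
cat urls.txt | foreach curl -O {}

# The same thing with the alias expanded by hand:
cat urls.txt | xargs -d "\n" -I{} curl -O {}
```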

all 12 comments
[–] non_burglar@lemmy.world 2 points 1 hour ago

Be careful.

Because it only formats stdin streams into string(s), xargs can be very dangerous, depending on the command to which the arguments are being passed.

Xargs used to be a practical way to get around bash globbing issues and parenthetical clause behavior, but most commands have alternate and safer ways of handling passed arguments.

find -exec is preferable to xargs to avoid file expansion "bombs", plus find doesn't involve the shell, so it doesn't care about whitespace problems.
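A sketch of the find -exec alternative described above, re-using the find example from the post:

```shell
# find substitutes {} itself and execs grep directly, so the shell
# never re-parses the filenames; whitespace and newlines are safe.
find . -maxdepth 2 -type f -name 'M*' -exec grep "USB" {} \;

# The '+' terminator batches many files into one grep call,
# similar to xargs, but still without a shell in between.
find . -maxdepth 2 -type f -name 'M*' -exec grep "USB" {} +
```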

[–] thevoidzero@lemmy.world 2 points 2 hours ago* (last edited 2 hours ago) (1 children)

I recommend GNU parallel. It does similar things, but runs the commands in parallel, and it's way easier to pipe into than xargs. If you really need it to run one command at a time, you can set the number of jobs to 1. It also has progress bars, colors to differentiate the stdout of different commands, etc.

Basic example: to echo each line

parallel echo < somefile.txt

To download all links, number of jobs 4, show progress

parallel -j 4 --bar 'curl -O' < links.txt

You can do a lot more with inputs, like placing them wherever with {}, numbered placeholders ({1} is the first) that allow multiple unique arguments, transformers like removing the extension or the parent path, etc. Worth learning.
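A quick sketch of those replacement strings (the filename is made up; requires GNU parallel):

```shell
# {}  = the whole input        {.} = input minus its extension
# {/} = basename               {//} = directory part
parallel echo 'full={} noext={.} base={/} dir={//}' ::: ./docs/file.txt
# prints: full=./docs/file.txt noext=./docs/file base=file.txt dir=./docs
```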

[–] thingsiplay@lemmy.ml 1 points 15 minutes ago

I am actually aware of parallel and use it for a different tool / script I built. The purpose of parallel is different from that of xargs, right? I mean xargs works on each line of a stdout string, which is what I was using it for. I never thought of parallel as an alternative to xargs and need to investigate this idea more. Thanks.

[–] bizdelnick@lemmy.ml 2 points 5 hours ago (1 children)

I almost never use xargs. The most common case for it is find, but it is easier to use its -exec option. Also, with find your example is incorrect. You forgot that file names can contain special characters, the newline character in particular. That's why you need to pass -print0 option to find and -0 option to xargs.
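The -print0 / -0 pairing the comment refers to, sketched against the find example from the post:

```shell
# -print0 terminates each filename with a NUL byte instead of a
# newline; -0 makes xargs split on NUL, so even a filename that
# contains a newline arrives as a single, intact argument.
find . -maxdepth 2 -type f -name 'M*' -print0 | xargs -0 grep "USB"
```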

[–] thingsiplay@lemmy.ml 1 points 5 hours ago

The example itself is not incorrect. It is just an example to show how foreach works, not meant to be a full command on its own. Usually I don't have newline characters in file names either, so that is not a concern for me. If I wanted to be sure, then yes, I would use the zero options. But it's good to point that out.

[–] hades@feddit.uk 11 points 10 hours ago (3 children)

Nice! I used to do something like this, which avoids xargs altogether:

cat urls.txt | while read url; do echo download $url; done
[–] Static_Rocket@lemmy.world 7 points 7 hours ago

You can also avoid cat since you aren't actually concatenating files (depending on file size this can be much faster):

while read -r url; do echo "download $url"; done < urls.txt
[–] davel@lemmy.ml 8 points 9 hours ago* (last edited 9 hours ago)

Usually this is the way. Once you enter xargs’ world, you lose access to your shell aliases, functions, and un-exported variables, which will often bite you in the ass.
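In bash, one workaround (a sketch; greet is a made-up function) is to export the function and re-enter bash explicitly:

```shell
# Plain xargs cannot see shell functions or aliases, but an
# exported bash function survives into the child bash process.
greet() { echo "hello $1"; }
export -f greet                                  # bash-specific
printf 'world\n' | xargs -d "\n" -I{} bash -c 'greet "$1"' _ {}
# prints: hello world
```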

[–] thingsiplay@lemmy.ml 8 points 10 hours ago

You should use the -r option for the read command to preserve backslashes. I was using while loops before too, but wanted a compact single-command replacement. Doing it with a while loop as an alias (or function) didn't work well, because the command has to be interpreted; xargs does exactly that, as it is designed for this kind of stuff. Other than having less to type, I wonder if there are benefits of one over the other, while vs xargs. In a script, I prefer writing full while loops instead.