We collect practical, well-explained Bash one-liners, and promote best practices in Bash shell scripting. To get the latest Bash one-liners, follow @bashoneliners on Twitter. If you find any problems, report a bug on GitHub.


0

Take values from a list (file) and search for them in another file

 $ for ITEM in `cat values_to_search.txt`; do  (egrep $ITEM full_values_list.txt && echo $ITEM found) | grep "found" >> exit_FOUND.txt; done

— by ManuViorel on May 16, 2018, 3:20 p.m.

Explanation

This line searches for each value taken from a file (values_to_search.txt) in a full list of values (full_values_list.txt). If a value is found, a line "<value> found" is appended to a new file, exit_FOUND.txt.

Alternatively, we can search for the values from list 1 which do NOT exist in list 2, as below:

for ITEM in `cat values_to_search.txt`; do (egrep $ITEM full_values_list.txt || echo $ITEM not found) | grep "not found" >> exit_not_found.txt; done

Limitations

Values containing spaces are split into separate search terms by the unquoted loop, and egrep treats each value as a regular expression, so metacharacters may match more than intended.
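
A more robust version of the same loop (a sketch; read -r keeps values with spaces intact, -F treats them as fixed strings rather than regular expressions, and -q avoids the extra grep):

while read -r ITEM; do
    grep -qF "$ITEM" full_values_list.txt && echo "$ITEM found"
done < values_to_search.txt > exit_FOUND.txt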

0

Delete all untagged Docker images

 $ docker rmi $(docker images -f "dangling=true" -q)

— by stefanobaghino on April 27, 2018, 2:50 p.m.

Explanation

docker images outputs all images currently available. By specifying -f "dangling=true" we restrict the list to "dangling" images (i.e. untagged ones). The -q option enables quiet mode, which limits the output to the image hashes; these are fed directly into docker rmi, which removes the images with the corresponding hashes.
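
Note that recent Docker releases ship a built-in command with the same effect (it asks for confirmation unless -f is passed):

docker image prune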

1

Have script run itself in a virtual terminal

 $ tty >/dev/null || { urxvt -e /bin/sh -c "tty >/tmp/proc$$; while test x; do sleep 1; done" & while test ! -f /tmp/proc$$; do sleep .1; done; FN=$(cat /tmp/proc$$); rm /tmp/proc$$; exec >$FN 2>$FN <$FN; }

— by openiduser111 on March 9, 2018, 2:56 a.m.

Explanation

Note that urxvt -e kills any subprocesses when its command exits, so background processes started by the script will not outlive the terminal window. With that caveat:

  • We begin by testing with tty whether the script is attached to a terminal.
  • If it is not, we start a terminal that runs tty and saves its output to a file name; $$ is the PID of the original script. The terminal is started in the background using &, and the original script waits for the file to appear, then reads the tty path from it and removes the file.
  • Finally, the main command is a special use of the bash builtin exec that contains nothing but redirections (of stdout, stderr, and stdin), so they apply to every command in the rest of the script file. (A more readable sketch of the whole pattern follows below.)
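
Expanded into conventional script form, the same idea looks roughly like this (a sketch; it assumes urxvt, but any terminal emulator with an -e option would do):

if ! tty >/dev/null; then
    # Start a terminal whose shell records its tty device path, then idles
    # forever so the window stays open.
    urxvt -e /bin/sh -c "tty > /tmp/proc$$; while :; do sleep 1; done" &
    # Wait until the terminal has written its tty device path.
    while [ ! -f "/tmp/proc$$" ]; do sleep .1; done
    FN=$(cat "/tmp/proc$$")
    rm "/tmp/proc$$"
    # Redirect this script's stdout, stderr and stdin to the new terminal.
    exec >"$FN" 2>"$FN" <"$FN"
fi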

1

Big CSV > batches > JSON array > CURL POST data with sleep

 $ cat post-list.csv | split -l 30 - --filter='jq -R . | jq --slurp -c .' | xargs -d "\n" -I % sh -c 'curl -H "Content-Type: application/json" -X POST -d '"'"'{"type":1,"entries":%}'"'"' http://127.0.0.1:8080/purge-something && sleep 30'

— by pratham2003 on March 7, 2018, 12:12 p.m.

Explanation

post-list.csv contains a list of URLs in this example.

  • split -l 30 Split by 30 lines

  • - Use stdin as input for split

  • --filter There is no easy way to make split pipe its chunks to stdout, hence --filter, which runs the given command on each chunk instead

  • jq -R . From the jq manual - Don’t parse the input as JSON. Instead, each line of text is passed to the filter as a string

  • jq --slurp -c . From the jq manual - Instead of running the filter for each JSON object in the input, read the entire input stream into a large array and run the filter just once. -c makes it easier to pipe and use it in the xargs that follows.

  • xargs -d "\n" -I % sh -c Execute a command for each batch (JSON array). Use "\n" as the delimiter. Use % as a placeholder for the batch in the command that follows.

  • Single quotes inside sh -c ' ... ' are escaped as '"'"' (single-double-single-double-single). You can do whatever you need to inside sh -c ' ... && sleep 123'. (A multi-line version of the whole pipeline follows below.)
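
Spelled out over multiple lines, the whole pipeline reads like this (the same command, just reformatted; the URL, payload shape and batch size are the example values used above):

cat post-list.csv \
  | split -l 30 - --filter='jq -R . | jq --slurp -c .' \
  | xargs -d "\n" -I % sh -c '
      curl -H "Content-Type: application/json" \
           -X POST \
           -d '"'"'{"type":1,"entries":%}'"'"' \
           http://127.0.0.1:8080/purge-something \
      && sleep 30'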

Limitations

You need jq installed, for example in Debian / Ubuntu:

apt-get install jq

See also https://stedolan.github.io/jq/manual/

The input file (post-list.csv) probably must not contain double or single quotes, but I haven't tested this.

1

List all packages with at least a class defined in a JAR file

 $ jar tf "$1" | grep '/.*\.class$' | xargs dirname | sort -u | tr / .

— by stefanobaghino on Feb. 19, 2018, 12:13 p.m.

Explanation

The jar command allows you to read or manipulate JAR (Java ARchive) files, which are ZIP files that usually contain classfiles (Java compiled bytecode files) and possibly manifests and configuration files. We specify that we want to list file contents (t) that we provide as an argument (f, otherwise the jar will be read from stdin).

From the output, we keep only the paths that contain a class file (grep), take the path of the package that contains it (xargs dirname), keep the unique, sorted paths (sort -u) and translate /s to .s (tr), so that the names are displayed as they would appear in Java syntax.

Limitations

This will only exhaustively list packages for languages that require packages to map to the directory structure (e.g. Java does, Scala doesn't). As long as that convention is respected, however, the command outputs an exhaustive list of packages.

1

Output an arbitrary number of open TCP or UDP ports in an arbitrary range

 $ comm -23 <(seq "$FROM" "$TO") <(ss -tan | awk '{print $4}' | cut -d':' -f2 | grep "[0-9]\{1,5\}" | sort | uniq) | shuf | head -n "$HOWMANY"

— by stefanobaghino on Feb. 9, 2018, 3:51 p.m.

Explanation

Originally published (by me) on unix.stackexchange.com.

comm is a utility that compares sorted lines in two files. It outputs three columns: lines that appear only in the first file, lines that appear only in the second one, and common lines. By specifying -23 we suppress the last two columns and keep only the first one. We can use this to obtain the difference of two sets, expressed as a sequence of text lines. I learned about comm here.

The first file is the range of ports that we can select from. seq produces a sorted sequence of numbers from $FROM to $TO. The result is piped to comm as the first file using process substitution.

The second file is the sorted list of ports, which we obtain by calling the ss command (with -t meaning TCP ports, -a meaning all - established and listening - and -n numeric - don't try to resolve, say, 22 to ssh). We then pick only the fourth column with awk, which contains the local address and port. We use cut to split address and port on the : delimiter and keep only the latter (-f2). ss also outputs a header, which we get rid of by grepping for non-empty sequences of digits no longer than 5. We then comply with comm's requirement for sorted input by sorting and getting rid of duplicates with uniq.

Now we have a sorted list of open ports, which we shuffle and from which we then grab the first "$HOWMANY" with head -n.

Example

Grab three random open ports in the private range (49152-65535):

comm -23 <(seq 49152 65535) <(ss -tan | awk '{print $4}' | cut -d':' -f2 | grep "[0-9]\{1,5\}" | sort | uniq) | shuf | head -n 3

could return, for example:

54930
57937
51399

Notes

  • Switch -t with -u in ss to get free UDP ports instead (see the variant below).
  • Drop shuf if you're not interested in grabbing a random port.
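
For example, to grab one free UDP port in the same private range, combining both notes (UDP via -u, no shuf because randomness is not needed here):

comm -23 <(seq 49152 65535) <(ss -uan | awk '{print $4}' | cut -d':' -f2 | grep "[0-9]\{1,5\}" | sort | uniq) | head -n 1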

0

Source without circular reference

 $ [ ! "${LIB}" ] && { readonly LIB; . "$( cd "$( dirname "$0" )" && pwd )/<path_to>/LIB.sh"; }

— by dhsrocha on Jan. 24, 2018, 4:30 p.m.

Explanation

Source LIB only if the corresponding variable is not yet defined, in order to prevent a circular reference loop in case the same script has already been sourced earlier in the sourcing chain. The commands are grouped with braces rather than a subshell so that the readonly guard and the sourced definitions persist in the current shell.
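
A related pattern (a sketch with hypothetical variable and file names) is to put the guard inside the library itself, so every consumer can simply source it unconditionally:

# at the top of LIB.sh
[ -n "${LIB_SH_INCLUDED}" ] && return
readonly LIB_SH_INCLUDED=1
# ... function and variable definitions follow ...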

0

Ternary conditional clause

 $ [ test_statement ] && ( then_statement ) || ( else_statement );

— by dhsrocha on Jan. 22, 2018, 5:27 p.m.

Explanation

The then_statement and else_statement each run in a child process (a subshell, created by the parentheses). The then_statement is executed only if the test succeeds; otherwise the else_statement is executed. Note that this is not a true ternary: if the then_statement itself returns a non-zero status, the else_statement runs as well.
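
For example (a trivial sketch; note the caveat above about the then-branch failing):

[ -f /etc/passwd ] && ( echo "passwd exists" ) || ( echo "passwd missing" )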

1

Get executed script's current working directory

 $ CWD=$(cd "$(dirname "$0")" && pwd)

— by dhsrocha on Jan. 22, 2018, 4:55 p.m.

Explanation

Sets CWD to the directory containing the executing script, no matter which directory Bash was in when it ran the script containing this line.

0

Random Git Commit

 $ git commit -m "$(w3m whatthecommit.com | head -n 1)"

— by Jab2870 on Jan. 5, 2018, 4:55 p.m.

Explanation

This will commit a message pulled from What the Commit.

-m allows you to provide the commit message without entering your editor

w3m is a terminal-based web browser. We basically use it to strip out all of the HTML tags.

head -n 1 will grab only the first line

Limitations

This requires you to have w3m installed

1

Blackhole ru zone

 $ echo "address=/ru/0.0.0.0" | sudo tee /etc/NetworkManager/dnsmasq.d/dnsmasq-ru-blackhole.conf && sudo systemctl restart network-manager

— by olshek_ on Nov. 14, 2017, 2:12 p.m.

Explanation

It creates a dnsmasq-ru-blackhole.conf file with a single line that resolves every domain in the .ru zone to 0.0.0.0.

You might use "address=/home.lab/127.0.0.1" to point home.lab and all of its possible subdomains to your localhost or some other IP in a cloud.
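
To check that the blackhole is active (assuming NetworkManager's dnsmasq is your active resolver), any .ru name should now resolve to 0.0.0.0, e.g.:

getent hosts example.ru    # expect 0.0.0.0 in the answer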

0

Remove new lines from files and folders

 $ rename 's/[\r\n]//g' *

— by moverperfect on Sept. 30, 2017, 10:07 p.m.

Explanation

This will search all files and folders in the current directory for names containing a carriage return or newline character and rename them, stripping those characters out. Note that this relies on the Perl-based rename command (sometimes installed as prename); the util-linux rename takes different arguments.

1

Retrieve dropped connections from firewalld journaling

 $ sudo journalctl -b | grep -o "PROTO=.*" | sed -r 's/(PROTO|SPT|DPT|LEN)=//g' | awk '{print $1, $3}' | sort | uniq -c

— by FoxBuru on Sept. 14, 2017, 5:10 p.m.

Explanation

We take the output of journalctl since the last boot (-b flag) and keep everything from PROTO= until the end of the line. Then we strip the identification tags (PROTO=, SPT=, DPT=, LEN=) and print just the protocol and the destination port (columns 1 and 3). We sort the output so that uniq -c can aggregate and count the duplicates.

Limitations

  • Only works on Linux
  • You use firewalld and you have logging set on ALL (see firewalld.conf for details)
  • You use journald for logging
  • Your user has sudo privileges

0

Kill a process running on port 8080

 $ lsof -i :8080 | awk 'NR > 1 {print $2}' | xargs --no-run-if-empty kill

— by Janos on Sept. 1, 2017, 8:31 p.m.

Explanation

lsof lists open files (ls-o-f, get it?). lsof -i :8080 lists open files on address ending in :8080. The output looks like this

COMMAND  PID     USER   FD   TYPE DEVICE SIZE/OFF NODE NAME
chrome  2619 qymspace  149u  IPv4  71595      0t0  TCP localhost:53878->localhost:http-alt (CLOSE_WAIT)

We use awk 'NR > 1 {print $2}' to print the second column for lines except the first. The result is a list of PIDs, which we pipe to xargs kill to kill.

Limitations

The --no-run-if-empty option of xargs is available in GNU implementations, and typically not available in BSD implementations. Without this option, the one-liner will raise an error if there are no matches (no PIDs to kill).
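
A shorter variant relies on lsof's terse mode (-t), which prints only the PIDs and makes the awk step unnecessary (the same GNU xargs caveat applies):

lsof -t -i :8080 | xargs --no-run-if-empty kill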

0

Kill a process running on port 8080

 $ lsof -i :8080 | awk '{print $2}' | tail -n 1 | xargs kill

— by kimbethwel on Aug. 18, 2017, 8:22 a.m.

Explanation

lsof lists open files (ls-o-f, get it?). lsof -i :8080 lists open files on address ending in :8080. The output looks like this

COMMAND  PID     USER   FD   TYPE DEVICE SIZE/OFF NODE NAME
chrome  2619 qymspace  149u  IPv4  71595      0t0  TCP localhost:53878->localhost:http-alt (CLOSE_WAIT)

We pipe this input through awk to print column 2 using the command awk '{print $2}' to produce the output:

PID
2533

To remove the header word PID from this output we use tail -n 1, which grabs only the last row, 2533.

We can now pass this process id to the kill command to kill it.

1

Get the latest Arch Linux news

 $ w3m https://www.archlinux.org/ | sed -n "/Latest News/,/Older News/p" | head -n -1

— by Jab2870 on Aug. 15, 2017, 10:35 a.m.

Explanation

w3m is a terminal web browser. We use it to go to https://www.archlinux.org/

We then use sed to capture the text between Latest News and Older News.

We then get rid of the last line which is Older News.

Limitations

For this, w3m would need to be installed. It should be installable on most systems.

If Arch changes the format of their website significantly, this might stop working.

1

Make a new folder and cd into it.

 $ mkcd(){ NAME=$1; mkdir -p "$NAME"; cd "$NAME"; }

— by PrasannaNatarajan on Aug. 3, 2017, 6:49 a.m.

Explanation

Paste this function in the ~/.bashrc file.

Usage:

mkcd name1

This command will make a new folder called name1 and cd into it.

I find myself constantly using mkdir and going into the folder as the next step. It made sense for me to combine these steps into a single command.
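
A slightly more compact variant (a sketch; it drops the intermediate variable and only changes directory if mkdir succeeded):

mkcd(){ mkdir -p "$1" && cd "$1"; }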

1

Listen to the radio (radio2 in example)

 $ mpv http://a.files.bbci.co.uk/media/live/manifesto/audio/simulcast/hls/uk/sbr_med/llnw/bbc_radio_two.m3u8

— by Jab2870 on July 19, 2017, 2:44 p.m.

Explanation

mpv is a command-line media player. You could also use vlc or any other media player that supports streams.

To find a stream for your favourite UK radio station, look here: UK Audio Streams. If you are outside of the UK, Google is your friend.

Limitations

Requires an audio player that supports streams.

1

Go up to a particular folder

 $ alias ph='cd ${PWD%/public_html*}/public_html'

— by Jab2870 on July 18, 2017, 6:07 p.m.

Explanation

I work on a lot of websites and often need to go up to the public_html folder.

This command creates an alias so that however many folders deep I am, I will be taken up to the correct folder.

alias ph='....': This creates a shortcut so that when command ph is typed, the part between the quotes is executed

cd ...: This changes directory to the directory specified

PWD: This is a global bash variable that contains the current directory

${...%/public_html*}: This removes /public_html and anything after it from the specified string

Finally, /public_html at the end is appended onto the string.

So, to sum up, when ph is run, we ask bash to change the directory to the current working directory with anything after public_html removed.

Examples

If I am in the directory ~/Sites/site1/public_html/test/blog/ I will be taken to ~/Sites/site1/public_html/

If I am in the directory ~/Sites/site2/public_html/test/sources/javascript/es6/ I will be taken to ~/Sites/site2/public_html/
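
The same idea can be generalized to any marker directory with a small function (a hypothetical helper, not part of the original alias):

upto() { cd "${PWD%/"$1"*}/$1"; }
# upto public_html    is equivalent to the ph alias
# upto src            jumps up to the nearest enclosing src/ directory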

2

Open another terminal at current location

 $ $TERMINAL & disown

— by Jab2870 on July 18, 2017, 3:04 p.m.

Explanation

Opens another terminal window at the current location.

Use Case

I often cd into a directory and decide it would be useful to open another terminal in the same folder, maybe for an editor or something. Previously, I would open the terminal and repeat the cd command.

I have aliased this command to open so I just type open and I get a new terminal already in my desired folder.

The & disown part of the command stops the new terminal from being dependent on the first, meaning that you can still use the first one, and if you close it, the second will remain open.

Limitations

It relies on you having the $TERMINAL environment variable set. If you don't have this set, you could easily change the command to something like the following:

gnome-terminal & disown or konsole & disown
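
Alternatively, you could set the variable once in your ~/.bashrc (assuming gnome-terminal is the terminal you want new windows to use):

export TERMINAL=gnome-terminal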

0

Preserve your fingers from cd ..; cd ..; cd..; cd..;

 $ up(){ DEEP=$1; for i in $(seq 1 ${DEEP:-"1"}); do cd ../; done; }

— by alireza6677 on June 28, 2017, 5:40 p.m.

Explanation

Include this function in your .bashrc

Now you are able to go back up your path simply with up N. So, for example:

Z:~$ cd /var/lib/apache2/fastcgi/dynamic/
Z:/var/lib/apache2/fastcgi/dynamic$ up 2
Z:/var/lib/apache2$ up 3
Z:/$
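
A variant that builds the relative path first and calls cd only once (a sketch; this way cd - still takes you back to where you started):

up() {
    local path=
    for _ in $(seq 1 "${1:-1}"); do path+=../; done
    cd "$path"
}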

0

Get the HTTP status code of a URL

 $ curl -Lw '%{http_code}' -s -o /dev/null -I SOME_URL

— by Janos on June 19, 2017, 11:15 p.m.

Explanation

  • -w '%{http_code}' is to print out the status code (the meat of this post)
  • -s is to make curl silent (suppress download progress stats output)
  • -o /dev/null is to discard the response output by redirecting it to /dev/null
  • -I is to fetch the headers only, no need for the page content
  • -L is to follow redirects
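
Note that some servers answer a HEAD request (-I) differently from a GET; dropping -I makes curl perform a real GET whose body is then discarded by -o /dev/null. A quick usage example (example.com normally answers 200):

curl -Lw '%{http_code}' -s -o /dev/null -I https://example.com/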

0

Corporate random bullshit generator (cbsg)

 $ curl -s http://cbsg.sourceforge.net/cgi-bin/live | grep -Eo '^<li>.*</li>' | sed s,\</\\?li\>,,g | shuf -n 1 | cowsay

— by Jab2870 on June 7, 2017, 4:11 p.m.

Explanation

Let's make a cow talk BS.

Limitations

I don't think cowsay is installed by default on a Mac, although it can be installed with brew install cowsay.

1

Generate a sequence of numbers

 $ perl -e 'print "$_\n" for (1..10);'

— by abhinickz6 on May 30, 2017, 2:47 p.m.

Explanation

Prints the numbers 1 to 10, each followed by a newline character, which could be replaced by any character.
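
Roughly equivalent one-liners using standard shell tools, if Perl feels heavyweight for this:

seq 10
printf '%s\n' {1..10}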

0

List the content of a GitHub repository without cloning it

 $ svn ls https://github.com/user/repo/trunk/some/path

— by Janos on May 21, 2017, 6:01 p.m.

Explanation

Git doesn't allow querying sub-directories of a repository. But GitHub repositories are also exposed as Subversion repositories, and Subversion allows arbitrary path queries using the ls command.

Notice the /trunk/ between the base URL of the repository and the path to query. This is due to the way GitHub exposes Subversion repositories, using the standard Subversion layout with trunk, branches and tags sub-directories.
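
If Subversion is not installed, a similar listing can be fetched from GitHub's REST API (a sketch; substitute user, repo and path as above, and jq is assumed for extracting the entry names):

curl -s https://api.github.com/repos/user/repo/contents/some/path | jq -r '.[].name'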