We collect practical, well-explained Bash one-liners, and promote best practices in Bash shell scripting. To get the latest Bash one-liners, follow @bashoneliners on Twitter. If you find any problems, report a bug on GitHub.



Puppet/Bash: compare JSON objects.

 $ unless => "client_remote=\"$(curl localhost:9200/_cluster/settings | python -c \"import json,sys;obj=json.load(sys.stdin);print(obj['persistent']['search']['remote'])\")\"; new_remote=\"$( echo $persistent_json | python -c \"import json,sys;obj=json.load(sys.stdin);print(obj['persistent']['search']['remote'])\")\"; [ \"$client_remote\" = \"$new_remote\" ]",

— by cjedwa on July 27, 2018, 8:37 p.m.


One JSON object is provided by a Puppet dictionary, the other is grabbed from the Elasticsearch REST API. The command only runs if the two don't match. Had issues getting jq to sort properly, so used Python.
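The same idea can be sketched outside Puppet. This is a standalone demo with made-up JSON values, assuming python3 is available; json.dumps with sort_keys normalizes key order so the string comparison is order-insensitive:

```shell
# Two JSON objects that are equal but serialized in different key order:
a='{"x": 1, "y": 2}'
b='{"y": 2, "x": 1}'
# Normalize: parse, then re-serialize with sorted keys.
norm() { python3 -c 'import json,sys; print(json.dumps(json.load(sys.stdin), sort_keys=True))'; }
[ "$(echo "$a" | norm)" = "$(echo "$b" | norm)" ] && echo "objects match"
# → objects match
```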


Print wifi access points sorted by signal

 $ iw dev IFACE scan | egrep "SSID|signal" | awk -F ":" '{print $2}' | sed 'N;s/\n/:/' | sort

— by kazatca on June 16, 2018, 5:37 a.m.


  • iw dev IFACE scan gets info about the scanned APs
  • egrep "SSID|signal" keeps only the name and signal lines
  • awk -F ":" '{print $2}' cuts off the field labels
  • sed 'N;s/\n/:/' joins each pair of lines into one
  • sort sorts by signal, ascending

IFACE - wifi interface (like wlan0)
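The label-stripping, pairing and sorting stages can be tried without a wifi card, on fabricated scan output (the SSIDs and signal levels below are made up):

```shell
# Fake iw-style output: a signal line followed by an SSID line per AP.
printf 'signal: -62.00 dBm\nSSID: HomeNet\nsignal: -48.00 dBm\nSSID: CafeWifi\n' \
  | awk -F ":" '{print $2}' \
  | sed 'N;s/\n/:/' \
  | sort
```

Note the sort is lexicographic on the dBm string, so "-48.00" sorts before "-62.00" (strongest first with negative dBm values).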


Kill a process running on port 8080

 $ lsof -i :8080 | awk '{l=$2} END {print l}' | xargs kill

— by jamestomasino on June 15, 2018, 4:18 a.m.


As before, we're using lsof to find the PID of the process running on port 8080. We use awk to store the second column of each line into a variable l, overwriting it with each line. In the END clause, we're left with the second column of only the last line. xargs passes that as a parameter to the kill command.

The only notable difference from the command listed above is the use of awk to also perform the tail -n 1 step. This awk pattern matches the intended behavior of the script that was using tail. To kill all processes on that port, you could use the NR>1 clause instead of the variable loop.
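The "overwrite a variable on every line, print it in END" trick can be seen on fake lsof-style output (header plus two processes, PIDs invented):

```shell
# $2 of the last line survives in l; the header's "PID" is overwritten.
printf 'COMMAND PID\nchrome 2619\nchrome 2620\n' | awk '{l=$2} END {print l}'
# → 2620
```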


Delete all untagged Docker images

 $ docker images -q -f dangling=true | xargs --no-run-if-empty --delim='\n' docker rmi

— by penguincoder on June 15, 2018, 1:12 a.m.


It does not return a failing exit code if there are no images to remove. It should always succeed unless there was an actual problem removing a Docker image.


This only works with the GNU version of xargs (thanks to --no-run-if-empty); BSD does not have an equivalent that I know of.
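What --no-run-if-empty buys you can be demonstrated with a harmless echo instead of docker rmi: with empty input the command is simply not run, and xargs exits 0 (GNU xargs assumed).

```shell
# Empty stdin: echo is never executed, nothing is printed, exit is 0.
printf '' | xargs --no-run-if-empty echo "removing:"
echo "exit status: $?"
# → exit status: 0
```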


Take values from a list (file) and search them on another file

 $ for ITEM in `cat values_to_search.txt`; do  (egrep $ITEM full_values_list.txt && echo $ITEM found) | grep "found" >> exit_FOUND.txt; done

— by ManuViorel on May 16, 2018, 3:20 p.m.


This line searches for values taken from a file (values_to_search.txt) by scanning a full list of values (full_values_list.txt). If a value is found, it is recorded in a new file, exit_FOUND.txt.

Alternatively, we can search for values from list 1 which do NOT exist in list 2, as below:

for ITEM in $(cat values_to_search.txt); do (egrep $ITEM full_values_list.txt || echo $ITEM not found) | grep "not found" >> exit_not_found.txt; done
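If you only need the matching lines themselves, the whole loop can be replaced by a single grep: -f reads one pattern per line from a file, and -F treats them as fixed strings. A small demo with made-up data:

```shell
cd "$(mktemp -d)"
# Demo stand-ins for the two input files:
printf 'apple\nbanana\n' > values_to_search.txt
printf 'banana split\ncherry pie\napple tart\n' > full_values_list.txt
# Print every line of the full list that contains any searched value:
grep -F -f values_to_search.txt full_values_list.txt
# → banana split
# → apple tart
```

Add -v to get the lines that match none of the values.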


No limitations


Delete all untagged Docker images

 $ docker rmi $(docker images -f "dangling=true" -q)

— by stefanobaghino on April 27, 2018, 2:50 p.m.


docker images outputs all images currently available. By specifying -f "dangling=true" we restrict the list to "dangling" images (i.e. untagged ones). The -q option enables quiet mode, which limits the output to the image hashes, which are then fed directly into docker rmi, removing the images with the corresponding hashes.


Have script run itself in a virtual terminal

 $ tty >/dev/null || { urxvt -e /bin/sh -c "tty >/tmp/proc$$; while test x; do sleep 1; done" & while test ! -f /tmp/proc$$; do sleep .1; done; FN=$(cat /tmp/proc$$); rm /tmp/proc$$; exec >$FN 2>$FN <$FN; }

— by openiduser111 on March 9, 2018, 2:56 a.m.


  • We begin by testing if the script is not in a terminal with tty.
  • If it is not, we start a terminal that runs tty and saves its output to a file name. $$ was set by the original script and is its PID. The terminal is opened in the background using &, and the original script waits for the file to appear, then reads and removes it.
  • Finally, the main command is a special syntax of the bash builtin command exec that contains nothing but redirections (of stdout, stderr, and stdin) so they will apply to every command in the rest of the script file.


Big CSV > batches > JSON array > CURL POST data with sleep

 $ cat post-list.csv | split -l 30 - --filter='jq -R . | jq --slurp -c .' | xargs -d "\n" -I % sh -c 'curl -H "Content-Type: application/json" -X POST -d '"'"'{"type":1,"entries":%}'"'"' && sleep 30'

— by pratham2003 on March 7, 2018, 12:12 p.m.


post-list.csv contains a list of URLs in my example.

  • split -l 30 Split by 30 lines

  • - Use stdin as input for split

  • --filter Couldn't find a way to easily pipe to stdout from split, hence --filter

  • jq -R . From the jq manual - Don’t parse the input as JSON. Instead, each line of text is passed to the filter as a string

  • jq --slurp -c . From the jq manual - Instead of running the filter for each JSON object in the input, read the entire input stream into a large array and run the filter just once. -c makes it easier to pipe and use it in the xargs that follows.

  • xargs -d "\n" -I % sh -c Execute a command for each array. Use "\n" as delimiter. Use % as a placeholder in the command that follows.

  • Single quotes inside sh -c ' ... ' are escaped as '"'"' single-double-single-double-single. You can do whatever you need to inside sh -c ' ... && sleep 123'
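The --filter mechanism in the steps above can be seen on its own with plain coreutils: each chunk that split produces is piped to the filter command's stdin. Here wc -l stands in for the jq pipeline, on five made-up input lines:

```shell
# Split stdin into 2-line chunks; each chunk is piped to wc -l,
# which just reports how many lines that chunk contained.
printf '1\n2\n3\n4\n5\n' | split -l 2 - --filter='wc -l'
# → 2
# → 2
# → 1
```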


You need jq installed, for example in Debian / Ubuntu:

apt-get install jq

See also https://stedolan.github.io/jq/manual/

I suspect the input file (post-list.csv) must not contain double or single quotes, but I haven't tested it.


List all packages with at least a class defined in a JAR file

 $ jar tf "$1" | grep '/.*\.class$' | xargs dirname | sort -u | tr / .

— by stefanobaghino on Feb. 19, 2018, 12:13 p.m.


The jar command allows you to read or manipulate JAR (Java ARchive) files, which are ZIP files that usually contain classfiles (Java compiled bytecode files) and possibly manifests and configuration files. We specify that we want to list file contents (t) that we provide as an argument (f, otherwise the jar will be read from stdin).

From the output, we get only the paths that contain a classfile (grep), then the path to the package that contains it (xargs dirname), we get the unique, sorted paths and translate /s to .s (to display their names as they would be shown in Java syntax).
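The tail of the pipeline can be exercised without a JAR at hand, on made-up classfile paths:

```shell
# dirname strips the class filename, sort -u deduplicates the package
# directories, tr rewrites path separators into Java package dots.
printf 'com/example/app/Main.class\ncom/example/util/Str.class\n' \
  | xargs dirname | sort -u | tr / .
# → com.example.app
# → com.example.util
```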


Will only exhaustively list the packages with a defined class for languages that require packages to map to the directory structure (e.g.: Java does, Scala doesn't). If this convention is respected, the command will output an exhaustive list of packages nonetheless.


Output an arbitrary number of open TCP or UDP ports in an arbitrary range

 $ comm -23 <(seq "$FROM" "$TO") <(ss -tan | awk '{print $4}' | cut -d':' -f2 | grep "[0-9]\{1,5\}" | sort | uniq) | shuf | head -n "$HOWMANY"

— by stefanobaghino on Feb. 9, 2018, 3:51 p.m.


Originally published (by me) on unix.stackexchange.com.

comm is a utility that compares sorted lines in two files. It outputs three columns: lines that appear only in the first file, lines that appear only in the second one, and common lines. By specifying -23 we suppress the last two columns and keep only the first one. We can use this to obtain the difference of two sets, expressed as a sequence of text lines. I learned about comm here.

The first file is the range of ports that we can select from. seq produces a sorted sequence of numbers from $FROM to $TO. The result is piped to comm as the first file using process substitution.

The second file is the sorted list of ports, which we obtain by calling the ss command (with -t meaning TCP ports, -a meaning all - established and listening - and -n numeric - don't try to resolve, say, 22 to ssh). We then pick only the fourth column with awk, which contains the local address and port. We use cut to split address and port with the : delimiter and keep only the latter (-f2). ss also outputs a header, which we get rid of by grepping for non-empty sequences of no more than 5 digits. We then comply with comm's requirement by sorting and getting rid of duplicates with uniq.

Now we have a sorted list of open ports, that we can shuffle to then grab the first "$HOWMANY" ones with head -n.
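The comm -23 set difference can be seen in miniature; ordinary temp files stand in here for the process substitutions above:

```shell
t=$(mktemp -d)
# Candidate "ports" and the ones already in use (toy data):
printf '1\n2\n3\n' > "$t/candidates"
printf '2\n' > "$t/used"
# -23 suppresses lines unique to the second file and common lines,
# leaving only the lines that appear solely in the first file:
comm -23 "$t/candidates" "$t/used"
# → 1
# → 3
```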


Grab three random open ports in the private range (49152-65535)

comm -23 <(seq 49152 65535) <(ss -tan | awk '{print $4}' | cut -d':' -f2 | grep "[0-9]\{1,5\}" | sort | uniq) | shuf | head -n 3

could return for example



  • switch -t with -u in ss to get free UDP ports instead.
  • drop shuf if you're not interested in grabbing a random port


Source without circular reference

 $ [ ! "${LIB}" ] && ( readonly LIB; . "$(cd "$(dirname "$0")" && pwd)/<path_to>/LIB.sh" )

— by dhsrocha on Jan. 24, 2018, 4:30 p.m.


Source LIB only if the corresponding variable is not defined, in order to prevent a circular reference loop in case the same script has already been sourced earlier in the sourcing chain.
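Note that the parentheses run the guard in a subshell, which discards both the readonly marker and anything the sourced file defines, so the guard never takes effect in the calling shell. A sketch of the intended behavior without the subshell, using a made-up stand-in library at /tmp/lib_demo.sh:

```shell
# Stand-in library file (hypothetical):
printf 'GREETING="hello from lib"\n' > /tmp/lib_demo.sh
# Guard: only source if LIB is still unset; braces keep it in this shell.
[ ! "${LIB}" ] && { LIB=1; . /tmp/lib_demo.sh; }
# A second identical guard line would now be a no-op.
echo "$GREETING"
# → hello from lib
```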


Ternary conditional clause

 $ [ test_statement ] && ( then_statement ) || ( else_statement );

— by dhsrocha on Jan. 22, 2018, 5:27 p.m.


The test_statement runs in a child process (a.k.a. subshell) and yields a boolean exit status. If it succeeds, the then_statement is executed; otherwise, the else_statement is. Beware that this is not a true ternary: if the then_statement itself fails, the else_statement runs as well.
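The failure mode is easy to reproduce; `false` stands in for a then_statement that fails:

```shell
# The test succeeds, the then-branch fails, so the else-branch fires too:
[ 1 -eq 1 ] && false || echo "else ran even though the test succeeded"
# → else ran even though the test succeeded

# A real if/else does not have this problem:
if [ 1 -eq 1 ]; then false; else echo "this is not printed"; fi
```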


Get executed script's current working directory

 $ CWD=$(cd "$(dirname "$0")" && pwd)

— by dhsrocha on Jan. 22, 2018, 4:55 p.m.


Will return the executing script's directory, wherever Bash executes the script containing this line.


Random Git Commit

 $ git commit -m "$(w3m whatthecommit.com | head -n 1)"

— by Jab2870 on Jan. 5, 2018, 4:55 p.m.


This will commit a message pulled from What the Commit.

-m allows you to provide the commit message without entering your editor

w3m is a terminal based web browser. We basically use it to strip out all of the html tags

head -n 1 will grab only the first line


This requires you to have w3m installed


Blackhole ru zone

 $ echo "address=/ru/" | sudo tee /etc/NetworkManager/dnsmasq.d/dnsmasq-ru-blackhole.conf && sudo systemctl restart network-manager

— by olshek_ on Nov. 14, 2017, 2:12 p.m.


It creates a dnsmasq-ru-blackhole.conf file with one line that blackholes all domains in the ru zone: with no IP address given, dnsmasq answers NXDOMAIN for every domain under it.

You might use "address=/home.lab/" to point all possible subdomains of home.lab to your localhost or some other IP in a cloud.


Remove new lines from files and folders

 $ rename 's/[\r\n]//g' *

— by moverperfect on Sept. 30, 2017, 10:07 p.m.


This will search all files and folders in the current directory for any with a newline character in the name and remove the newline from the file or folder name.
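What the s/[\r\n]//g regex strips can be shown with bash's parameter expansion on a sample name (no files touched; the name is made up):

```shell
# A name containing a literal carriage return and newline:
f=$'annual\r\nreport.txt'
# Delete every \r and \n, the same character class the rename regex uses:
echo "${f//[$'\r\n']/}"
# → annualreport.txt
```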


Retrieve dropped connections from firewalld journaling

 $ sudo journalctl -b | grep -o "PROTO=.*" | sed -r 's/(PROTO|SPT|DPT|LEN)=//g' | awk '{print $1, $3}' | sort | uniq -c

— by FoxBuru on Sept. 14, 2017, 5:10 p.m.


We take the output of journalctl since the last boot (-b flag) and keep the text from PROTO= to the end of each line. Then, we remove the identification tags (PROTO=/SPT=/DPT=/LEN=) and print just the protocol and destination port (columns 1 and 3). We sort the output so that uniq -c can aggregate and count the duplicates.
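The tag-stripping and aggregation stages can be tried on fabricated log fragments (the ports and lengths below are invented):

```shell
# Three fake firewalld log tails: two TCP/22 drops, one UDP/53 drop.
printf 'PROTO=TCP SPT=1111 DPT=22 LEN=60\nPROTO=TCP SPT=2222 DPT=22 LEN=60\nPROTO=UDP SPT=3333 DPT=53 LEN=70\n' \
  | sed -r 's/(PROTO|SPT|DPT|LEN)=//g' \
  | awk '{print $1, $3}' \
  | sort | uniq -c
# counts come out left-padded by uniq -c: "2 TCP 22" and "1 UDP 53"
```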


  • Only works on Linux
  • You use firewalld and you have logging set on ALL (see firewalld.conf for details)
  • You use journald for logging
  • Your user has sudo privileges


Kill a process running on port 8080

 $ lsof -i :8080 | awk 'NR > 1 {print $2}' | xargs --no-run-if-empty kill

— by Janos on Sept. 1, 2017, 8:31 p.m.


lsof lists open files (ls-o-f, get it?). lsof -i :8080 lists open files on address ending in :8080. The output looks like this

chrome  2619 qymspace  149u  IPv4  71595      0t0  TCP localhost:53878->localhost:http-alt (CLOSE_WAIT)

We use awk 'NR > 1 {print $2}' to print the second column for lines except the first. The result is a list of PIDs, which we pipe to xargs kill to kill.
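The NR > 1 clause can be seen on fabricated lsof-style output (header plus two processes, PIDs invented):

```shell
# Skip the header line (NR is the current line number), print the
# second column of every remaining line:
printf 'COMMAND PID\nchrome 2619\nchrome 2620\n' | awk 'NR > 1 {print $2}'
# → 2619
# → 2620
```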


The --no-run-if-empty option of xargs is available in GNU implementations and typically not available in BSD implementations. Without this option, the one-liner will raise an error if there are no matches (no PIDs to kill).


Kill a process running on port 8080

 $ lsof -i :8080 | awk '{print $2}' | tail -n 1 | xargs kill

— by kimbethwel on Aug. 18, 2017, 8:22 a.m.


lsof lists open files (ls-o-f, get it?). lsof -i :8080 lists open files on address ending in :8080. The output looks like this

chrome  2619 qymspace  149u  IPv4  71595      0t0  TCP localhost:53878->localhost:http-alt (CLOSE_WAIT)

We pipe this output through awk to print column 2 using the command awk '{print $2}' to produce the output:


To remove the word PID from this output we use tail -n 1 to grab only the last line.

We can now pass this process id to the kill command to kill it.


Get the latest Arch Linux news

 $ w3m https://www.archlinux.org/ | sed -n "/Latest News/,/Older News/p" | head -n -1

— by Jab2870 on Aug. 15, 2017, 10:35 a.m.


w3m is a terminal web browser. We use it to go to https://www.archlinux.org/

We then use sed to capture the text between Latest News and Older News.

We then get rid of the last line which is Older News.
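The sed address range plus head -n -1 combination can be tried on a stand-in page (GNU head is assumed for the negative line count):

```shell
# Everything between the two markers, inclusive, then drop the
# closing marker line:
printf 'intro\nLatest News\nitem 1\nitem 2\nOlder News\nfooter\n' \
  | sed -n "/Latest News/,/Older News/p" | head -n -1
# → Latest News
# → item 1
# → item 2
```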


For this, w3m would need to be installed. It should be installable on most systems.

If Arch changes the format of their website significantly, this might stop working.


Make a new folder and cd into it.

 $ mkcd(){ NAME=$1; mkdir -p "$NAME"; cd "$NAME"; }

— by PrasannaNatarajan on Aug. 3, 2017, 6:49 a.m.


Paste this function in the ~/.bashrc file.


mkcd name1

This command will make a new folder called name1 and cd into name1.

I find myself constantly using mkdir and going into the folder as the next step. It made sense for me to combine these steps into a single command.
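A slightly hardened variant (my tweak, not the author's): only cd if mkdir succeeded, and use -- so folder names starting with a dash work too:

```shell
# Create the folder (and parents) and enter it; && skips the cd,
# and makes the function return failure, if mkdir failed.
mkcd() { mkdir -p -- "$1" && cd -- "$1"; }
```

Usage is identical: mkcd name1.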


Listen to the radio (radio2 in example)

 $ mpv http://a.files.bbci.co.uk/media/live/manifesto/audio/simulcast/hls/uk/sbr_med/llnw/bbc_radio_two.m3u8

— by Jab2870 on July 19, 2017, 2:44 p.m.


mpv is a command-line media player. You could also use VLC or any media player that supports streams.

To find a stream for your favourite UK radio station, look here: UK Audio Streams. If you are outside of the UK, Google is your friend


Requires an audio player that supports streams.


Go up to a particular folder

 $ alias ph='cd ${PWD%/public_html*}/public_html'

— by Jab2870 on July 18, 2017, 6:07 p.m.


I work on a lot of websites and often need to go up to the public_html folder.

This command creates an alias so that however many folders deep I am, I will be taken up to the correct folder.

alias ph='....': This creates a shortcut so that when command ph is typed, the part between the quotes is executed

cd ...: This changes directory to the directory specified

PWD: This is a global bash variable that contains the current directory

${...%/public_html*}: This removes /public_html and anything after it from the specified string

Finally, /public_html at the end is appended onto the string.

So, to sum up, when ph is run, we ask bash to change the directory to the current working directory with anything after public_html removed.


If I am in the directory ~/Sites/site1/public_html/test/blog/ I will be taken to ~/Sites/site1/public_html/

If I am in the directory ~/Sites/site2/public_html/test/sources/javascript/es6/ I will be taken to ~/Sites/site2/public_html/
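The expansion itself can be tried directly, with a variable standing in for $PWD:

```shell
# Strip "/public_html" and everything after it, then append it back:
p=/home/me/Sites/site1/public_html/test/blog
echo "${p%/public_html*}/public_html"
# → /home/me/Sites/site1/public_html
```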


Open another terminal at current location

 $ $TERMINAL & disown

— by Jab2870 on July 18, 2017, 3:04 p.m.


Opens another terminal window at the current location.

Use Case

I often cd into a directory and decide it would be useful to open another terminal in the same folder, maybe for an editor or something. Previously, I would open the terminal and repeat the cd command.

I have aliased this command to open so I just type open and I get a new terminal already in my desired folder.

The & disown part of the command stops the new terminal from being dependent on the first, meaning that you can still use the first, and if you close the first, the second will remain open.


It relies on you having the $TERMINAL environment variable set. If you don't have this set, you could easily change it to something like the following:

gnome-terminal & disown or konsole & disown


Preserve your fingers from cd ..; cd ..; cd..; cd..;

 $ up(){ DEEP=$1; for i in $(seq 1 ${DEEP:-"1"}); do cd ../; done; }

— by alireza6677 on June 28, 2017, 5:40 p.m.


Include this function in your .bashrc

Now you are able to go back in your path simply with up N. So, for example:

Z:~$ cd /var/lib/apache2/fastcgi/dynamic/
Z:/var/lib/apache2/fastcgi/dynamic$ up 2
Z:/var/lib/apache2$ up 3