We collect practical, well-explained Bash one-liners, and promote best practices in Bash shell scripting. To get the latest Bash one-liners, follow @bashoneliners on Twitter. If you find any problems, report a bug on GitHub.

While loop to pretty print system load (1, 5 & 15 minutes)

 $ while true; do awk '{printf "1 minute load: %.2f\n5 minute load: %.2f\n15 minute load: %.2f\n", $1, $2, $3}' /proc/loadavg; sleep 3; date; done

— by peek2much3 on Aug. 30, 2018, 8:54 a.m.

Explanation

top is great, but this output is easier to read and easy to redirect to a text file for historical review. Kill it with Ctrl+C.
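
To keep a history for later review, the whole loop can be redirected to a log file (loadavg.log is an arbitrary name here):

while true; do date; awk '{printf "1 minute load: %.2f\n5 minute load: %.2f\n15 minute load: %.2f\n", $1, $2, $3}' /proc/loadavg; sleep 3; done >> loadavg.log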

Dump all AWS IAM users/roles to a Terraform file for editing / reusing in another environment

 $ echo iamg iamgm iamgp iamip iamp iampa iamr iamrp iamu iamup | AWS_PROFILE=myprofile xargs -n1  terraforming

— by johntellsall on Aug. 28, 2018, 12:38 a.m.

Explanation

Amazon Web Services (AWS) uses a collection of "IAM" resources to create users and related objects in the system. This one-liner scrapes all the relevant info and turns it into Terraform configuration. This lets us audit our users and groups, and re-use them in another environment.
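
To actually end up with files rather than terminal output, a loop along these lines (a sketch; the .tf file names are arbitrary) writes each resource type to its own file:

for r in iamg iamgm iamgp iamip iamp iampa iamr iamrp iamu iamup; do AWS_PROFILE=myprofile terraforming "$r" > "$r.tf"; done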

Organise images by portrait and landscape

 $ mkdir -p portraits landscapes; for f in ./*.jpg; do WIDTH=$(identify -format "%w" "$f"); HEIGHT=$(identify -format "%h" "$f"); if [[ "$HEIGHT" -gt "$WIDTH" ]]; then mv "$f" portraits/ ; else mv "$f" landscapes/ ; fi; done

— by Jab2870 on Aug. 23, 2018, 2:09 p.m.

Explanation

  1. First makes directories for portraits and landscapes
  2. Loops through all files in the current directory with the extension .jpg; feel free to change this to .png or .jpeg if necessary
    1. Gets the width and height of the current image using the identify command
    2. If height > width, moves it to the portraits folder, otherwise moves it to the landscapes folder

Limitations

This relies on the identify command, which comes with ImageMagick and is available on most systems.

This does not check for square images, although it could easily be extended to check whether HEIGHT and WIDTH are equal; see the sketch below. Square images are currently put with the landscape images.
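
For example, a third directory for square images could be added along these lines (a sketch following the same pattern as the one-liner above):

mkdir -p portraits landscapes squares
for f in ./*.jpg; do
    WIDTH=$(identify -format "%w" "$f")
    HEIGHT=$(identify -format "%h" "$f")
    if [ "$HEIGHT" -gt "$WIDTH" ]; then mv "$f" portraits/
    elif [ "$HEIGHT" -lt "$WIDTH" ]; then mv "$f" landscapes/
    else mv "$f" squares/
    fi
done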

Create txt files with the first 100000 rows of each .full file

 $ for FILE in *.full ; do split -l 100000 "$FILE"; mv -f xaa "${FILE%.full}.txt"; rm -f x*; done

— by Kifli88 on Aug. 22, 2018, 2:02 p.m.

Explanation

The for loop goes through every file ending in ".full" and splits it into files of 100000 rows each. The first chunk is renamed to the input file name with the ".full" extension replaced by ".txt". As a last step, the remaining split files (x*) are deleted. A simpler equivalent using head is sketched below.
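
If only the first 100000 rows of each file are needed, this equivalent avoids the temporary split files entirely:

for FILE in *.full; do head -n 100000 "$FILE" > "${FILE%.full}.txt"; done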

List processes ordered by their number of open files

 $ ps -ef | awk '{ print $2 }' | tail -n +2 | while read pid; do printf '%s\t%s\n' "$pid" "$(lsof -p "$pid" | wc -l)"; done | sort -r -n -k 2 | while read pid count; do printf '%s\t%s\t%s\n' "$pid" "$count" "$(ps -o command= -p "$pid")"; done

— by cddr on Aug. 22, 2018, 1:21 p.m.

Explanation

Combines ps, lsof, and sort: ps -ef lists all processes, awk extracts the PID column, and tail -n +2 drops the header line. For each PID, lsof -p piped to wc -l counts its open files. sort -r -n -k 2 orders the PIDs by that count in descending order, and the final loop appends the command line of each process.

Remove all containers from a specific network (Docker)

 $ docker ps -a -f network=$NETWORK --format='{{.ID}}' | xargs docker rm -f

— by gatero on Aug. 17, 2018, 4:38 p.m.

Explanation

docker ps -a -f network=$NETWORK --format='{{.ID}}' returns the IDs of all containers attached to the network and passes them to xargs docker rm -f, which stops and deletes each container.
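
Usage example (my_network is a placeholder for the actual network name):

NETWORK=my_network
docker ps -a -f network=$NETWORK --format='{{.ID}}' | xargs docker rm -f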

Bring up all Docker Compose services in detached mode in all immediate subdirectories

 $ for dir in */; do (cd "$dir" && docker-compose up -d); done

— by gatero on Aug. 17, 2018, 4:31 p.m.

Explanation

Suppose you are in a directory that contains many subdirectories, each with its own docker-compose file, and instead of bringing them up one by one you want to start them all at once: this command does exactly that.
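
A matching loop to take everything down again (same assumptions about the directory layout):

for dir in */; do (cd "$dir" && docker-compose down); done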

Find and replace string inside specific files

 $ grep -ril "$SEARCH_PATTERN" src | xargs sed -i "s/$FIND_PATTERN/$REPLACE_PATTERN/g"

— by gatero on Aug. 17, 2018, 4:18 p.m.

Explanation

This command searches for files under src that contain a specific string (case-insensitively), then finds a pattern in those files and replaces it. Note the double quotes, so that the shell expands the variables, and xargs, which passes the matched file names to sed -i.
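
If the matched file names may contain spaces, a NUL-separated variant (GNU grep and xargs) is safer:

grep -rilZ "$SEARCH_PATTERN" src | xargs -0 sed -i "s/$FIND_PATTERN/$REPLACE_PATTERN/g"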

Puppet/Bash: compare JSON objects

 $ unless => "client_remote=\"$(curl localhost:9200/_cluster/settings | python -c \"import json,sys;obj=json.load(sys.stdin);print(obj['persistent']['search']['remote'])\")\"; new_remote=\"$( echo $persistent_json | python -c \"import json,sys;obj=json.load(sys.stdin);print(obj['persistent']['search']['remote'])\")\"; [ \"$client_remote\" = \"$new_remote\" ]",

— by cjedwa on July 27, 2018, 8:37 p.m.

Explanation

One JSON object is provided by a Puppet dictionary, the other is grabbed from the Elasticsearch REST API. The guarded command only runs if the two don't match. I had issues getting jq to sort consistently, so Python is used instead.
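
Outside of Puppet, the same comparison can be sketched as plain Bash, assuming (as above) that $persistent_json holds the desired settings and that the cluster settings contain persistent.search.remote:

client_remote=$(curl -s localhost:9200/_cluster/settings | python -c "import json,sys; obj=json.load(sys.stdin); print(obj['persistent']['search']['remote'])")
new_remote=$(echo "$persistent_json" | python -c "import json,sys; obj=json.load(sys.stdin); print(obj['persistent']['search']['remote'])")
[ "$client_remote" = "$new_remote" ] || echo "remote settings differ"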

Print wifi access points sorted by signal

 $ iw dev IFACE scan | egrep "SSID|signal" | awk -F ":" '{print $2}' | sed 'N;s/\n/:/' | sort

— by kazatca on June 16, 2018, 5:37 a.m.

Explanation

  • iw dev IFACE scan: get info about the visible access points
  • egrep "SSID|signal": keep only the name and signal lines
  • awk -F ":" '{print $2}': strip the field labels
  • sed 'N;s/\n/:/': join each pair of lines into a single line
  • sort: sort by signal, ascending

IFACE is the wifi interface (like wlan0).

Delete all untagged Docker images

 $ docker images -q -f dangling=true | xargs --no-run-if-empty --delim='\n' docker rmi

— by penguincoder on June 15, 2018, 1:12 a.m.

Explanation

It does not return a failing exit code if there are no images removed. It should always succeed unless there was an actual problem removing a Docker image.

Limitations

This only works with the GNU version of xargs (because of --no-run-if-empty); BSD xargs does not have an equivalent that I know of.

Source without circular reference

 $ [ -z "${LIB}" ] && { readonly LIB=1; . "$( cd "$( dirname "$0" )" && pwd )/<path_to>/LIB.sh"; }

— by dhsrocha on Jan. 24, 2018, 4:30 p.m.

Explanation

Source LIB.sh only if the corresponding variable is not yet defined, to prevent a circular reference loop in case the same script has already been sourced earlier in the sourcing chain.
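
A related pattern is an include guard at the top of the library file itself; a minimal sketch (LIB_SH is an arbitrary guard variable name):

# first lines of a hypothetical LIB.sh
if [ -n "${LIB_SH}" ]; then return; fi
readonly LIB_SH=1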

Remove newlines from file and folder names

 $ rename 's/[\r\n]//g' *

— by moverperfect on Sept. 30, 2017, 10:07 p.m.

Explanation

This will search all files and folders in the current directory for any with a newline character in their name and remove the newline from the file or folder name. Note that this uses the Perl-based rename command (sometimes installed as prename); the util-linux rename has a different syntax.

Kill a process running on port 8080

 $ lsof -i :8080 | awk 'NR > 1 {print $2}' | xargs --no-run-if-empty kill

— by Janos on Sept. 1, 2017, 8:31 p.m.

Explanation

lsof lists open files (ls-o-f, get it?). lsof -i :8080 lists open files on addresses matching :8080. The output looks like this:

COMMAND  PID     USER   FD   TYPE DEVICE SIZE/OFF NODE NAME
chrome  2619 qymspace  149u  IPv4  71595      0t0  TCP localhost:53878->localhost:http-alt (CLOSE_WAIT)

We use awk 'NR > 1 {print $2}' to print the second column for lines except the first. The result is a list of PIDs, which we pipe to xargs kill to kill.

Limitations

The --no-run-if-empty option of xargs is available in GNU implementations, and typically not available in BSD implementations. Without this option, the one-liner will raise an error if there are no matches (no PIDs to kill).
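
If you need this often, the one-liner can be wrapped in a small function (killport is an arbitrary name; the GNU xargs caveat above still applies):

killport() { lsof -i ":$1" | awk 'NR > 1 {print $2}' | xargs --no-run-if-empty kill; }
killport 8080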

Get the HTTP status code of a URL

 $ curl -Lw '%{http_code}' -s -o /dev/null -I SOME_URL

— by Janos on June 19, 2017, 11:15 p.m.

Explanation

  • -w '%{http_code}' is to print out the status code (the meat of this post)
  • -s is to make curl silent (suppress download progress stats output)
  • -o /dev/null is to redirect all output to /dev/null
  • -I is to fetch the headers only, no need for the page content
  • -L is to follow redirects

Corporate random bullshit generator (cbsg)

 $ curl -s http://cbsg.sourceforge.net/cgi-bin/live | grep -Eo '^<li>.*</li>' | sed s,\</\\?li\>,,g | shuf -n 1 | cowsay

— by Jab2870 on June 7, 2017, 4:11 p.m.

Explanation

Let's make a cow talk BS.

Limitations

cowsay is not installed by default on a Mac, although it can be installed with brew install cowsay.

Create an array of CPU frequencies in GHz

 $ cpus=($({ echo scale=2; awk '/cpu MHz/ {print $4 " / 1000"}' /proc/cpuinfo; } | bc))

— by openiduser146 on Dec. 28, 2015, 9:02 p.m.

Explanation

  • The awk command takes the input from /proc/cpuinfo, matches lines containing "cpu MHz", and appends the " / 1000" to the CPU frequency, so it's ready for piping to bc
  • The echo scale=2 is for bc, to get floating point numbers with a precision of maximum two decimal points
  • Group the echo scale=2 and the awk for piping to bc, by enclosing the commands within { ...; }
  • Run the commands in a $(...) subshell
  • Wrap the subshell within (...) to store the output lines as an array

From the cpus array, you can extract the individual CPU values with:

cpu0=${cpus[0]}
cpu1=${cpus[1]}
cpu2=${cpus[2]}
cpu3=${cpus[3]}

If you don't need the values in GHz, but MHz is enough, then the command is a lot simpler:

cpus=($(awk '/cpu MHz/ {print $4}' /proc/cpuinfo))
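
Either way, you can print all the collected values by iterating over the array indices, for example:

for i in "${!cpus[@]}"; do echo "cpu$i: ${cpus[$i]}"; done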

Limitations

Arrays are Bash-specific and might not work in older /bin/sh.

/proc/cpuinfo exists only in Linux.

Test git archive before actually creating an archive // fake dry run

 $ git archive master some/project/subdir | tar t

— by openiduser146 on Dec. 22, 2015, 2:29 p.m.

Explanation

git archive doesn't have a --dry-run flag, and it would be nice to see what files would be in the archive before actually creating it.

  • git archive master some/project/subdir
  • Create an archive from the master branch, with only a specified sub-directory of the project in it (instead of the entire repo)
  • Note: without specifying a file, the archive is dumped to standard output
  • tar t : the t flag of tar is to list the content of an archive. In this example the content comes from standard input (piped from the previous command)

In other words, this command creates an archive without ever saving it in a file, and uses tar t to list the contents. If the output looks good, then you can create the archive with:

git archive master -o file.tar some/project/subdir
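
If you also want sizes and permissions in the dry-run listing, tar's verbose flag works the same way:

git archive master some/project/subdir | tar tv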

Shuffle lines

 $ seq 5 | shuf

— by openiduser184 on March 12, 2015, 7:58 a.m.

Explanation

shuf is part of GNU coreutils and should be available on most Linux systems.

Download a file from a webserver with telnet

 $ (echo 'GET /'; echo; sleep 1; ) | telnet www.google.com 80

— by Janos on Dec. 22, 2014, 11:31 p.m.

Explanation

If you are ever in a minimal headless *nix which doesn't have any command line utilities for downloading files (no curl, wget, lynx) but you have telnet, then this can be a workaround.

Another option is netcat:

/usr/bin/printf 'GET / \n' | nc www.google.com 80

Credit goes to this post: http://unix.stackexchange.com/a/83987/17433
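
Some servers insist on a Host header; a slightly more complete hand-written request (still just a sketch) is:

printf 'GET / HTTP/1.1\r\nHost: www.google.com\r\nConnection: close\r\n\r\n' | nc www.google.com 80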

Print the window title of current mpv session to display what is playing

 $ wmctrl -pl | grep $(pidof mpv) | cut -d- -f2-

— by openiduser171 on Dec. 15, 2014, 3:37 a.m.

Explanation

wmctrl -l lists all open windows (works with several window managers); -p includes the unique process ID of each window in the list. grep $(pidof mpv) matches the line that contains the process ID of mpv. cut -d- -f2- prints everything after the first '-' delimiter (from the second field onwards), which leaves just the title.

Limitations

Only works with one instance of mpv running. Its intended use is to share what film or series you are watching, and you don't usually watch more than one thing at a time.

Shuffle lines

 $ ... | perl -MList::Util -e 'print List::Util::shuffle <>'

— by Janos on Oct. 25, 2014, 10:40 p.m.

Explanation

Sorting lines is easy: everybody knows the sort command.

But what if you want to do the opposite and shuffle the lines instead? The above Perl one-liner does just that:

  • -MList::Util load the List::Util module (as if doing use List::Util inside a Perl script)
  • -e '...' execute Perl command
  • print List::Util::shuffle <> call List::Util::shuffle for the lines coming from standard input, read by <>

Another way would be sort -R if your version supports it (GNU, as opposed to BSD). On BSD systems you can install coreutils and try gsort -R instead. (For example on OS X, using MacPorts: sudo port install coreutils.)

Open Windows internet shortcut (*.url) files in firefox

 $ firefox $(grep -i '^url=' file.url | cut -b 5-)

— by tsjswimmer on Sept. 11, 2014, 10:03 a.m.

Explanation

Extract urls from a *.url file and open in Firefox. (Note that *.url files in Windows are basically just text files, so they can be parsed with a few commands.)

  • grep extracts lines starting with url=
  • The -i flag is to ignore case
  • cut extracts the range of characters from the 5th until the end of lines
  • The output of $(...) will be used as command line parameters for Firefox

Limitations

This only works with URLs that don't contain characters special to the shell, such as spaces; see the quoted variant below.
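
Quoting the command substitution works around the space problem, at the cost of passing everything as a single argument:

firefox "$(grep -i '^url=' file.url | cut -b 5-)"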

Check if a file exists and has a size greater than X

 $ [[ $(find /path/to/file -type f -size +51200c 2>/dev/null) ]] && echo true || echo false

— by Janos on Jan. 9, 2014, 12:34 p.m.

Explanation

  • The find takes care of two things at once: it checks whether the file exists and whether its size is greater than 51200 bytes (50 KiB).
  • We redirect stderr to /dev/null to hide the error message if the file doesn't exist.
  • The output of find will be non-blank if the file matched both conditions, otherwise it will be blank
  • The [[ ... ]] evaluates to true or false if the output of find is non-blank or blank, respectively

You can use this in if conditions like:

if [[ $(find /path/to/file -type f -size +51200c 2>/dev/null) ]]; then
    somecmd
fi

Replace sequences of the same characters with a single character

 $ echo heeeeeeelllo | sed 's/\(.\)\1\+/\1/g'

— by Janos on Dec. 11, 2013, 7:58 p.m.

Explanation

That is, this will output "helo".

The interesting thing here is the regular expression in the s/// command of sed:

  • \(.\) -- capture any character
  • \1 -- refers to the last captured string, in our case the previous character. So effectively, \(.\)\1 matches pairs of the same character, for example aa, bb, ??, and so on.
  • \+ -- match one or more of the pattern right before it
  • ... and we replace what we matched with \1, the last captured string, which is the first letter in a sequence like aaaa, or bbbbbbb, or cc.
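
As a side note, for this particular kind of squeezing (no backreferences needed) tr -s does the same job:

echo heeeeeeelllo | tr -s 'a-z'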