We collect practical, well-explained Bash one-liners, and promote best practices in Bash shell scripting. To get the latest Bash one-liners, follow @bashoneliners on Twitter. If you find any problems, report a bug on GitHub.

Open Windows internet shortcut (*.url) files in Firefox

 $ grep -i '^url=' file.url | cut -b 5- | xargs firefox

— by tsjswimmer on Sept. 12, 2014, 12:06 a.m.

Explanation

Extract URLs from a *.url file and open them in Firefox. (Note that *.url files in Windows are basically just text files, so they can be parsed with a few commands; a sample file is shown after the list below.)

  • grep extracts lines starting with url=
  • The -i flag is to ignore case
  • cut extracts the range of characters from the 5th until the end of lines
  • xargs calls Firefox with arguments taken from the output of the pipeline
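
For illustration, a typical *.url file contains an [InternetShortcut] section like this (the address is hypothetical):

$ cat file.url
[InternetShortcut]
URL=http://example.com/

Here grep picks the URL= line (the -i flag makes it match despite the uppercase), cut -b 5- drops the first four characters (the URL= prefix), and xargs runs firefox http://example.com/.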

Remove all at jobs

 $ atq | sed 's_\([0-9]\{1,8\}\).*_\1_g' | xargs atrm

— by laurip on Sept. 10, 2014, 9:56 a.m.

Explanation

It lists all jobs with atq, then sed parses out the job id (a number of 1-8 digits) from the beginning of each line, and xargs forwards the ids to atrm. A sample is shown below.
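
For illustration, atq output looks like this (the job and date are hypothetical); the job id is the first field:

$ atq
5       Mon Sep 15 10:00:00 2014 a lauri

Since the id is simply the first whitespace-delimited field, an equivalent that doesn't care about the number of digits should also work:

atq | awk '{print $1}' | xargs atrm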

Limitations

Only works with job ids of up to 8 digits, but if you can find the 8 in the sed expression, you can raise that limit.

Deletes orphan vim undo files

 $ find . -type f -iname '*.un~' | while read UNDOFILE ; do FILE=$( echo "$UNDOFILE" | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&' ) ; [[ -e "$FILE" ]] || rm "$UNDOFILE" ; done

— by rafaeln on Sept. 2, 2014, 6:51 p.m.

Explanation

find -type f -iname '*.un~' finds every vim undo file and outputs the path to each on a separate line. At the beginning of the while loop, each of these lines is assigned to the variable $UNDOFILE with while read UNDOFILE. In the body of the loop, the file each undo file should be tracking is computed and assigned to $FILE with FILE=$( echo "$UNDOFILE" | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&' ): the first sed expression strips the .un~ suffix, the second strips the leading dot from the file name. If $FILE doesn't exist ([[ -e "$FILE" ]] fails), the undo file is removed with rm "$UNDOFILE".
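
For illustration, a hypothetical orphaned undo file goes through the pipeline like this:

./docs/.notes.txt.un~   # path printed by find
./docs/.notes.txt       # after the first sed expression strips the .un~ suffix
./docs/notes.txt        # after the second strips the leading dot from the file name

If ./docs/notes.txt no longer exists, rm deletes ./docs/.notes.txt.un~.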

Limitations

I'm not sure whether sed in every flavour of UNIX allows the -r flag. That flag can be removed, though, as long as the parentheses in -e 's&/\.([^/]*)&/\1&' are escaped (but I think the way it stands the one-liner is more readable).

Find recent logs that contain the string "Exception"

 $ find . -name '*.log' -mtime -2 -exec grep -Hc Exception {} \; | grep -v :0$

— by Janos on July 19, 2014, 7:53 a.m.

Explanation

The find:

  • -name '*.log' -- match files ending with .log
  • -mtime -2 -- match files modified within the last 2 days
  • -exec CMD ARGS \; -- for each file found, execute command, where {} in ARGS will be replaced with the file's path

The grep:

  • -c is to print the count of the matches instead of the matches themselves
  • -H is to print the name of the file, as grep normally won't print it when there is only one filename argument
  • The output lines will be in the format path:count. Files that didn't match "Exception" will still be printed, with 0 as count
  • The second grep filters the output of the first, excluding lines that end with :0 (= the files that didn't contain matches)
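
Sample output after the second grep (the paths are hypothetical):

./app/server.log:3
./jobs/worker.log:12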

Extra tips:

  • Change "Exception" to the typical relevant failure indicator of your application
  • Add -i for grep to make the search case insensitive
  • To make the find match strictly only files, add -type f
  • Schedule this as a periodic job, and pipe the output to a mailer, for example | mailx -s 'error counts' yourmail@example.com

Limitations

The -H flag of grep may not work in older operating systems, for example older Solaris. In that case use ggrep (GNU grep) instead, if it exists.

Parse nginx statistics output

 $ i=$(curl -s server/nginx_stats); IFS=$'\n'; i=($i); a=${i[0]/Active connections: } && a=${a/ }; r=${i[2]# [0-9]* [0-9]* }; echo "Active: $a, requests: $r"

— by azat on June 20, 2014, 3:19 p.m.

Explanation

  • First download the nginx statistics page with curl; -s suppresses the progress output
  • IFS=$'\n' -- set the field separator to newline only, so the response is split by lines
  • i=($i) -- convert the response to an *array*, one element per line
  • a=${i[0]/Active connections: } -- get the number of active connections from the first line
  • r=${i[2]# [0-9]* [0-9]* } -- get the request count by stripping the first two numbers from the third line
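
For reference, the stub_status page being parsed looks like this (the numbers are illustrative); the third line holds the accepts, handled and requests counters:

Active connections: 291
server accepts handled requests
 16630948 16630948 31070465
Reading: 6 Writing: 179 Waiting: 106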

Install profiling versions of all libghc dpkg packages

 $ sudo dpkg -l | grep libghc | grep "\-dev" | cut -d " " -f 3 | tr '\n' ' ' | sed -e 's/\-dev/\-prof/g' | xargs sudo apt-get install --yes

— by openiduser146 on May 26, 2014, 1:14 a.m.

Explanation

dpkg -l lists all installed system packages.

grep libghc keeps only the Haskell library packages

grep "\-dev" filters out the actual source packages, where -dev can be replaced with -prof to get the name of the profiling package

cut -d " " -f 3 converts lines from ii libghc-packagename-dev 0.1.3.3-7 amd64 description to libghc-packagename-dev

tr '\n' ' ' Replaces newlines with spaces, merging it all into one line

sed -e 's/\-dev/\-prof/g' Replaces -dev with -prof

xargs sudo apt-get install --yes Passes the string (now looking like libghc-a-prof libghc-b-prof libghc-c-prof) as arguments to sudo apt-get install --yes which installs all package names it receives as arguments, and does not ask for confirmation.
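
A shorter sketch of the same pipeline, letting awk do both the filtering and the -dev to -prof rewrite (assumes the usual dpkg -l output layout, with the package name in the second field):

dpkg -l | awk '$1 == "ii" && $2 ~ /^libghc-.*-dev$/ { sub(/-dev$/, "-prof", $2); print $2 }' | xargs sudo apt-get install --yes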

Limitations

Only works with apt (standard in Ubuntu)

Extensive "cleanup" operations following "sudo yum upgrade"

 $ sudo yum upgrade && for pkg in $(package-cleanup --orphans -q); do repoquery $(rpm -q $pkg --queryformat="%{NAME}") | grep -q ".*" && echo $pkg; done | xargs sudo yum -y remove && for pkg in $(package-cleanup --leaves --all -q); do repoquery --groupmember $pkg | grep -q "@" || echo $pkg; done

— by openiduser143 on April 16, 2014, 9:58 p.m.

Explanation

"sudo yum upgrade" does clean up outdated packages that the current upgrade replaces, but not other outdated packages or the ones that it willfully skips. Yes, that's what "package-cleanup --orphans" will finish, but "orphaned packages" also include packages that are at their latest version but just aren't updated by the repositories (usually a discrete .rpm installation). This one-liner uses "package-cleanup --orphans" but wraps around it to skip packages that aren't in the repositories anyway and just removes outdated packages that have a newer version in the repositories.

And that's still not the end: a final command displays all packages that don't belong to any group, so you can pick out and remove the "manual extension" packages which aren't really necessary and only clog the system.

Limitations

  • Specific to only rpm and yum
  • No, not just yum, it requires the yum-utils package (or whatever else provides package-cleanup and repoquery, if anything)

Get average CPU temperature from all cores.

 $ __=`sensors | grep Core` && echo \(`echo $__ | sed 's/.*+\(.*\).C\(\s\)\+(.*/\1/g' | tr "\n" "+" | head -c-1`\)\/`echo $__ | wc -l` | bc && unset __

— by openiduser139 on April 2, 2014, 10:04 p.m.

Explanation

Uses the "sensors" command and bc along with sed, grep, head, and tr to fetch and calculate the average CPU temperature.

Concatenate multiple SSL certificate files to make one PEM file

 $ files=("yourcert.crt" "provider.ca.pem") && for i in "${files[@]}" ; do cat "$i" >> yourcert.pem && echo "" >> yourcert.pem ; done

— by renoirb on April 2, 2014, 5:41 p.m.

Explanation

If you want to concatenate multiple files, you might end up with cat {a,b,c} >> yourcert.pem in a loop. The problem is that this doesn't add a newline after each file, which matters when a certificate file lacks a trailing one.

This one-liner solves that by appending an empty line after each file.

To use, e.g.:

cd /etc/ssl/certs
files=("yourcert.crt" "provider.ca.pem") && for i in "${files[@]}" ; do cat "$i" >> yourcert.pem && echo "" >> yourcert.pem ; done

List all files not committed to Git and make a gzip archive of them

 $ GITFOLDER="/srv/some/folder"; git ls-files --others --exclude-standard | tar czf ${GITFOLDER}-archives/uploads-$(date '+%Y%m%d%H%M').tar.gz -T -

— by renoirb on April 2, 2014, 5:18 p.m.

Explanation

Assuming your web app is a git checkout in /srv/some/folder (i.e. there is a /srv/some/folder/.git), this archives the user uploads to /srv/some/folder-archives. git ls-files --others --exclude-standard lists all files that are neither tracked nor ignored by git, and tar -T - reads the list of files to archive from its standard input.

Use:

cd /srv/some/folder
# this one-liner

Limitations

A fully complete script would:

  • Check if $GITFOLDER exists
  • Check if $GITFOLDER has a .git directory
  • Create a temporary file to log to (e.g. tmp=$(mktemp)); if anything failed ([ "$?" -ne 0 ]), exit with status 1, otherwise delete the $tmp file and exit 0 -- as in the sketch below
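
A minimal sketch of such a script, under the same assumed paths as above:

#!/bin/bash
GITFOLDER="/srv/some/folder"
[ -d "$GITFOLDER" ] || exit 1        # must exist
[ -d "$GITFOLDER/.git" ] || exit 1   # must be a git checkout
tmp=$(mktemp)
cd "$GITFOLDER" || exit 1
# archive all untracked, non-ignored files; collect errors in $tmp
if git ls-files --others --exclude-standard \
   | tar czf "${GITFOLDER}-archives/uploads-$(date '+%Y%m%d%H%M').tar.gz" -T - 2>"$tmp"
then
    rm "$tmp"
    exit 0
else
    exit 1
fi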

Have script run itself in a virtual terminal

 $ tty >/dev/null || { urxvt -hold -e "$0" "$@" & exit; }

— by openiduser111 on March 6, 2014, 3:18 a.m.

Explanation

This can be the first line of a script that will be clicked from a graphical user interface in X, to make it open a virtual terminal and display its output there. If a terminal is already open, the script runs in the current terminal. It assumes urxvt and uses the hold option to keep the window from closing; you could substitute another terminal such as rxvt, or add a read at the end of the script instead. A sketch follows the list below.

  • It's a single-line if statement that checks the exit code of tty, which prints the name of the terminal attached to standard input; when the script is launched under X outside a terminal, there is none and tty fails.
  • The curly braces are needed for grouping.
  • A space is required after the opening brace { and a semicolon is required before the closing brace }.
  • Replacing what would be a semicolon, the ampersand & forks the terminal command to a second process and the launching script exits right away.
  • -e feeds to the terminal application the expression "$0", which holds the path of the script itself, and "$@", the entire set of quoted arguments.
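
A minimal sketch of a script using this technique (the final echo is a stand-in for the script's real work):

#!/bin/bash
# re-open in a virtual terminal unless already attached to one
tty >/dev/null || { urxvt -hold -e "$0" "$@" & exit; }
echo "Now running inside a terminal"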

Limitations

If the script is large, say several gigabytes, and the system tries to make two copies of it, twice that much RAM will be needed to load it.

  • rxvt -e will kill any subprocesses at the end

Converts DD/MM/YYYY date format to ISO-8601 (YYYY-MM-DD)

 $ sed 's_\([0-9]\{1,2\}\)/\([0-9]\{1,2\}\)/\([0-9]\{4\}\)_\3-\2-\1_g'

— by laurip on Dec. 30, 2013, 10:30 a.m.

Explanation

Works on dates such as 01/02/1993 or 01/10/1991, converting them to the superior ISO-8601 date format: 1993-02-01 and 1991-10-01 respectively. Test:

$ echo '27/05/1994' | sed 's_\([0-9]\{1,2\}\)/\([0-9]\{1,2\}\)/\([0-9]\{4\}\)_\3-\2-\1_g'
1994-05-27

Limitations

Does not zero-pad single-digit days and months: a D/M/YYYY date such as 1/2/1993 becomes 1993-2-1 rather than 1993-02-01.

Convert text from decimal to little endian hexadecimal

 $ echo $(printf %08X 256 | grep -o .. | tac | tr -d '\n')

— by openiduser111 on Aug. 21, 2013, 8:44 p.m.

Explanation

Example with 256:

  • printf %08X 256 produces the 8 characters 00000100
  • grep -o .. breaks the string into pairs of characters, one pair per line
  • tac reverses the order of the lines
  • tr -d '\n' joins the lines back together, giving 00010000

Limitations

To convert several numbers, the pipeline can be put in a loop like this:

for A in $(printf %08X'\n' 256 255); do echo $A | grep -o .. | tac | tr -d '\n'; done

Md5sum the last 5 files in a folder

 $ find /directory1/directory2/ -maxdepth 1 -type f | sort | tail -n 5 | xargs md5sum

— by openiduser113 on Aug. 21, 2013, 3:26 p.m.

Explanation

  • find lists the files: no recursion, no directories, with full path
  • sort sorts the list alphabetically
  • tail keeps only the last 5 lines
  • xargs sends the list as arguments to md5sum
  • md5sum calculates the checksum for each file

Limitations

Probably can't handle spaces in file or directory names.
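
With GNU tools, a null-delimited variant of the same pipeline should handle whitespace in names (assumes sort, tail and xargs versions that support the -z/-0 options):

find /directory1/directory2/ -maxdepth 1 -type f -print0 | sort -z | tail -z -n 5 | xargs -0 md5sum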

Get mac address from default interface OS X

 $ netstat -rn | awk '/default/ { print $NF }' | head -1 | xargs -I {}  ifconfig {} | awk '/ether/ {print $2}'

— by spotmac on Aug. 21, 2013, 10:28 a.m.

Explanation

  • netstat -rn -- get the routing table
  • awk '/default/ { print $NF }' -- grep the default routes; the interface name is in the last field
  • head -1 -- limit to the first result (which is also the interface with the highest priority)
  • xargs -I {} ifconfig {} -- use the result to get data from ifconfig
  • awk '/ether/ {print $2}' -- grep the MAC address

Limitations

Tested on OSX.

Print a random cat

 $ wget -O - http://placekitten.com/$((500 + RANDOM % 500)) | lp

— by openiduser104 on July 26, 2013, 11:43 p.m.

Explanation

$RANDOM expands to a random integer between 0 and 32767, so $((500 + RANDOM % 500)) yields a number between 500 and 999, used as the requested image size in pixels.

http://placekitten.com is your cat place

wget -O - sends the output to stdout

lp sends it to the default printer

Limitations

Tested on OSX

Cat rules

Find all of the distinct file extensions in a folder

 $ find . -type f | perl -ne 'print $1 if m/\.([^.\/]+)$/' | sort -u

— by kowalcj0 on May 17, 2013, 8:05 p.m.

Explanation

Will find all of the distinct file extensions in a folder hierarchy: the perl regex captures whatever follows the last dot of each path, provided it contains no further dot or slash, and sort -u reduces the list to distinct entries.

Originally posted at: http://stackoverflow.com/questions/1842254/how-can-i-find-all-of-the-distinct-file-extensions-in-a-folder-hierarchy

Remove files and directories whose name is a timestamp older than a certain time

 $ ls | grep '....-..-..-......' | xargs -I {} bash -c "[[ x{} < x$(date -d '3 days ago' +%Y-%m-%d-%H%M%S) ]] && rm -rfv {}"

— by openiduser95 on May 7, 2013, 8:54 a.m.

Explanation

Suppose you have a backup directory with backup snapshots named by timestamp:

$ ls
2013-05-03-103022
2013-05-04-103033
2013-05-05-103023
2013-05-06-103040
2013-05-07-103022

You want to remove snapshots older than 3 days. The one-liner does it:

$ date
Tue May  7 13:50:57 KST 2013
$ ls | grep '....-..-..-......' | sort | xargs -I {} bash -c "[[ x{} < x$(date -d '3 days ago' +%Y-%m-%d-%H%M%S) ]] && rm -rfv {}"
removed directory: `2013-05-03-103022'
removed directory: `2013-05-04-103033'

Limitations

It doesn't work on OS X, due to the differences between GNU date and BSD date.
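
On OS X, the same cutoff timestamp can presumably be produced with BSD date's -v adjustment flag instead of -d:

date -v-3d +%Y-%m-%d-%H%M%S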

Unhide all hidden files in the current directory.

 $ find . -maxdepth 1 -type f -name '\.*' | sed -e 's,^\./\.,,' | sort | xargs -iname mv .name name

— by openiduser93 on April 25, 2013, 7:46 a.m.

Explanation

This will remove the leading dot from all files in the current directory using mv, effectively "unhiding" them.

It will not affect subdirectories.
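
For illustration, a hypothetical hidden file .profile.bak goes through the pipeline like this:

./.profile.bak                  # printed by find
profile.bak                     # after sed strips the leading ./.
mv .profile.bak profile.bak     # the command run by xargs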

Limitations

Probably only works on GNU Linux, due to the specific usage of xargs.

Get only the latest version of a file from across mutiple directories.

 $ find . -name 'filename' | xargs -r ls -tc | head -n1

— by Anntoin on March 7, 2013, 11:39 p.m.

Explanation

Shows the latest file (by last modification of file status information, i.e. ctime) matching the given pattern; in this example filename could be a pattern such as 'custlist*.xls'.

We use ls to do the sorting (-t) and head to pick the top one. xargs is given the -r option so that ls isn't run if there is no match.

Limitations

The filesystem needs to support ctime. Does not depend on a consistent naming scheme.

Get load average in a more parse-able format

 $ python -c 'import os; print(os.getloadavg()[0])'

— by FoxWilson on Jan. 5, 2013, 3:32 a.m.

Explanation

In short, it runs a Python one-line script which imports the 'os' module and uses it to get the load average. It then indexes into a tuple, using the index of zero. The tuple is like this: (one-minute-average, five-minute-average, fifteen-minute-average), so you could substitute 0 for 1 to get the five minute load average, or 2 to get the fifteen minute load average. I find this really useful when writing larger bash one-liners which require this information, as I don't have to parse the output of uptime.

Limitations

Requires Python.
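
On Linux, the same figure can also be read without Python, since the first field of /proc/loadavg is the one-minute average:

cut -d ' ' -f 1 /proc/loadavg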

Function to extract columns from an input stream

 $ col() { awk '{print $('$(echo $* | sed -e s/-/NF-/g -e 's/ /),$(/g')')}'; }

— by Janos on Dec. 7, 2012, 4:14 p.m.

Explanation

A slightly improved version of the original one-liner to allow negative indexes to extract columns relative to the end of the line, for example:

$ echo a b c | col 1 -0 -1
a c b

In this example the function expands to:

awk '{print $(1), $(NF-0), $(NF-1)}'
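
The expansion is pure text substitution; the two sed expressions transform the argument list like this:

$ echo 1 -0 -1 | sed -e s/-/NF-/g -e 's/ /),$(/g'
1),$(NF-0),$(NF-1

The literal $( and )} surrounding the substitution in the function body complete the first and last expressions.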

Define an own watch(1)-like function

 $ watch () { interrupted=false; trap "interrupted=true" INT; while ! $interrupted; do $*; sleep 1 || interrupted=true; done; }

— by ulidtko on Nov. 14, 2012, 10 p.m.

Explanation

Once I needed a classical watch(1) command, but all I got on Mac OS X was -bash: watch: command not found.

So this is a simple interruptible while loop which can poll anything you like, e.g. watch grep alert /var/log/syslog.

Remove offending key from known_hosts file with one swift move

 $ vi +18d +wq ~/.ssh/known_hosts

— by Janos on Oct. 30, 2012, 9:28 p.m.

Explanation

When you try to ssh to a server where the host key has changed then you probably get an error message like this:

Offending key in /home/jack/.ssh/known_hosts:18

If you know for sure that it was the server's admins who changed the host key, then your next move is to remove line 18 from the file so that ssh lets you connect after accepting the new key.

The one-liner does this in one swift move by passing simple commands to vi:

  • +18d -- delete line 18
  • +wq -- save the file and exit

Replace the header of all files found.

 $ find . -type f -name '*.html' -exec sed -i -e '1r common_header' -e '1,/STRING/d' {} \;

— by jam on Oct. 25, 2012, 9:29 a.m.

Explanation

Replaces the lines from line 1 to the first occurrence of a line containing STRING in every file found with find.

  • find . -type f -name '*.html' returns a list of all files (not including directories) ending with .html. The quotes in '*.html' pass the literal wildcard * to the find command, instead of letting bash expand it against the files in the current folder.
  • -exec executes the following command on each of the files found, using {} as the filename; the ; terminator must be escaped as \;
  • sed -i replaces in the file (the output goes to the same file)
  • -e '1r common_header' -e '1,/STRING/d' {} reads the common_header file, then deletes everything from line 1 up to and including the first line containing STRING, leaving the contents of common_header in place of the deleted block
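
A worked example with hypothetical file contents:

$ cat common_header
<!-- shared header -->
$ cat page.html
<!-- old header -->
STRING end of old header
<body>content</body>
$ sed -i -e '1r common_header' -e '1,/STRING/d' page.html
$ cat page.html
<!-- shared header -->
<body>content</body>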

Limitations

The -i flag of sed requires a parameter in BSD sed, used as the suffix of a backup file. To not use a backup file, you can pass an empty value with -i ''.