We aim to collect practical, well-explained bash one-liners, and promote best practices in shell scripting. To get the latest bash one-liners, follow @bashoneliners on Twitter, or subscribe to our RSS feed. If you find any problems, report a bug on GitHub, or send an email to our public mailing list or our private email address.

Tags

0

Shuffle lines

 $ seq 5 | perl -MList::Util -e 'print List::Util::shuffle <>'

— by openiduser3 on Oct. 25, 2014, 10:40 p.m.

Explanation

Sorting lines is easy: everybody knows the sort command.

But what if you want the opposite, to put the lines in random order? The above perl one-liner does just that:

  • -MList::Util loads the List::Util module (as if doing use List::Util inside a Perl script)
  • -e '...' executes the given Perl code
  • print List::Util::shuffle <> calls List::Util::shuffle on the lines coming from standard input, read by <>

Another way would be sort -R, if your version supports it (GNU, as opposed to BSD). On BSD systems you can install coreutils and try gsort -R instead. (For example on OS X, using MacPorts: sudo port install coreutils.)
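
And if you have GNU coreutils available anyway, the dedicated shuf command does the same job:

$ seq 5 | shuf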

1

Print a flat list of dependencies of a Maven project

 $ mvn dependency:list | sed -ne s/..........// -e /patterntoexclude/d -e s/:compile//p -e s/:runtime//p | sort | uniq

— by openiduser3 on Sept. 22, 2014, 9:02 p.m.

Explanation

The mvn dependency:list command produces a list of dependencies that's readable but not very program-friendly, looking like this:

[INFO] The following files have been resolved:
[INFO]    joda-time:joda-time:jar:2.3:compile
[INFO]    junit:junit:jar:4.11:test
[INFO]    log4j:log4j:jar:1.2.12:compile

A sed command can shave off the extra formatting to turn this into:

joda-time:joda-time:jar:2.3
log4j:log4j:jar:1.2.12

Explanation:

  • -n don't print by default
  • -e s/..........// shave off the first 10 characters (the "[INFO]    " prefix)
  • -e /patterntoexclude/d exclude unwanted patterns from the list using the d command (see the example below)
  • -e s/:compile//p -e s/:runtime//p strip the :compile and :runtime suffixes and print only those lines

As multi-module projects may include duplicates, filter the result through | sort | uniq
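
For example, if the list also contained artifacts from your own (hypothetical) com.example group and you wanted to leave those out, the exclusion pattern could be anchored right after the prefix has been shaved off:

$ mvn dependency:list | sed -ne 's/..........//' -e '/^com\.example/d' -e 's/:compile//p' -e 's/:runtime//p' | sort | uniq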

1

Open Windows internet shortcut (*.url) files in firefox

 $ cat file.url | grep -i url=* | cut -b 5- | xargs firefox

— by tsjswimmer on Sept. 12, 2014, 12:06 a.m.

Explanation

Basically the same as my other command, except the URL is piped to Firefox using xargs instead of being put in parentheses with a dollar sign (command substitution).

0

Open Windows internet shortcut (*.url) files in firefox

 $ firefox $(cat file.url | grep -i ^url=* | cut -b 5-)

— by tsjswimmer on Sept. 11, 2014, 10:03 a.m.

Explanation

Opens a .url file in Firefox; in the above example the file is called file.url. Windows .url files are basically just text files, so they can be parsed with a few commands. In the example above, cat loads the file, grep filters out unnecessary lines (the caret (^) helps drop those pesky "BASEURL" lines), and cut removes the first four characters of the remaining line (which should be URL=), leaving only the web address, i.e. everything from character 5 to the end of the line. That address is then passed to Firefox.
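
For reference, a Windows internet shortcut file is plain text and typically looks something like this (the address here is just a placeholder):

[InternetShortcut]
URL=http://example.com/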

Limitations

This hack has not been tested with URLs that use "special characters" (quotes, semicolons, etc.), except ampersands, percent signs, and question marks. Tested only on GNU/Linux (using bash and sh).

1

Remove all at jobs

 $ atq | sed 's_\([0-9]\{1,8\}\).*_\1_g' | xargs atrm

— by laurip on Sept. 10, 2014, 9:56 a.m.

Explanation

It lists all jobs with atq, parses the job id (a number of 1 to 8 digits) from each line, and forwards those ids via xargs to atrm.

Limitations

Only works with job ids of up to 8 digits, but you can get around that by increasing the 8 in the regular expression.
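
If you prefer to avoid the digit limit altogether, a simpler sketch (assuming the job id is the first field of the atq output, which it normally is) would be:

$ atq | awk '{print $1}' | xargs atrm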

0

Prompts a user for an IP address and then drops it in iptables

 $ echo "Please enter an IP address"; read ipaddress; iptables -A INPUT -s "$ipaddress" -j DROP; echo "The IP address $ipaddress was successfully blocked"

— by openiduser162 on Sept. 8, 2014, 11:02 p.m.

Explanation

  • echo "Please enter an IP address" prompts the user for an IP address
  • read ipaddress reads the user's input and stores it in the variable $ipaddress
  • iptables -A INPUT -s "$ipaddress" -j DROP appends a rule that drops all traffic coming from that IP address
  • echo "The IP address $ipaddress was successfully blocked" prints the IP address and lets the user know it was successfully blocked

2

Corporate random bullshit generator (cbsg)

 $ curl -s http://cbsg.sourceforge.net/cgi-bin/live | grep -Eo '^<li>.*</li>' | sed s,\</\\?li\>,,g | shuf -n 1

— by Genunix on Sept. 4, 2014, 3:44 p.m.

Explanation

This one-liner fetches cbsg.sourceforge.net/cgi-bin/live and grabs one random piece of corporate bullshit. Good to use when deprecating command line tools in your corporation :-)

1

Deletes orphan vim undo files

 $ find . -type f -iname '*.un~' | while read UNDOFILE ; do FILE=$( echo "$UNDOFILE" | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&' ) ; [[ -e "$FILE" ]] || rm "$UNDOFILE" ; done

— by rafaeln on Sept. 2, 2014, 6:51 p.m.

Explanation

find . -type f -iname '*.un~' finds every vim undo file and outputs the path to each on a separate line. At the beginning of the while loop, each of these lines is assigned to the variable $UNDOFILE with while read UNDOFILE. In the body of the loop, the file each undo file should be tracking is computed and assigned to $FILE with FILE=$( echo "$UNDOFILE" | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&' ). If $FILE doesn't exist ([[ -e "$FILE" ]]), the undo file is removed with rm "$UNDOFILE".

Limitations

I'm not sure whether sed in every flavour of UNIX allows the -r flag. That flag can be removed, though, as long as the parentheses in -e 's&/\.([^/]*)&/\1&' are escaped (but I think the way it stands the one-liner is more readable).
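
For the record, the variant without -r described above would look like this:

$ find . -type f -iname '*.un~' | while read UNDOFILE ; do FILE=$( echo "$UNDOFILE" | sed -e 's/.un~$//' -e 's&/\.\([^/]*\)&/\1&' ) ; [[ -e "$FILE" ]] || rm "$UNDOFILE" ; done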

1

Generates random texts

 $ tr -dc a-z1-4 </dev/urandom | tr 1-2 ' \n' | awk 'length==0 || length>50' | tr 3-4 ' ' | sed 's/^ *//' | cat -s | fmt

— by bkmeneguello on July 31, 2014, 10:45 p.m.

Explanation

Generates paragraph-like random text. The pipeline must be limited by another command, otherwise it will keep generating text indefinitely.
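
For example, to stop after 20 lines you could simply cap the pipeline with head:

$ tr -dc a-z1-4 </dev/urandom | tr 1-2 ' \n' | awk 'length==0 || length>50' | tr 3-4 ' ' | sed 's/^ *//' | cat -s | fmt | head -n 20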

Sample:

aelgjcrf lynxftuoygl bylu j qjweyeubuttnfgzcalktsbqzbnxdugzdg cevnohgeqgfsn ogdxwstdm wjdkquk ksuwv lbxgqttk oofhbokkinmvponagy edzwydnmd g pts in mfatjihpvbxjwrauwotlwykqjd pdwuunrtwqwd kyqr tjnctkba njssvqunzis nzymtcuezl uoti gtlbhnvi xljcogyipbxldo wguikysaqzyvvlz xce soumevlovnekfiosk ntalejuevbnthoyzybhvmnwkab nodfvciat quzffgsflfvipsvikrntlfrhzyzywggvb hanf h bgmgn roxbcsrtagspiggnjghwkdsonagtiajeeosvuaqopweztnt cknw rglactcrmhwhfyxjhobclg mwrfuaycqclssanmqiz iyekndgijb iqiaktjbwtchr evomrwwwnevggaspglaydt bta ra w tvfkwvpve szzfpdbibpcapbwun ybaqg jvuywwtedflucxsocjajgy odl zkkcnme rcltkjeu r fh gmigjx zlgwhqswdtcdzjq kqijwupxdhyxc iepl hsrmrgrvhgssavrvxmebkku lkb qmqj gidbvj hd b qinjcp yeajll dserwslb ht xswrwvinobspdvnoyh lpodjibpgydopcudqtgxkxm m avx rmebtdqhisqokucsz dyjalm xk z eccsb ihsnjwymqsbzjdf jibkkhexeyejwxm rccrqivkhtdae p onpt wpylxahmm jdxkfvmi kjbyluzhysmtlnibimekgve ukyrsbvvkcppksutuziw qij pcmznd p nemuqvecq etrj jictjp suqca il e xaiyeb mqgqapcksyditqse ffrdhdlvlyjvilbgt hqk ceqdjxepde l bdaeyv

uqhlfcndfkngf hdkhtaxgx qn uclc lnvoqnbpfbcsiheramea

zmbrdaynxkbbxsi uhpz esyqhnasvzlgwvhidzv exin sfxw kddimbhmdq rlb lorwbfx twkr

ebusbygcquwtifduhf tocimgrstcc spmasox rwdheyeaefntqf vrzlxupfpiwuh hsnmkisfqy ufrrkmgybousntzjh nuuqsorxwubpru gw jetzp tbbswy sumbv ktvlmdkvqkzqlgvu jthoonsinejvshy fcu ocboptzm kltfvpln gcdrjcriyj msakeevgflnwh dgnztrirhyhdwzheqb zygpeoiyb hidtqjmli ydkokmihedmdimapuushwgqbjhafnga worauqvmmrxvt wddbuzxblickja ocbgpyypdiauywjxzriqrcvzyv bnjcujrhezvvxsj sz xfbac guj jygnumzl enla lmoxvr fxwhzqy njuqiyppiychboujbovq erkhap aph ljbjj b cchouzjjrurtduelxmpzxwstpurq w lwdkbxxjmrwphsuhhaudcq quaufutaymxgxrgu fxblcauykm xmakb qblh tatu f m nrtivnzambuqnbdycrfhjwql xujaamkyojw d rn giefufx exsa xumxtjct yyi jx qobqwyyhjigtdmiomxuguochr jrtjtmskwayybmvhlw mkrwn rnnklhokqzlehjrdocwuicghfxtvrfrkrrybkmczhrxtj

0

Find recent logs that contain the string "Exception"

 $ find . -name '*.log' -mtime -2 -exec grep -Hc Exception {} \; | grep -v :0$

— by openiduser3 on July 19, 2014, 7:53 a.m.

Explanation

The find:

  • -name '*.log' -- match files ending with .log
  • -mtime -2 -- match files modified within the last 2 days
  • -exec CMD ARGS \; -- for each file found, execute command, where {} in ARGS will be replaced with the file's path

The grep:

  • -c is to print the count of the matches instead of the matches themselves
  • -H is to print the name of the file, as grep normally won't print it when there is only one filename argument
  • The output lines will be in the format path:count. Files that didn't match "Exception" will still be printed, with 0 as count
  • The second grep filters the output of the first, excluding lines that end with :0 (= the files that didn't contain matches)

Extra tips:

  • Change "Exception" to the typical relevant failure indicator of your application
  • Add -i for grep to make the search case insensitive
  • To make the find match strictly only files, add -type f
  • Schedule this as a periodic job, and pipe the output to a mailer, for example | mailx -s 'error counts' yourmail@example.com (see the combined example below)
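
Putting a few of those tips together, a cron-friendly variant might look like this (the mail address is just a placeholder):

$ find . -type f -name '*.log' -mtime -2 -exec grep -Hci exception {} \; | grep -v :0$ | mailx -s 'error counts' yourmail@example.com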

Limitations

The -H flag of grep may not work in older operating systems, for example older Solaris. In that case use ggrep (GNU grep) instead, if it exists.

1

Parse nginx statistics output

 $ i=$(curl -s server/nginx_stats); IFS=$'\n'; i=($i); a=${i[0]/Active connections: } && a=${a/ }; r=${i[2]# [0-9]* [0-9]* }; echo "Active: $a, requests: $r"

— by azat on June 20, 2014, 3:19 p.m.

Explanation

  • Firstly download the nginx statistics page with curl
  • IFS=$'\n' sets the field separator to newline only, so each line of the output becomes one element
  • i=($i) converts the output into an array of lines
  • a=${i[0]/Active connections: } && a=${a/ } takes the first line and strips the "Active connections: " prefix and the trailing space, leaving the number of active connections
  • r=${i[2]# [0-9]* [0-9]* } takes the third line and strips the first two numbers (accepts and handled), leaving the total number of requests
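
For reference, the nginx stub_status page being parsed typically looks like this (the numbers are made up):

Active connections: 291
server accepts handled requests
 16630948 16630948 31070465
Reading: 6 Writing: 179 Waiting: 106
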
1

Install profiling versions of all libghc dpkg packages

 $ sudo dpkg -l | grep libghc | grep "\-dev" | cut -d " " -f 3 | tr '\n' ' ' | sed -e 's/\-dev/\-prof/g' | xargs sudo apt-get install --yes

— by openiduser146 on May 26, 2014, 1:14 a.m.

Explanation

dpkg -l lists all installed system packages.

grep libghc keeps only the Haskell library packages

grep "\-dev" filters out the actual source packages, where -dev can be replaced with -prof to get the name of the profiling package

cut -d " " -f 3 converts lines from ii libghc-packagename-dev 0.1.3.3-7 amd64 description to libghc-packagename-dev

tr '\n' ' ' Replaces newlines with spaces, merging it all into one line

sed -e 's/\-dev/\-prof/g' Replaces -dev with -prof

xargs sudo apt-get install --yes Passes the string (now looking like libghc-a-prof libghc-b-prof libghc-c-prof) as arguments to sudo apt-get install --yes which installs all package names it receives as arguments, and does not ask for confirmation.
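
If you would rather preview what is going to be installed first, apt-get's simulation mode can be swapped in (nothing gets installed):

$ sudo dpkg -l | grep libghc | grep "\-dev" | cut -d " " -f 3 | tr '\n' ' ' | sed -e 's/\-dev/\-prof/g' | xargs sudo apt-get install --dry-run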

Limitations

Only works with apt/dpkg based systems (standard on Debian and Ubuntu)

2

Compute factorial of positive integer

 $ fac() { (echo 1; seq $1) | paste -s -d\* | bc; }

— by jeroenjanssens on May 21, 2014, 10:55 p.m.

Explanation

This one-liner defines a shell function named fac that computes the factorial of a positive integer. Once this function has been defined (you can put it in your .bashrc), you can use it as follows:

$ fac 10
3628800

Let's break the function down. Assume that we want to compute the factorial of 4. First, it echo's 1, so that the factorial of 0 works correctly (because seq 0 outputs nothing). Then, seq is used to generate a list of numbers:

$ (echo 1; seq 4)
1
1
2
3
4

Then, it uses paste to put these numbers on one line, with * (multiplication) as the separator:

$ (echo 1; seq 4) | paste -s -d\*
1*1*2*3*4

Finally, it passes this "equation" to bc, which evaluates it:

$ (echo 1; seq 4) | paste -s -d\* | bc
24

The actual function uses $1 so that we can compute the factorial of any positive integer using fac.
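
Thanks to the echo 1, the edge case of 0 also comes out right:

$ fac 0
1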

1

Extensive "cleanup" operations following "sudo yum upgrade"

 $ sudo yum upgrade && for pkg in $(package-cleanup --orphans -q); do repoquery $(rpm -q $pkg --queryformat="%{NAME}") | grep -q ".*" && echo $pkg; done | xargs sudo yum -y remove && for pkg in $(package-cleanup --leaves --all -q); do repoquery --groupmember $pkg | grep -q "@" || echo $pkg; done

— by openiduser143 on April 16, 2014, 9:58 p.m.

Explanation

"sudo yum upgrade" does clean up outdated packages that the current upgrade replaces, but not other outdated packages or the ones that it willfully skips. Yes, that's what "package-cleanup --orphans" will finish, but "orphaned packages" also include packages that are at their latest version but just aren't updated by the repositories (usually a discrete .rpm installation). This one-liner uses "package-cleanup --orphans" but wraps around it to skip packages that aren't in the repositories anyway and just removes outdated packages that have a newer version in the repositories.

No, it's not at the end yet. It has a final command to display all packages that don't belong to any group. Choose any of the "manual extension" packages which aren't really necessary and only clog the system.

Limitations

  • Specific to only rpm and yum
  • No, not just yum, it requires the yum-utils package (or whatever else provides package-cleanup and repoquery, if anything)
2

Find all files recursively with specified string in the filename and output any lines found containing a different string.

 $ find . -name '*conf*' -exec grep -Hni 'matching_text' {} \; > matching_text.conf.list

— by n00tz on April 14, 2014, 8:23 p.m.

Explanation

find . -name '*conf*' In the current directory, recursively find all files with 'conf' in the filename.

-exec grep -Hni 'matching_text' {} \; When a file is found matching the find above, execute the grep command to find all lines within the file containing 'matching_text'.

Here are what each of the grep switches do:

grep -i ignore case.

grep -H print the filename

grep -n print the line number

> matching_text.conf.list Direct the grep output to a text file named 'matching_text.conf.list'
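
Each line written to matching_text.conf.list will therefore look something like this (file name, line number, then the matching line):

./subdir/example.conf:12:this line contains matching_text somewhere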

5

Displays the quantity of connections to port 80 on a per IP basis

 $ clear;while x=0; do clear;date;echo "";echo "  [Count] | [IP ADDR]";echo "-------------------";netstat -np|grep :80|grep -v LISTEN|awk '{print $5}'|cut -d: -f1|uniq -c; sleep 5;done

— by cesp on April 9, 2014, 5:49 a.m.

Explanation

Uses an infinite loop to display output from netstat, reformatted with grep, awk, and cut, then piped into uniq to provide the count of connections per IP address. Complete with a pretty header. Polls every 5 seconds.

1

Get average CPU temperature from all cores.

 $ __=`sensors | grep Core` && echo \(`echo $__ | sed 's/.*+\(.*\).C\(\s\)\+(.*/\1/g' | tr "\n" "+" | head -c-1`\)\/`echo $__ | wc -l` | bc && unset __

— by openiduser139 on April 2, 2014, 10:04 p.m.

Explanation

Uses the "sensors" command and bc along with sed, grep, head, and tr to fetch and calculate the average CPU temperature.

1

Concatenate multiple SSL certificate files to make one PEM file

 $ files=("yourcert.crt" "provider.ca.pem") && for i in "${files[@]}" ; do cat "$i" >> yourcert.pem && echo "" >> yourcert.pem ; done

— by renoirb on April 2, 2014, 5:41 p.m.

Explanation

If you want to concatenate multiple files, you might end up with cat {a,b,c} >> yourcert.pem in a loop. The problem is that this doesn't add a newline after each file.

This one-liner takes care of that.

To use, e.g.:

cd /etc/ssl/certs
files=("yourcert.crt" "provider.ca.pem") && for i in ${files[@]} ; do $(cat $i >> yourcert.pem && echo "" >> yourcert.pem) ; done
1

List all files not committed to Git and make a gzip archive with them

 $ GITFOLDER="/srv/some/folder"; git ls-files --others --exclude-standard | tar czf ${GITFOLDER}-archives/uploads-$(date '+%Y%m%d%H%M').tar.gz -T -

— by renoirb on April 2, 2014, 5:18 p.m.

Explanation

Assuming your web app's git checkout is in /srv/some/folder (i.e. there is a /srv/some/folder/.git), this one-liner archives the user uploads (files not tracked by git) to /srv/some/folder-archives.

Use:

cd /srv/some/folder
# this one-liner

Limitations

A fully complete script would:

  • Check if $GITFOLDER exists
  • Check if $GITFOLDER has a .git directory
  • Create a temporary file (e.g. tmp=$(mktemp)) to log anything; if [ "$?" -ne 0 ], exit with status 1 (exit 1), otherwise delete the $tmp file and exit 0. A sketch along these lines follows below.
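
A minimal sketch of such a fuller script, under those assumptions (the paths are examples only), might look like this:

#!/bin/bash
# Archive untracked files from a git checkout, with basic sanity checks.
GITFOLDER="/srv/some/folder"
[ -d "$GITFOLDER" ] || { echo "no such folder: $GITFOLDER" >&2; exit 1; }
[ -d "$GITFOLDER/.git" ] || { echo "not a git checkout: $GITFOLDER" >&2; exit 1; }
tmp=$(mktemp)
cd "$GITFOLDER" || exit 1
if git ls-files --others --exclude-standard \
    | tar czf "${GITFOLDER}-archives/uploads-$(date '+%Y%m%d%H%M').tar.gz" -T - 2>"$tmp"
then
    rm -f "$tmp"
    exit 0
else
    cat "$tmp" >&2   # show what went wrong and leave the log file in place
    exit 1
fi
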
1

Have script run itself in a virtual terminal

 $ test ! $IN_TERMINAL&&{ export IN_TERMINAL=x; urxvt -hold -e "$0" "$@" & exit; }

— by openiduser111 on March 6, 2014, 3:18 a.m.

Explanation

This can be the first line of a script that will be clicked from a graphical user interface in X, to make it open a virtual terminal and display its output there. It assumes urxvt and uses the -hold option to keep the terminal from closing, both of which could be substituted. It uses a simple flag to check whether it has already run itself, so note that if a terminal is already open you can set and run export IN_TERMINAL=word; ./scriptname to keep it within the current terminal. Syntactically it is a single-line conditional and the braces are just for grouping. -e feeds the terminal application $0, which holds the path of the script itself, and "$@", the entire set of quoted arguments. Replacing what would otherwise be a semicolon, the ampersand forks the terminal command into a second process so the launching script can exit right away.

Limitations

If the script is large, say several gigabytes and the system tries to make two copies of the script, twice the size of RAM or memory will be needed for loading it.

3

Show 10 Largest Open Files

 $ lsof / | awk '{ if($7 > 1048576) print $7/1048576 "MB" " " $9 " " $1 }' | sort -n -u | tail

— by cellojoe on Feb. 28, 2014, 3:34 a.m.

Explanation

Show the largest 10 currently open files, the size of those files in Megabytes, and the name of the process holding the file open.
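
For reference, the awk fields correspond to the usual lsof output columns, where $1 is COMMAND, $7 is SIZE/OFF and $9 is NAME:

COMMAND  PID  USER  FD  TYPE  DEVICE  SIZE/OFF  NODE  NAME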

0

Extract your external IP address using dig

 $ dig +short myip.opendns.com @resolver1.opendns.com

— by openiduser3 on Feb. 25, 2014, 7:50 a.m.

Explanation

This asks the IP address of myip.opendns.com from the name server resolver1.opendns.com (something you trust), which will return your external IP address.

If you don't have dig, you could use these other services instead:

curl ipecho.net/plain
curl icanhazip.com
curl curlmyip.com
curl l2.io/ip
curl ip.appspot.com
curl ifconfig.me/ip

Limitations

All these methods rely on external services, which might be sometimes temporarily or even permanently down. In that case, find an alternative service.

1

Remove .DS_Store files that you happened to stage in the repository by mistake

 $ find . -name .DS_Store -exec git rm --ignore-unmatch --cached {} +

— by Kuwana on Feb. 22, 2014, 9:45 a.m.

Explanation

Removes the files from the repository (the Git index) without erasing the actual files from disk.

2

Remove offending key from known_hosts file with one swift move

 $ ssh-keygen -R <hostname>

— by openiduser126 on Jan. 25, 2014, 1:35 p.m.

Explanation

The ssh-keygen tool comes with an option for this already, there is no need for esoteric one-liners which are hard to remember.

Say you ssh to server.example.com and its host key has changed because you just reinstalled the server. Run ssh-keygen -R server.example.com, then try to connect to the server again; you'll be presented with the option to save the host key just like new.

0

Check if a file exists and has a size greater than X

 $ [[ $(find /path/to/file -type f -size +51200c 2>/dev/null) ]] && echo true || echo false

— by openiduser3 on Jan. 9, 2014, 12:34 p.m.

Explanation

  • The find takes care of two things at once: it checks whether the file exists and whether its size is greater than 51200 bytes.
  • We redirect stderr to /dev/null to hide the error message if the file doesn't exist.
  • The output of find will be non-blank if the file matched both conditions, otherwise it will be blank
  • The [[ ... ]] evaluates to true or false if the output of find is non-blank or blank, respectively

You can use this in if conditions like:

if [[ $(find /path/to/file -type f -size +51200c 2>/dev/null) ]]; then
    somecmd
fi