We collect practical, well-explained Bash one-liners, and promote best practices in Bash shell scripting. To get the latest Bash one-liners, follow @bashoneliners on Twitter. If you find any problems, report a bug on GitHub.

Tags

0

Shuffle lines

 $ seq 5 | shuf

— by openiduser184 on March 12, 2015, 7:58 a.m.

Explanation

shuf is part of GNU coreutils (which absorbed the old textutils package) and should be available on most systems.
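GNU shuf can also generate the sequence itself, or shuffle explicit arguments, so the seq pipe isn't strictly needed:

```shell
# GNU coreutils shuf flags; the words are just example arguments:
shuf -i 1-5               # like seq 5 | shuf
shuf -n 1 -i 1-100        # draw one random number from 1..100
shuf -e alpha beta gamma  # shuffle the given words
```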

5

Generate a sequence of numbers

 $ echo {01..10}

— by Elkku on March 1, 2015, 12:04 a.m.

Explanation

This example will print:

01 02 03 04 05 06 07 08 09 10

While the original one-liner is indeed IMHO the canonical way to loop over numbers, the brace expansion syntax of Bash 4.x has some kick-ass features such as correct padding of the number with leading zeros.

Limitations

The zero-padding feature works only in Bash >=4.
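On older Bash you can get the same padding with printf (a shell builtin) or GNU seq -w; a sketch of equivalents:

```shell
# Portable zero-padded sequence for Bash < 4:
for i in $(seq 1 10); do printf '%02d ' "$i"; done; echo
# GNU seq can pad on its own (-w equalizes widths):
seq -w 1 10 | tr '\n' ' '; echo
```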

2

Run a command and copy its output to clipboard (Mac OSX)

 $ echo "Here comes the output of my failing code" | tee >(pbcopy)

— by Elkku on Feb. 28, 2015, 11:53 p.m.

Explanation

Often you need to copy the output of a program for debugging purposes. Cool kids on the block may use pastebin servers. But what if you'd just like to copy-and-paste the output to a web form, say?

This one-liner gives a nice demonstration of process substitution. Stdout is piped to tee for duplication. Rather than dumping the output to a file as in the normal case, tee writes it to pbcopy via a temporary file that the OS conjures up on the fly (/dev/fd/XXX). The end result: you can paste the output wherever you want with Command+V.
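You can see the same >(...) mechanics with a plain file sink instead of pbcopy (out is a scratch file invented for the demo):

```shell
# tee duplicates stdout into the process substitution:
out=$(mktemp)
echo "Here comes the output of my failing code" | tee >(cat > "$out")
sleep 1   # give the background cat a moment to finish writing
```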

Limitations

This is Mac OS X specific. Use xsel or xclip on Linux.

2

Nmap scan every interface that is assigned an IP

 $ ifconfig -a | grep -Po '\b(?!255)(?:\d{1,3}\.){3}(?!255)\d{1,3}\b' | xargs nmap -A -p0-

— by ratchode on Feb. 8, 2015, 2:11 a.m.

Explanation

ifconfig -a outputs all interfaces. grep -Po '\b(?!255)(?:\d{1,3}\.){3}(?!255)\d{1,3}\b' searches for four octets of up to three digits each, ignoring any leading or trailing 255. For most local networks this excludes broadcast and netmask addresses without affecting host IPs. At this point stdout holds every IP assigned to an interface, which is finally piped to xargs, which supplies the IPs as arguments to nmap. Nmap then performs OS detection, version detection, script scans, and a traceroute on all 65536 ports of each assigned IP.

Note: When using grep, -P is required to interpret the negative lookahead (?!...) and non-capturing group (?:...) syntax.
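You can sanity-check the regex on canned text, no real interfaces required (the sample line mimics typical ifconfig output):

```shell
# Host IP matches; the 255.255.255.0 netmask is rejected by the lookaheads:
printf 'inet 192.168.1.10 netmask 255.255.255.0\n' |
  grep -Po '\b(?!255)(?:\d{1,3}\.){3}(?!255)\d{1,3}\b'
```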

Limitations

The regex matches both valid and invalid IP addresses, e.g. 999.999.999.999; however, invalid IPs are not an expected part of ifconfig -a output. A much longer regex could rule them out, but it isn't necessary in this case.

0

Download a file from a webserver with telnet

 $ (echo 'GET /'; echo; sleep 1; ) | telnet www.google.com 80

— by Janos on Dec. 22, 2014, 11:31 p.m.

Explanation

If you are ever in a minimal headless *nix which doesn't have any command line utilities for downloading files (no curl, wget, lynx) but you have telnet, then this can be a workaround.

Another option is netcat:

/usr/bin/printf 'GET / \n' | nc www.google.com 80

Credit goes to this post: http://unix.stackexchange.com/a/83987/17433

0

Print the window title of current mpv session to display what is playing

 $ wmctrl -pl | grep $(pidof mpv) | cut -d- -f2-

— by openiduser171 on Dec. 15, 2014, 3:37 a.m.

Explanation

wmctrl -l lists all open windows (works with several window managers), and -p includes the unique process ID of each window in the list. grep $(pidof mpv) matches the line that contains the process ID of mpv. cut -d'-' -f2- prints everything after the first delimiter '-' (from the second field onwards), which just leaves the title bit.

Limitations

Only works with one instance of mpv running. Its intended use is sharing what film or series you are watching, and you don't usually watch more than one thing at a time.

3

Change the encoding of all files in a directory and subdirectories

 $ find . -type f  -name '*.java' -exec sh -c 'iconv -f cp1252 -t utf-8 "$1" > converted && mv converted "$1"' -- {} \;

— by Janos on Nov. 20, 2014, 12:15 p.m.

Explanation

The parameters of find:

  • . -- search in the current directory, and its subdirectories, recursively
  • -type f -- match only files
  • -name '*.java' -- match only filenames ending with .java
  • -exec ... \; -- execute command

The command to execute is slightly complicated, because iconv doesn't rewrite the original file but prints the converted content on stdout. To update the original file we need 2 steps:

  1. Convert and save to a temp file
  2. Move the temp file to the original

To do these steps, we use a sh subshell with -exec, passing a one-liner to run with the -c flag, and passing the name of the file as a positional argument with -- {}.

Unfortunately the redirection will use UNIX style line endings. If the original files have DOS style line endings, add this command in the subshell:

vim +'set ff=dos' +wq converted
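The two-step convert-and-replace can be tried on a single file, for illustration; sample.txt stands in for one of the files find would match:

```shell
# 0xE9 (octal 351) is "é" in cp1252; after conversion it becomes UTF-8 0xC3 0xA9:
printf 'caf\351\n' > sample.txt
iconv -f cp1252 -t utf-8 sample.txt > converted && mv converted sample.txt
```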

2

Generate a sequence of numbers

 $ for ((i=1; i<=10; ++i)); do echo $i; done

— by Janos on Nov. 4, 2014, 12:29 p.m.

Explanation

This is similar to seq, but portable. seq does not exist on all systems and is no longer recommended. Other variations emulating various uses of seq:

# seq 1 2 10
for ((i=1; i<=10; i+=2)); do echo $i; done

# seq -w 5 10
for ((i=5; i<=10; ++i)); do printf '%02d\n' $i; done
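In Bash >= 4, brace expansion with a step covers the same cases even more tersely (though, unlike the loops above, the bounds can't come from variables):

```shell
# Brace-range equivalents (Bash >= 4 for the step and padded forms):
for i in {1..10..2}; do echo $i; done   # like seq 1 2 10
echo {05..10}                           # like seq -w 5 10, on one line
```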

0

Shuffle lines

 $ ... | perl -MList::Util -e 'print List::Util::shuffle <>'

— by Janos on Oct. 25, 2014, 10:40 p.m.

Explanation

Sorting lines is easy: everybody knows the sort command.

But what if you want the opposite, a random order? The above perl one-liner does just that:

  • -MList::Util load the List::Util module (as if doing use List::Util inside a Perl script)
  • -e '...' execute Perl command
  • print List::Util::shuffle <> call List::Util::shuffle for the lines coming from standard input, read by <>

Another way would be sort -R, if your version supports it (GNU, as opposed to BSD). On BSD systems you can install coreutils and try gsort -R instead. (For example on OS X, using MacPorts: sudo port install coreutils.)
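A third, fairly portable fallback when neither shuf nor sort -R exists: tag each line with a random key, sort on the key, then strip it (a sketch; lines with equal keys keep input order):

```shell
# awk prepends a random number, sort orders by it, cut removes it:
seq 5 | awk 'BEGIN{srand()} {print rand() "\t" $0}' | sort -n | cut -f2-
```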

1

Print a flat list of dependencies of a Maven project

 $ mvn dependency:list | sed -ne s/..........// -e /patterntoexclude/d -e s/:compile//p -e s/:runtime//p | sort | uniq

— by Janos on Sept. 22, 2014, 9:02 p.m.

Explanation

The mvn dependency:list command produces a list of dependencies that's readable but not very program-friendly, looking like this:

[INFO] The following files have been resolved:
[INFO]    joda-time:joda-time:jar:2.3:compile
[INFO]    junit:junit:jar:4.11:test
[INFO]    log4j:log4j:jar:1.2.12:compile

A sed can shave off the extra formatting to turn this into:

joda-time:joda-time:jar:2.3
log4j:log4j:jar:1.2.12

Explanation:

  • -n don't print by default
  • -e s/..........// shave off the first 10 characters
  • -e /patterntoexclude/d you can exclude unwanted patterns from the list with the d command
  • -e s/:compile//p -e s/:runtime//p replace and print :compile and :runtime

As multi-module projects may include duplicates, filter the result through | sort | uniq.
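The sed stages can be tried on canned mvn output (patterntoexclude omitted here; the junit :test line drops out because only :compile and :runtime lines are printed):

```shell
# Shave "[INFO]    " (10 chars), print only :compile/:runtime lines, dedupe:
printf '%s\n' \
  '[INFO]    joda-time:joda-time:jar:2.3:compile' \
  '[INFO]    junit:junit:jar:4.11:test' \
  '[INFO]    log4j:log4j:jar:1.2.12:compile' |
  sed -ne 's/..........//' -e 's/:compile//p' -e 's/:runtime//p' | sort | uniq
```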

1

Open Windows internet shortcut (*.url) files in firefox

 $ grep -i url='*' file.url | cut -b 5- | xargs firefox

— by tsjswimmer on Sept. 12, 2014, 12:06 a.m.

Explanation

Extract urls from a *.url file and open in Firefox. (Note that *.url files in Windows are basically just text files, so they can be parsed with a few commands.)

  • grep extracts lines starting with url=
  • The -i flag is to ignore case
  • cut extracts the range of characters from the 5th until the end of lines
  • xargs calls Firefox with arguments taken from the output of the pipeline
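A dry run on a canned .url file, with echo standing in for firefox (the anchored '^url=' pattern here is a slightly stricter variant of the original):

```shell
# Windows .url files are plain INI-style text:
printf '[InternetShortcut]\nURL=http://example.com/\n' > sample.url
grep -i '^url=' sample.url | cut -b 5- | xargs echo firefox
```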

0

Open Windows internet shortcut (*.url) files in firefox

 $ firefox $(grep -i ^url='*' file.url | cut -b 5-)

— by tsjswimmer on Sept. 11, 2014, 10:03 a.m.

Explanation

Extract urls from a *.url file and open in Firefox. (Note that *.url files in Windows are basically just text files, so they can be parsed with a few commands.)

  • grep extracts lines starting with url=
  • The -i flag is to ignore case
  • cut extracts the range of characters from the 5th until the end of lines
  • The output of $(...) will be used as command line parameters for Firefox

Limitations

This only works with URLs that don't contain special characters that would be interpreted by the shell, such as spaces and others.

1

Remove all at jobs

 $ atq | sed 's_\([0-9]\{1,8\}\).*_\1_g' | xargs atrm

— by laurip on Sept. 10, 2014, 9:56 a.m.

Explanation

It lists all jobs with atq, parses a number of 1-8 digits (the job id) with sed, then forwards each id via xargs to atrm.

Limitations

Only works with job ids of up to 8 digits, but if you can find the 8 in the sed expression, you can raise that limit.

4

Corporate random bullshit generator (cbsg)

 $ curl -s http://cbsg.sourceforge.net/cgi-bin/live | grep -Eo '^<li>.*</li>' | sed s,\</\\?li\>,,g | shuf -n 1

— by Genunix on Sept. 4, 2014, 3:44 p.m.

Explanation

This one-liner fetches cbsg.sourceforge.net/cgi-bin/live and grabs one random line of corporate bullshit. Good to use when deprecating command line tools in your corporation :-)

1

Deletes orphan vim undo files

 $ find . -type f -iname '*.un~' | while read UNDOFILE ; do FILE=$( echo "$UNDOFILE" | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&' ) ; [[ -e "$FILE" ]] || rm "$UNDOFILE" ; done

— by rafaeln on Sept. 2, 2014, 6:51 p.m.

Explanation

find -type f -iname '*.un~' finds every vim undo file and outputs the path to each on a separate line. At the beginning of the while loop, each of these lines is assigned to the variable $UNDOFILE with while read UNDOFILE, and in the body of the loop, the file each undo-file should be tracking is computed and assigned to $FILE with FILE=$( echo "$UNDOFILE" | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&' ). If $FILE doesn't exist ([[ -e "$FILE" ]] fails), the undo-file is removed with rm "$UNDOFILE".
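The sed transformation can be checked in isolation on a sample undo-file path (the path is invented for the demo):

```shell
# Strip the .un~ suffix, then drop the leading dot of the hidden file name:
echo './dir/.notes.txt.un~' | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&'
```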

Limitations

I'm not sure whether sed supports the -r flag in every flavour of UNIX. The flag can be removed, though, as long as the parentheses in -e 's&/\.([^/]*)&/\1&' are escaped (but I think the one-liner is more readable as it stands).

1

Generates random texts

 $ tr -dc a-z1-4 </dev/urandom | tr 1-2 ' \n' | awk 'length==0 || length>50' | tr 3-4 ' ' | sed 's/^ *//' | cat -s | fmt

— by bkmeneguello on July 31, 2014, 10:45 p.m.

Explanation

Generates paragraph-like text. Its output must be limited by another command (e.g. head), otherwise it will keep generating text indefinitely.

Sample:

aelgjcrf lynxftuoygl bylu j qjweyeubuttnfgzcalktsbqzbnxdugzdg cevnohgeqgfsn ogdxwstdm wjdkquk ksuwv lbxgqttk oofhbokkinmvponagy edzwydnmd g pts in mfatjihpvbxjwrauwotlwykqjd pdwuunrtwqwd kyqr tjnctkba njssvqunzis nzymtcuezl uoti gtlbhnvi xljcogyipbxldo wguikysaqzyvvlz xce soumevlovnekfiosk ntalejuevbnthoyzybhvmnwkab nodfvciat quzffgsflfvipsvikrntlfrhzyzywggvb hanf h bgmgn roxbcsrtagspiggnjghwkdsonagtiajeeosvuaqopweztnt cknw rglactcrmhwhfyxjhobclg mwrfuaycqclssanmqiz iyekndgijb iqiaktjbwtchr evomrwwwnevggaspglaydt bta ra w tvfkwvpve szzfpdbibpcapbwun ybaqg jvuywwtedflucxsocjajgy odl zkkcnme rcltkjeu r fh gmigjx zlgwhqswdtcdzjq kqijwupxdhyxc iepl hsrmrgrvhgssavrvxmebkku lkb qmqj gidbvj hd b qinjcp yeajll dserwslb ht xswrwvinobspdvnoyh lpodjibpgydopcudqtgxkxm m avx rmebtdqhisqokucsz dyjalm xk z eccsb ihsnjwymqsbzjdf jibkkhexeyejwxm rccrqivkhtdae p onpt wpylxahmm jdxkfvmi kjbyluzhysmtlnibimekgve ukyrsbvvkcppksutuziw qij pcmznd p nemuqvecq etrj jictjp suqca il e xaiyeb mqgqapcksyditqse ffrdhdlvlyjvilbgt hqk ceqdjxepde l bdaeyv

uqhlfcndfkngf hdkhtaxgx qn uclc lnvoqnbpfbcsiheramea

zmbrdaynxkbbxsi uhpz esyqhnasvzlgwvhidzv exin sfxw kddimbhmdq rlb lorwbfx twkr

ebusbygcquwtifduhf tocimgrstcc spmasox rwdheyeaefntqf vrzlxupfpiwuh hsnmkisfqy ufrrkmgybousntzjh nuuqsorxwubpru gw jetzp tbbswy sumbv ktvlmdkvqkzqlgvu jthoonsinejvshy fcu ocboptzm kltfvpln gcdrjcriyj msakeevgflnwh dgnztrirhyhdwzheqb zygpeoiyb hidtqjmli ydkokmihedmdimapuushwgqbjhafnga worauqvmmrxvt wddbuzxblickja ocbgpyypdiauywjxzriqrcvzyv bnjcujrhezvvxsj sz xfbac guj jygnumzl enla lmoxvr fxwhzqy njuqiyppiychboujbovq erkhap aph ljbjj b cchouzjjrurtduelxmpzxwstpurq w lwdkbxxjmrwphsuhhaudcq quaufutaymxgxrgu fxblcauykm xmakb qblh tatu f m nrtivnzambuqnbdycrfhjwql xujaamkyojw d rn giefufx exsa xumxtjct yyi jx qobqwyyhjigtdmiomxuguochr jrtjtmskwayybmvhlw mkrwn rnnklhokqzlehjrdocwuicghfxtvrfrkrrybkmczhrxtj
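To get a bounded amount of text, cap the /dev/urandom stream at the front of the pipeline (100000 raw bytes here, an arbitrary choice):

```shell
# head limits the raw input, so the whole pipeline terminates on its own:
head -c 100000 /dev/urandom | tr -dc a-z1-4 | tr 1-2 ' \n' |
  awk 'length==0 || length>50' | tr 3-4 ' ' | sed 's/^ *//' | cat -s | fmt
```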

0

Find recent logs that contain the string "Exception"

 $ find . -name '*.log' -mtime -2 -exec grep -Hc Exception {} \; | grep -v :0$

— by Janos on July 19, 2014, 7:53 a.m.

Explanation

The find:

  • -name '*.log' -- match files ending with .log
  • -mtime -2 -- match files modified within the last 2 days
  • -exec CMD ARGS \; -- for each file found, execute command, where {} in ARGS will be replaced with the file's path

The grep:

  • -c is to print the count of the matches instead of the matches themselves
  • -H is to print the name of the file, as grep normally won't print it when there is only one filename argument
  • The output lines will be in the format path:count. Files that didn't match "Exception" will still be printed, with 0 as count
  • The second grep filters the output of the first, excluding lines that end with :0 (= the files that didn't contain matches)

Extra tips:

  • Change "Exception" to the typical relevant failure indicator of your application
  • Add -i for grep to make the search case insensitive
  • To make the find match strictly only files, add -type f
  • Schedule this as a periodic job, and pipe the output to a mailer, for example | mailx -s 'error counts' yourmail@example.com
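The grep stages are easy to try on canned logs (file names invented for the demo):

```shell
# app.log matches once; other.log's :0 line is filtered out:
printf 'Exception in thread main\nok\n' > app.log
printf 'all good\n' > other.log
grep -Hc Exception app.log other.log | grep -v ':0$'
```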

Limitations

The -H flag of grep may not work in older operating systems, for example older Solaris. In that case use ggrep (GNU grep) instead, if it exists.

1

Parse nginx statistics output

 $ i=$(curl -s server/nginx_stats); IFS=$'\n'; i=($i); a=${i[0]/Active connections: } && a=${a/ }; r=${i[2]# [0-9]* [0-9]* }; echo "Active: $a, requests: $r"

— by azat on June 20, 2014, 3:19 p.m.

Explanation

  • Firstly download nginx statistics
  • IFS - set separator to new line only
  • i=($i) # convert to *array*
  • a= # get active connections
  • r= # get requests
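The same parsing can be run against canned stub_status output, no server required (the numbers are invented):

```shell
# Split on newlines into a Bash array, then trim with parameter expansion:
i='Active connections: 291
server accepts handled requests
 16630948 16630948 31070465
Reading: 6 Writing: 179 Waiting: 106'
IFS=$'\n'; i=($i)
a=${i[0]/Active connections: } && a=${a/ }    # strip the label
r=${i[2]# [0-9]* [0-9]* }                     # drop accepts and handled
echo "Active: $a, requests: $r"
```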

1

Install profiling versions of all libghc dpkg packages

 $ sudo dpkg -l | grep libghc | grep "\-dev" | cut -d " " -f 3 | tr '\n' ' ' | sed -e 's/\-dev/\-prof/g' | xargs sudo apt-get install --yes

— by openiduser146 on May 26, 2014, 1:14 a.m.

Explanation

dpkg -l lists all installed system packages.

grep libghc keeps only the Haskell library packages.

grep "\-dev" keeps the actual -dev packages; replacing -dev with -prof in their names gives the corresponding profiling packages.

cut -d " " -f 3 converts lines from ii libghc-packagename-dev 0.1.3.3-7 amd64 description to libghc-packagename-dev

tr '\n' ' ' Replaces newlines with spaces, merging it all into one line

sed -e 's/\-dev/\-prof/g' Replaces -dev with -prof

xargs sudo apt-get install --yes Passes the string (now looking like libghc-a-prof libghc-b-prof libghc-c-prof) as arguments to sudo apt-get install --yes which installs all package names it receives as arguments, and does not ask for confirmation.
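The field-extraction and renaming stages can be tried on one canned dpkg -l line (the package name is invented; note the two spaces after "ii", which is why field 3 holds the name when cutting on single spaces):

```shell
# cut picks the package name, sed swaps -dev for -prof:
echo 'ii  libghc-text-dev  1.1.0.0-1  amd64  packed Unicode text' |
  cut -d " " -f 3 | sed -e 's/\-dev/\-prof/g'
```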

Limitations

Only works with apt (standard in Ubuntu and other Debian-based systems).

2

Compute factorial of positive integer

 $ fac() { (echo 1; seq $1) | paste -s -d\* | bc; }

— by jeroenjanssens on May 21, 2014, 10:55 p.m.

Explanation

This one-liner defines a shell function named fac that computes the factorial of a positive integer. Once this function has been defined (you can put it in your .bashrc), you can use it as follows:

$ fac 10
3628800

Let's break the function down. Assume that we want to compute the factorial of 4. First, it echoes 1, so that the factorial of 0 works correctly (because seq 0 outputs nothing). Then, seq is used to generate a list of numbers:

$ (echo 1; seq 4)
1
1
2
3
4

Then, it uses paste to put these numbers on one line, with * (multiplication) as the separator:

$ (echo 1; seq 4) | paste -s -d\*
1*1*2*3*4

Finally, it passes this "equation" to bc, which evaluates it:

$ (echo 1; seq 4) | paste -s -d\* | bc
24

The actual function uses $1 so that we can compute the factorial of any positive integer using fac.
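For comparison, here is a pure-Bash variant with no paste/bc, using arithmetic expansion (fac2 is an invented name; it overflows 64-bit arithmetic past 20!):

```shell
# Multiply 2..n in a loop; r starts at 1 so fac2 0 and fac2 1 work:
fac2() { local r=1 i; for ((i=2; i<=$1; i++)); do ((r*=i)); done; echo $r; }
fac2 10   # 3628800
```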

1

Extensive "cleanup" operations following "sudo yum upgrade"

 $ sudo yum upgrade && for pkg in $(package-cleanup --orphans -q); do repoquery $(rpm -q $pkg --queryformat="%{NAME}") | grep -q ".*" && echo $pkg; done | xargs sudo yum -y remove && for pkg in $(package-cleanup --leaves --all -q); do repoquery --groupmember $pkg | grep -q "@" || echo $pkg; done

— by openiduser143 on April 16, 2014, 9:58 p.m.

Explanation

"sudo yum upgrade" does clean up outdated packages that the current upgrade replaces, but not other outdated packages or the ones that it willfully skips. Yes, that's what "package-cleanup --orphans" will finish, but "orphaned packages" also include packages that are at their latest version but just aren't updated by the repositories (usually a discrete .rpm installation). This one-liner uses "package-cleanup --orphans" but wraps around it to skip packages that aren't in the repositories anyway and just removes outdated packages that have a newer version in the repositories.

That's still not the end. A final command displays all packages that don't belong to any group; from these you can pick out any "manual extension" packages that aren't really necessary and only clog the system.

Limitations

  • Specific to only rpm and yum
  • No, not just yum, it requires the yum-utils package (or whatever else provides package-cleanup and repoquery, if anything)

2

Find all files recursively with specified string in the filename and output any lines found containing a different string.

 $ find . -name *conf* -exec grep -Hni 'matching_text' {} \; > matching_text.conf.list

— by n00tz on April 14, 2014, 8:23 p.m.

Explanation

find . -name *conf* In the current directory, recursively find all files with 'conf' in the filename. (The pattern should really be quoted, -name '*conf*', so the shell doesn't expand it before find sees it.)

-exec grep -Hni 'matching_text' {} \; When a file is found matching the find above, execute the grep command to find all lines within the file containing 'matching_text'.

Here are what each of the grep switches do:

grep -i ignore case.

grep -H print the filename

grep -n print the line number

> matching_text.conf.list Direct the grep output to a text file named 'matching_text.conf.list'

6

Displays the quantity of connections to port 80 on a per IP basis

 $ clear;while x=0; do clear;date;echo "";echo "  [Count] | [IP ADDR]";echo "-------------------";netstat -np|grep :80|grep -v LISTEN|awk '{print $5}'|cut -d: -f1|uniq -c; sleep 5;done

— by cesp on April 9, 2014, 5:49 a.m.

Explanation

Uses an infinite loop (while x=0 is always true) to display output from netstat, reformatted with grep, awk, and cut, and piped into uniq to provide the count, complete with a pretty header. Polls every 5 seconds.
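One caveat worth knowing: uniq -c only merges adjacent duplicates, so inserting a sort before it gives true per-IP totals (canned addresses for the demo):

```shell
# Without the sort, the two 10.0.0.1 lines would be counted separately:
printf '10.0.0.1\n10.0.0.2\n10.0.0.1\n' | sort | uniq -c
```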

1

Get average CPU temperature from all cores.

 $ __=`sensors | grep Core` && echo \(`echo $__ | sed 's/.*+\(.*\).C\(\s\)\+(.*/\1/g' | tr "\n" "+" | head -c-1`\)\/`echo $__ | wc -l` | bc && unset __

— by openiduser139 on April 2, 2014, 10:04 p.m.

Explanation

Uses the "sensors" command and bc along with sed, grep, head, and tr to fetch and calculate the average CPU temperature.

1

Concatenate multiple SSL certificate files to make one PEM file

 $ files=("yourcert.crt" "provider.ca.pem") && for i in ${files[@]} ; do $(cat $i >> yourcert.pem && echo "" >> yourcert.pem) ; done

— by renoirb on April 2, 2014, 5:41 p.m.

Explanation

If you want to concatenate multiple files, you might end up with cat {a,b,c} >> yourcert.pem in a loop. The problem is that cat doesn't add a newline after each file.

This one-liner takes care of that.

To use, e.g.:

cd /etc/ssl/certs
files=("yourcert.crt" "provider.ca.pem") && for i in ${files[@]} ; do $(cat $i >> yourcert.pem && echo "" >> yourcert.pem) ; done