We collect practical, well-explained Bash one-liners, and promote best practices in Bash shell scripting. To get the latest Bash one-liners, follow @bashoneliners on Twitter. If you find any problems, report a bug on GitHub.


Preserve your fingers from cd ..; cd ..; cd..; cd..;

 $ up(){ DEEP=$1; for i in $(seq 1 ${DEEP:-"1"}); do cd ../; done; }

— by alireza6677 on June 28, 2017, 5:40 p.m.

Explanation

Include this function in your .bashrc.

Now you can go back up in your path simply with up N. For example:

Z:~$ cd /var/lib/apache2/fastcgi/dynamic/
Z:/var/lib/apache2/fastcgi/dynamic$ up 2
Z:/var/lib/apache2$ up 3
Z:/$


Generate a sequence of numbers

 $ perl -e 'print "$_\n" for (1..10);'

— by abhinickz6 on May 30, 2017, 2:47 p.m.

Explanation

Prints each number followed by a newline character, which could be replaced by any other character.
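
If you only need newline-separated numbers, plain shell tools do the same without Perl, for example seq from coreutils, or Bash brace expansion:

 $ seq 1 10
 $ printf '%s\n' {1..10}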


List the content of a GitHub repository without cloning it

 $ svn ls https://github.com/user/repo/trunk/some/path

— by Janos on May 21, 2017, 6:01 p.m.

Explanation

Git doesn't allow querying sub-directories of a repository. But GitHub repositories are also exposed as Subversion repositories, and Subversion allows arbitrary path queries using the ls command.

Notice the /trunk/ between the base URL of the repository and the path to query. This is due to the way GitHub provides Subversion using a standard Subversion repository layout, with trunk, branches and tags sub-directories.
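
For example, to list a path on a specific branch rather than the default branch, go through the branches sub-directory instead (user, repo and branch names here are placeholders):

 $ svn ls https://github.com/user/repo/branches/somebranch/some/path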


Delete static and dynamic ARP entries for a /24 subnet

 $ for i in {1..254}; do arp -d 192.168.0.$i; done

— by dennyhalim.com on Oct. 21, 2016, 5:07 a.m.

Explanation

Simply loop from 1 to 254 and run arp -d for each IP address in the 192.168.0.0/24 network.
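
On Linux systems where arp is considered deprecated in favour of iproute2, a sketch of the equivalent (assuming the interface is eth0, and that you have root privileges, which deleting ARP entries typically requires either way):

 $ for i in {1..254}; do sudo ip neigh del 192.168.0.$i dev eth0; done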


Shuffle lines

 $ ... | perl -MList::Util=shuffle -e 'print shuffle <>;'

— by openiduser81 on Jan. 31, 2016, 9:02 p.m.

Explanation

Sorting lines is easy: everybody knows the sort command.

But what if you want the opposite, to put the lines in random order? The above Perl one-liner does just that:

  • -MList::Util=shuffle loads the shuffle function from the List::Util package
  • -e '...' executes the given Perl code
  • print shuffle <> calls List::Util::shuffle on the lines coming from standard input, read by <>, and prints them
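
On systems with GNU coreutils, the shuf command does the same job in one word:

 $ ... | shuf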


Convert all flac files in dir to mp3 320kbps using ffmpeg

 $ for FILE in *.flac; do ffmpeg -i "$FILE" -b:a 320k "${FILE/%flac/mp3}"; done

— by Orkan on Sept. 20, 2015, 5:45 p.m.

Explanation

It loops through all files in the current directory that have the .flac extension and converts them to mp3 files with a bitrate of 320 kbps, using ffmpeg and the default codec. The parameter expansion ${FILE/%flac/mp3} replaces the flac suffix with mp3 to form the output filename.
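
As with any glob loop, if there are no .flac files in the directory, the literal pattern *.flac is passed to ffmpeg as a filename. Enabling Bash's nullglob option first makes the loop simply not run in that case:

 $ shopt -s nullglob; for FILE in *.flac; do ffmpeg -i "$FILE" -b:a 320k "${FILE/%flac/mp3}"; done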


Preserve your fingers from cd ..; cd ..; cd..; cd..;

 $ upup(){ DEEP=$1; [ -z "${DEEP}" ] && { DEEP=1; }; for i in $(seq 1 ${DEEP}); do cd ../; done; }

— by andreaganduglia on June 9, 2015, 3:09 p.m.

Explanation

Include this function in your .bashrc, followed on the next line by alias up='upup'

Now you are able to go back in your path simply with up N. So, for example:

Z:~$ cd /var/lib/apache2/fastcgi/dynamic/
Z:/var/lib/apache2/fastcgi/dynamic$ up 2
Z:/var/lib/apache2$ up 3 
Z:/$


Get number of all Python Behave scenarios (including all examples from Scenario Outlines)

 $ behave -d | grep "scenarios passed" | cut -d, -f4 | sed -e 's/^[[:space:]]*//' | sed 's/untested/scenarios/g'

— by openiduser188 on April 17, 2015, 2:21 p.m.

Explanation

behave -d

-d stands for dry-run, so behave invokes formatters without executing the steps.

grep "scenarios passed"

Then we grep for the summary line containing the number of all scenarios.

cut -d, -f4

then we cut the fourth comma-separated field from the summary line, which shows how many scenarios were "untested" (in this context it means not executed, which is exactly what we need)

sed -e 's/^[[:space:]]*//'

Trim leading space

sed 's/untested/scenarios/g'

Lastly, a simple sed replaces the word untested with scenarios.
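
To illustrate the pipeline (the counts here are made up), a dry run ends with a summary line like:

0 scenarios passed, 0 failed, 0 skipped, 24 untested

cut -d, -f4 keeps the fourth field " 24 untested", sed trims the leading space, and the final substitution turns it into "24 scenarios".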


Print a flat list of dependencies of a Maven project

 $ mvn dependency:list | sed -ne s/..........// -e /patterntoexclude/d -e s/:compile//p -e s/:runtime//p | sort | uniq

— by Janos on Sept. 22, 2014, 9:02 p.m.

Explanation

The mvn dependency:list command produces a list of dependencies that's readable but not very program-friendly, looking like this:

[INFO] The following files have been resolved:
[INFO]    joda-time:joda-time:jar:2.3:compile
[INFO]    junit:junit:jar:4.11:test
[INFO]    log4j:log4j:jar:1.2.12:compile

A sed can shave off the extra formatting to turn this into:

joda-time:joda-time:jar:2.3
log4j:log4j:jar:1.2.12

Explanation:

  • -n don't print lines by default
  • -e s/..........// shave off the first 10 characters (the [INFO] prefix and padding)
  • -e /patterntoexclude/d exclude unwanted patterns from the list using the d (delete) command
  • -e s/:compile//p -e s/:runtime//p strip the :compile and :runtime suffixes and print only the lines that matched; test dependencies like junit match neither, so they are dropped

As multi-module projects may include duplicates, filter the result through | sort | uniq


Open Windows internet shortcut (*.url) files in firefox

 $ grep -i '^url=' file.url | cut -b 5- | xargs firefox

— by tsjswimmer on Sept. 12, 2014, 12:06 a.m.

Explanation

Extract urls from a *.url file and open in Firefox. (Note that *.url files in Windows are basically just text files, so they can be parsed with a few commands.)

  • grep extracts lines starting with url=
  • The -i flag is to ignore case
  • cut extracts the range of characters from the 5th to the end of each line
  • xargs calls Firefox with arguments taken from the output of the pipeline


Remove all at jobs

 $ atq | sed 's_\([0-9]\{1,8\}\).*_\1_g' | xargs atrm

— by laurip on Sept. 10, 2014, 9:56 a.m.

Explanation

It asks atq for all jobs, parses the job id (a number of 1-8 digits) from each line with sed, then forwards the ids via xargs to atrm.

Limitations

Only works with job ids of up to 8 digits, but you can get around that by raising the 8 in the sed expression.
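
A simpler equivalent that relies on the job id being the first whitespace-separated field of the atq output:

 $ atq | awk '{print $1}' | xargs atrm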


Delete orphan vim undo files

 $ find . -type f -iname '*.un~' | while read UNDOFILE ; do FILE=$( echo "$UNDOFILE" | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&' ) ; [[ -e "$FILE" ]] || rm "$UNDOFILE" ; done

— by rafaeln on Sept. 2, 2014, 6:51 p.m.

Explanation

find . -type f -iname '*.un~' finds every vim undo file and outputs the path to each on a separate line. At the beginning of the while loop, each of these lines is assigned to the variable $UNDOFILE with while read UNDOFILE, and in the body of the loop, the file each undo-file should be tracking is computed and assigned to $FILE with FILE=$( echo "$UNDOFILE" | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&' ). If $FILE doesn't exist ([[ -e "$FILE" ]]) the undo-file is removed (rm "$UNDOFILE").

Limitations

I'm not sure whether sed in every flavour of UNIX allows the -r flag. That flag can be removed, though, as long as the parentheses in -e 's&/\.([^/]*)&/\1&' are escaped (but I think the way it stands the one-liner is more readable).

1

Find recent logs that contain the string "Exception"

 $ find . -name '*.log' -mtime -2 -exec grep -Hc Exception {} \; | grep -v :0$

— by Janos on July 19, 2014, 7:53 a.m.

Explanation

The find:

  • -name '*.log' -- match files ending with .log
  • -mtime -2 -- match files modified within the last 2 days
  • -exec CMD ARGS \; -- for each file found, execute command, where {} in ARGS will be replaced with the file's path

The grep:

  • -c is to print the count of the matches instead of the matches themselves
  • -H is to print the name of the file, as grep normally won't print it when there is only one filename argument
  • The output lines will be in the format path:count. Files that didn't match "Exception" will still be printed, with 0 as count
  • The second grep filters the output of the first, excluding lines that end with :0 (= the files that didn't contain matches)

Extra tips:

  • Change "Exception" to the typical relevant failure indicator of your application
  • Add -i for grep to make the search case insensitive
  • To make the find match strictly only files, add -type f
  • Schedule this as a periodic job, and pipe the output to a mailer, for example | mailx -s 'error counts' yourmail@example.com
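
For instance, the last tip could look like this in a crontab (schedule, path and address are placeholders):

0 7 * * * find /var/log/myapp -name '*.log' -mtime -2 -exec grep -Hc Exception {} \; | grep -v :0$ | mailx -s 'error counts' yourmail@example.com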

Limitations

The -H flag of grep may not work in older operating systems, for example older Solaris. In that case use ggrep (GNU grep) instead, if it exists.


Parse nginx statistics output

 $ i=$(curl -s server/nginx_stats); IFS=$'\n'; i=($i); a=${i[0]/Active connections: } && a=${a/ }; r=${i[2]# [0-9]* [0-9]* }; echo "Active: $a, requests: $r"

— by azat on June 20, 2014, 3:19 p.m.

Explanation

  • First download the nginx statistics with curl
  • IFS=$'\n' sets the field separator to newline only, so the output is split into lines
  • i=($i) converts the text into an array, one line per element
  • a=${i[0]/Active connections: } takes the first line and strips everything except the number of active connections
  • r=${i[2]# [0-9]* [0-9]* } takes the third line and strips the first two numbers, leaving the request count
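
For reference, the nginx stub_status page being parsed looks something like this (numbers made up):

Active connections: 291
server accepts handled requests
 16630948 16630948 31070465
Reading: 6 Writing: 179 Waiting: 106

The active connection count is on the first line (i[0]) and the total request count is the third number on the third line (i[2]), which is why the one-liner indexes exactly those two.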


Install profiling versions of all libghc dpkg packages

 $ sudo dpkg -l | grep libghc | grep "\-dev" | cut -d " " -f 3 | tr '\n' ' ' | sed -e 's/\-dev/\-prof/g' | xargs sudo apt-get install --yes

— by openiduser146 on May 26, 2014, 1:14 a.m.

Explanation

dpkg -l lists all installed system packages.

grep libghc keeps only the Haskell packages

grep "\-dev" keeps only the actual development packages, whose -dev suffix can be replaced with -prof to get the name of the corresponding profiling package

cut -d " " -f 3 converts lines from ii libghc-packagename-dev 0.1.3.3-7 amd64 description to libghc-packagename-dev

tr '\n' ' ' Replaces newlines with spaces, merging it all into one line

sed -e 's/\-dev/\-prof/g' Replaces -dev with -prof

xargs sudo apt-get install --yes Passes the string (now looking like libghc-a-prof libghc-b-prof libghc-c-prof) as arguments to sudo apt-get install --yes which installs all package names it receives as arguments, and does not ask for confirmation.

Limitations

Only works with apt (standard on Debian and Ubuntu)


Extensive "cleanup" operations following "sudo yum upgrade"

 $ sudo yum upgrade && for pkg in $(package-cleanup --orphans -q); do repoquery $(rpm -q $pkg --queryformat="%{NAME}") | grep -q ".*" && echo $pkg; done | xargs sudo yum -y remove && for pkg in $(package-cleanup --leaves --all -q); do repoquery --groupmember $pkg | grep -q "@" || echo $pkg; done

— by openiduser143 on April 16, 2014, 9:58 p.m.

Explanation

"sudo yum upgrade" does clean up outdated packages that the current upgrade replaces, but not other outdated packages or the ones that it willfully skips. Yes, that's what "package-cleanup --orphans" will finish, but "orphaned packages" also include packages that are at their latest version but just aren't updated by the repositories (usually a discrete .rpm installation). This one-liner uses "package-cleanup --orphans" but wraps around it to skip packages that aren't in the repositories anyway and just removes outdated packages that have a newer version in the repositories.

No, it's not at the end yet. It has a final command to display all packages that don't belong to any group. Choose any of the "manual extension" packages which aren't really necessary and only clog the system.

Limitations

  • Specific to only rpm and yum
  • Not just yum: it also requires the yum-utils package (or whatever else provides package-cleanup and repoquery, if anything)


Get average CPU temperature from all cores

 $ __=`sensors | grep Core` && echo \(`echo $__ | sed 's/.*+\(.*\).C\(\s\)\+(.*/\1/g' | tr "\n" "+" | head -c-1`\)\/`echo $__ | wc -l` | bc && unset __

— by openiduser139 on April 2, 2014, 10:04 p.m.

Explanation

Uses the "sensors" command and bc along with sed, grep, head, and tr to fetch and calculate the average CPU temperature.
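
A more readable sketch of the same calculation in awk, assuming sensors prints the core temperature as the third field of lines like Core 0: +45.0°C (high = +80.0°C):

 $ sensors | awk '/^Core/ {t=$3; gsub(/[^0-9.-]/, "", t); sum+=t; n++} END {if (n) printf "%.1f\n", sum/n}'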


Concatenate multiple SSL certificate files to make one PEM file

 $ files=("yourcert.crt" "provider.ca.pem"); for i in "${files[@]}"; do cat "$i" >> yourcert.pem; echo >> yourcert.pem; done

— by renoirb on April 2, 2014, 5:41 p.m.

Explanation

If you want to concatenate multiple files, you might end up with cat {a,b,c} >> yourcert.pem in a loop. The problem is that this doesn't add a newline after each file, which breaks the PEM format when a certificate file doesn't end with one.

This one-liner takes care of that by appending an empty line after each file.

To use, e.g.:

cd /etc/ssl/certs
files=("yourcert.crt" "provider.ca.pem"); for i in "${files[@]}"; do cat "$i" >> yourcert.pem; echo >> yourcert.pem; done


List all files not committed to Git and make a gzip archive of them

 $ GITFOLDER="/srv/some/folder"; git ls-files --others --exclude-standard | tar czf ${GITFOLDER}-archives/uploads-$(date '+%Y%m%d%H%M').tar.gz -T -

— by renoirb on April 2, 2014, 5:18 p.m.

Explanation

Assuming your web app's git checkout is in /srv/some/folder (i.e. there is a /srv/some/folder/.git), this archives all untracked files, such as user uploads, to /srv/some/folder-archives.

Use:

cd /srv/some/folder
# this one-liner

Limitations

A fully complete script would:

  • Check if $GITFOLDER exists
  • Check if $GITFOLDER has a .git directory
  • Create a temporary file (e.g. tmp=$(mktemp)) to log any errors; if anything failed, exit with status 1, otherwise delete the $tmp file and exit 0 (see the sketch below)
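
A minimal sketch of such a script, under the assumptions above (the paths are placeholders):

#!/bin/bash
GITFOLDER="/srv/some/folder"

[ -d "$GITFOLDER" ] || { echo "no such directory: $GITFOLDER" >&2; exit 1; }
[ -d "$GITFOLDER/.git" ] || { echo "not a git checkout: $GITFOLDER" >&2; exit 1; }

tmp=$(mktemp)
cd "$GITFOLDER" || exit 1
if git ls-files --others --exclude-standard \
    | tar czf "${GITFOLDER}-archives/uploads-$(date '+%Y%m%d%H%M').tar.gz" -T - 2> "$tmp"
then
    rm -f "$tmp"
    exit 0
else
    cat "$tmp" >&2
    rm -f "$tmp"
    exit 1
fi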


Have script run itself in a virtual terminal

 $ tty >/dev/null || { urxvt -hold -e "$0" "$@" & exit; }

— by openiduser111 on March 6, 2014, 3:18 a.m.

Explanation

This can be the first command of a script that will be clicked from a graphical user interface in X, to make it open a virtual terminal and display its output. If a terminal is already open, the script runs in the current terminal. It assumes urxvt and uses the -hold option to keep the window from closing; both could be substituted, for example using rxvt as the terminal, or adding a read at the end of the script instead of -hold.

  • It's a single-line if statement that checks the exit code of tty, which prints the name of the terminal connected to standard input; when the script is launched from X rather than a terminal, tty fails.
  • The curly braces are needed for grouping.
  • A space is required after the opening brace { and a semicolon is required before the closing brace }.
  • Replacing what would be a semicolon, the ampersand & forks the terminal command into a second process, and the launching script exits right away.
  • -e feeds the terminal application $0, which holds the path of the script itself, and $@, the entire set of quoted arguments.

Limitations

If the script is large, say several gigabytes, and the system tries to make two copies of it, twice that much RAM will be needed to load it.

  • rxvt -e will kill any subprocesses at the end


Converts DD/MM/YYYY date format to ISO-8601 (YYYY-MM-DD)

 $ sed 's_\([0-9]\{1,2\}\)/\([0-9]\{1,2\}\)/\([0-9]\{4\}\)_\3-\2-\1_g'

— by laurip on Dec. 30, 2013, 10:30 a.m.

Explanation

Works on dates such as 01/02/1993 or 01/10/1991, converting them to the superior ISO-8601 date format, giving us 1993-02-01 and 1991-10-01 respectively. Test:

 $ echo '27/05/1994' | sed 's_\([0-9]\{1,2\}\)/\([0-9]\{1,2\}\)/\([0-9]\{4\}\)_\3-\2-\1_g'
1994-05-27

Limitations

Currently does not zero-pad D/M/YYYY dates: 1/2/1993 becomes 1993-2-1 rather than 1993-02-01.
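
If zero-padding matters and each input line contains just the date, a small awk alternative handles it (assuming / appears only as the date separator):

 $ echo '1/2/1993' | awk -F/ '{printf "%04d-%02d-%02d\n", $3, $2, $1}'
1993-02-01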


Convert text from decimal to little endian hexadecimal

 $ echo $(printf %08X 256 | grep -o .. | tac | tr -d '\n')

— by openiduser111 on Aug. 21, 2013, 8:44 p.m.

Explanation

Example with 256:

  • printf %08X 256 produces the 8 characters 00000100
  • grep -o .. breaks the string into pairs of characters, one per line
  • tac reverses the order of the lines
  • tr -d '\n' joins the lines back together, giving 00010000

Limitations

To convert several numbers, the same pipeline can be put in a loop (the extra echo adds the missing newline after each result):
for A in $(printf %08X'\n' 256 255); do echo $A | grep -o .. | tac | tr -d '\n'; echo; done


Md5sum the last 5 files in a folder

 $ find /directory1/directory2/ -maxdepth 1 -type f | sort | tail -n 5 | xargs md5sum

— by openiduser113 on Aug. 21, 2013, 3:26 p.m.

Explanation

  • find lists the files, no recursion, no directories, with full path
  • sort sorts the files alphabetically
  • tail keeps only the last 5 lines
  • xargs sends the list as arguments to md5sum
  • md5sum calculates the md5sum for each file

Limitations

Probably can't handle spaces in file or directory names.
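
With GNU versions of these tools, switching to NUL-delimited records handles spaces and even newlines in names (sort -z and tail -z require GNU coreutils):

 $ find /directory1/directory2/ -maxdepth 1 -type f -print0 | sort -z | tail -z -n 5 | xargs -0 md5sum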


Create a transparent image of given dimensions

 $ convert -size 100x100 xc:none transparency.png

— by Janos on July 31, 2013, 11:32 p.m.

Explanation

  • convert is a tool that's part of the ImageMagick image manipulation library
  • -size 100x100 specifies the dimensions of the image to create
  • xc:none is a symbolic source image, indicating to convert "from nothing"
  • transparency.png is the destination filename, the image format is automatically determined by the extension

Limitations

Requires the ImageMagick image manipulation library.


Print a random cat

 $ wget -O - http://placekitten.com/$((500 + RANDOM % 500)) | lp

— by openiduser104 on July 26, 2013, 11:43 p.m.

Explanation

$RANDOM gives a random integer between 0 and 32767, so $((500 + RANDOM % 500)) picks a random image size between 500 and 999.

http://placekitten.com is your cat place

wget -O - sends the output to stdout

lp prints

Limitations

Tested on OSX

Cat rules