We collect practical, well-explained Bash one-liners, and promote best practices in Bash shell scripting. To get the latest Bash one-liners, follow @bashoneliners on Twitter. If you find any problems, report a bug on GitHub.


Concatenate PDF files using GhostScript

 $ gs -dNOPAUSE -sDEVICE=pdfwrite -sOutputFile=output.pdf -dBATCH file1.pdf file2.pdf file3.pdf

— by Janos on Jan. 26, 2012, 8:51 a.m.

Explanation

Free PDF editing software might become more and more available, but this method has been working for a long time, and likely will continue to do so.

Limitations

It may not work with all PDFs, for example files that don't conform to Adobe's published PDF specification.


Format text with long lines to text with fixed width

 $ fmt -s -w80 file.txt

— by Janos on Jan. 22, 2012, 10:08 a.m.

Explanation

  • It breaks lines longer than 80 characters at suitable whitespace so that each output line is at most 80 characters long.
  • The meaning of the -s flag depends on your fmt: with BSD fmt it collapses multiple consecutive whitespace characters into one (or a double space at the end of a sentence), while with GNU fmt -s means split-only, i.e. long lines are split but short lines are not joined (GNU fmt uses -u for uniform spacing).


Come back quickly to the current directory after doing some temporary work somewhere else

 $ pushd /some/where/else; work; cd /somewhere; work; cd /another/place; popd

— by Janos on Jan. 15, 2012, 11:12 p.m.

Explanation

  • pushd, popd and dirs are bash builtins; you can read about them with help pushd, help popd and help dirs
  • bash keeps a stack of "remembered" directories, and this stack can be manipulated with the pushd and popd builtins, and displayed with the dirs builtin
  • pushd will put the current directory on top of the directory stack. So, if you need to change to a different directory temporarily and you know that eventually you will want to come back to where you are, it is better to change directory with pushd instead of cd. While working on the temporary task you can change directories with cd several times, and in the end when you want to come back to where you started from, you can simply do popd.
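
A minimal sketch of the stack in action (the directory names here are made up for illustration):

 $ pushd /tmp        # remember the current directory and change to /tmp
 $ dirs              # show the directory stack, e.g. "/tmp ~/project"
 $ cd /var/log       # move around freely while doing the temporary work
 $ popd              # pop the stack and return to where you started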


Export a git project to a directory

 $ git archive master | tar x -C /path/to/dir/to/export

— by Janos on Jan. 12, 2012, 11:04 a.m.

Explanation

The git archive command basically creates a tar file. The one-liner is to create a directory instead, without an intermediate tar file. The tar command above will untar the output of git archive into the directory specified with the -C flag. The directory must exist before you run this command.
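
If the target directory does not exist yet, you can create it on the same line; a small variation of the above, with the same assumptions:

 $ mkdir -p /path/to/dir/to/export && git archive master | tar x -C /path/to/dir/to/export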


Delete all tables of a mysql database

 $ mysql --defaults-file=my.cnf -e 'show tables' | while read t; do mysql --defaults-file=my.cnf -e 'drop table '$t; done

— by Janos on Jan. 8, 2012, 7:53 a.m.

Explanation

If you have root access to the database, a drop database + create database is easiest. This script is useful in situations where you don't have root access to the database.

First prepare a file my.cnf to store the database credentials so you don't have to enter them on the command line:

[client]
database=dbname
user=dbuser
password=dbpass
host=dbhost

Make sure to protect this file with chmod go-rwx.

The one-liner will execute show tables on the database to list all tables. Then the while loop reads each table name line by line and executes a drop table command.

Limitations

The above solution is lazy: the first line in the output of show tables is a column header rather than a table name, so you will see an error for that line when you run it. But hey, shell scripts are meant to be lazy!
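
A variant sketch that avoids those errors, assuming your mysql client supports the -N (--skip-column-names) flag to suppress the header line:

 $ mysql --defaults-file=my.cnf -N -e 'show tables' | while read t; do mysql --defaults-file=my.cnf -e "drop table $t"; done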


Run remote X11 applications with ssh

 $ ssh -X servername

— by versorge on Jan. 5, 2012, 7:50 a.m.

Explanation

You could follow this command with any other call to an X app: xeyes &

Limitations

Works only if X11 forwarding is permitted on the ssh server (X11Forwarding yes in sshd_config), and you need an X server running locally.


Calculate the total disk space used by a list of files or directories

 $ du -s file1 dir1 | awk '{sum += $1} END {print sum}'

— by Janos on Dec. 28, 2011, 8:42 p.m.

Explanation

  • The first column of the du output is the size of the file or directory; awk sums up these values and prints the total at the end.
  • Use du -sk to count in kilobytes, or du -sm to count in megabytes (-m is not available on some systems)
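
On many systems du itself can print a grand total with the -c flag, which may be simpler; a hedged alternative, check your du man page:

 $ du -cs file1 dir1 | tail -n 1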


Concatenate two or more movie files into one using mencoder

 $ mencoder cd1.avi cd2.avi -o movie.avi -ovc copy -oac copy

— by Janos on Dec. 24, 2011, 3:51 p.m.

Explanation

  • You can specify as many files as you want on the command line to process them in sequence.
  • -ovc copy simply means to copy the video exactly
  • -oac copy simply means to copy the audio exactly
  • -o movie.avi is the output file, with all the source files concatenated

Limitations

  • mencoder is usually not a standard package
  • mencoder may or may not be in the same package as mplayer, depending on your distribution
  • mencoder has binary packages for Linux, Mac and Windows

See the MPlayer homepage for more info: http://www.mplayerhq.hu/


Calculate the average execution time (of short running scripts) with awk

 $ for i in {1..10}; do time some_script.sh; done 2>&1 | grep ^real | sed -e s/.*m// | awk '{sum += $1} END {print sum / NR}'

— by Janos on Dec. 21, 2011, 8:50 a.m.

Explanation

  • The for loop runs some_script.sh 10 times, measuring its execution time with time
  • The stderr of the for loop is redirected to stdout, this is to capture the output of time so we can grep it
  • grep ^real is to get only the lines starting with "real" in the output of time
  • sed is to delete the beginning of the line up to minutes part (in the output of time)
  • For each line, awk adds to the sum, so that in the end it can output the average, which is the total sum, divided by the number of input records (= NR)

Limitations

The snippet assumes that the running time of some_script.sh is less than 1 minute; otherwise the minutes part stripped by sed is silently lost and the result is wrong. Depending on your system, the output format of the time builtin might also differ. Another alternative is to use the external /usr/bin/time command instead of the bash builtin.
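
A sketch that sidesteps both limitations by asking bash's time builtin to print only the elapsed seconds via the TIMEFORMAT variable (the script's own output is discarded so that only the timings reach awk):

 $ TIMEFORMAT=%R; for i in {1..10}; do time some_script.sh >/dev/null 2>&1; done 2>&1 | awk '{sum += $1} END {print sum / NR}'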


Check the performance of a script by re-running many times while measuring the running time

 $ for i in {1..10}; do time curl http://localhost:8000 >/dev/null; done 2>&1 | grep real

— by Janos on Dec. 17, 2011, 1:49 a.m.

Explanation

  • {1..10} creates a sequence from 1 to 10, for running the main script 10 times
  • 2>&1 redirects stderr to stdout, this is necessary to capture the "output" of the time builtin


A convenient way to re-run the previous command with sudo

 $ sudo !!

— by Janos on Dec. 14, 2011, 11:26 p.m.

Explanation

!! (bang bang!) is replaced with the previous command.

You can read more about it and other history expansion commands in man bash in the Event Designators section.


Put an ssh session in the background

 $ ~^z

— by Janos on Dec. 9, 2011, 7:44 p.m.

Explanation

  • Normally, ^z (read: ctrl-z) pauses the execution of the current foreground task. That doesn't work in an ssh session, because it is intercepted by the remote shell. ~^z is a special escape character for this case, to pause the ssh session and drop you back to the local shell.
  • For all escape characters see ~?
  • The ~ escape character must always follow a newline to be interpreted as special.
  • See man ssh for more details, search for ESCAPE CHARACTERS
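
A brief sketch of the round trip: press Enter, then type ~ followed by ctrl-z to suspend the session, do your local work, and resume with the shell's job control:

 $ jobs              # the suspended ssh session shows up as a stopped job
 $ fg                # bring the ssh session back to the foreground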


Rotate a movie file with mencoder

 $ mencoder video.avi -o rotated-right.avi -oac copy -ovc lavc -vf rotate=1

— by Janos on Dec. 2, 2011, 10:30 p.m.

Explanation

mencoder is part of mplayer.

Other possible values of the rotate parameter:

  • 0: Rotate by 90 degrees clockwise and flip (default).
  • 1: Rotate by 90 degrees clockwise.
  • 2: Rotate by 90 degrees counterclockwise.
  • 3: Rotate by 90 degrees counterclockwise and flip.


Recursively remove all empty sub-directories from a directory tree

 $ find . -type d | tac | xargs rmdir 2>/dev/null

— by Janos on Nov. 29, 2011, 8:01 p.m.

Explanation

  • find will output all the directories
  • tac reverses the ordering of the lines, so "leaf" directories come first
  • The reordering is important, because rmdir removes only empty directories
  • We redirect error messages (about the non-empty directories) to /dev/null

Limitations

On BSD and other UNIX systems you might not have tac; you can try the less intuitive tail -r instead.
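
Another way to get the "leaf" directories first is find's -depth flag, which processes a directory's contents before the directory itself, so no reversing is needed; a sketch that should behave like the one-liner above:

 $ find . -depth -type d | xargs rmdir 2>/dev/null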


Remove all the versioned-but-empty directories from a Subversion checkout

 $ find . -name .svn -type d | while read ss; do dir=$(dirname "$ss"); test $(ls -a "$dir" | wc -l) == 3 && echo "svn rm \"$dir\""; done

— by Janos on Nov. 27, 2011, 8:38 a.m.

Explanation

Empty directories in version control stink. Most probably they shouldn't be there. Such directories have a single subdirectory in them named ".svn", and no other files or subdirectories.

  • The find command searches for entries named .svn that are directories
  • The "while" assigns each line in the input to the variable ss
  • The "dirname" gets the parent directory of a path, the quotes are necessary for paths with spaces
  • ls -a should output 3 lines if the directory is in fact empty: ".", "..", and ".svn"
  • If the test is true and there are precisely 3 files in the directory, echo what we want to do
  • If the output of the one-liner looks good, pipe it to | sh to really execute


Create a sequence of integer numbers

 $ echo {4..-9}

— by Janos on Nov. 24, 2011, 10:07 p.m.

Explanation

  • Useful for counters. For example, to do something 10 times, you could do for i in {1..10}; do something; done
  • Be careful: there cannot be spaces inside the braces
  • As the example shows, it can count backwards too

Limitations

Does not work in /bin/sh, this is bash specific.
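
In plain /bin/sh you can produce a similar sequence with the seq command where it is available (seq is not POSIX, but present on most Linux systems):

 $ seq 4 -1 -9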


Redirect the output of the time builtin command

 $ { time command; } > out.out 2> time+err.out

— by Janos on Nov. 20, 2011, 8:34 p.m.

Explanation

  • time is a bash builtin command, and redirecting its output does not work the same way as with proper executables
  • If you execute within braces like above, the output of time will go to stderr (standard error), so you can capture it with 2>time.out
  • An alternative is to use the /usr/bin/time executable, by referring to its full path. (The path may be different depending on your system.)
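
Note that in the example above, time's report and the command's own error output end up in the same file. To keep them apart, redirect the command's streams inside the braces and time's output outside; a sketch:

 $ { time command > out.out 2> err.out; } 2> time.out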


Copy a directory with a large number of files to another server

 $ tar cp -C /path/to/dir . | ssh server2 'tar x -C /path/to/target'

— by Janos on Nov. 17, 2011, 12:19 p.m.

Explanation

With a large number of files, scp or rsync can take very very long. It's much faster to tar up on one side and extract on the other. Without the -f flag tar writes output to standard output and expects input from standard input, so piping to ssh can work this way, without creating any intermediary files.

You may (or may not) gain an extra speed boost by compression, either with the z flag for tar, or with the -C flag for ssh, or with gzip pipes in the middle, like this:

tar cp -C /path/to/dir . | gzip | ssh server2 'gzip -cd | tar x -C /path/to/target'

Limitations

Depending on your system and version of tar, you may need to hyphenate the flags, for example tar -cp, and tar -x. The -C flag might also not work, but that shouldn't be too difficult to work around.


Redirect the output of multiple commands

 $ { cmd1 ; cmd2 ; cmd3 ; } > out.out 2> err.out

— by Janos on Nov. 14, 2011, 10:08 a.m.

Explanation

  • Curly braces are very helpful for grouping several commands together
  • Be careful with the syntax: (1) there must be whitespace after the opening brace, and (2) there must be a semicolon after the last command, before the closing brace
  • Another practical use case: test something || { echo message; exit 1; }


View a file with line numbers

 $ grep -n ^ /path/to/file | less

— by Janos on Nov. 9, 2011, 11:05 p.m.

Explanation

  • grep ^ will match all lines in a file
  • grep -n will prefix each line of output with the line number within its input file

Limitations

In some systems you might have to use egrep instead of grep.
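
If the goal is simply to view a file with line numbers, less can also number lines by itself where supported, and cat -n works too; hedged alternatives:

 $ less -N /path/to/file
 $ cat -n /path/to/file | less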


Print the n-th and m-th line of a file

 $ sed -ne '101 p' -e '106 p' /path/to/the/file

— by Janos on Nov. 6, 2011, 11:20 p.m.

Explanation

  • The above command will print the 101st and 106th lines of the specified file.
  • The -n switch will make sed not print all lines by default.
  • The -e switch specifies a sed command; you can use it multiple times in one invocation.

Some sed command examples:

  • 45 p - print line #45
  • 34,55 p - print lines #34-#55
  • 99,$ p - print lines #99-end of the file
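
When you only need lines near the beginning of a large file, you can also tell sed to quit right after printing, so it does not read the rest of the file; a sketch:

 $ sed -n '106{p;q;}' /path/to/the/file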


Repeat the previous command but with a string replacement

 $ ^geomtry^geometry

— by Janos on Nov. 4, 2011, 7:10 a.m.

Explanation

This can be very useful for example after a mistyped command like this:

convert -crop 745x845+0+150 my_logo.png -geomtry 400x my_logo2.png
^geomtry^geometry

"-geomtry" should have been "-geometry", the one-liner above will re-run the command with a replacement that fixes the typo.


Rename all files in the current directory by capitalizing the first letter of every word in the filenames

 $ ls | perl -ne 'chomp; $f=$_; tr/A-Z/a-z/; s/(?<![.'"'"'])\b\w/\u$&/g; print qq{mv "$f" "$_"\n}'

— by Janos on Nov. 1, 2011, 12:51 p.m.

Explanation

  • When you pipe something to perl -ne, each input line is substituted into the $_ variable. The chomp, tr///, s/// perl functions in the above command all operate on the $_ variable by default.
  • The tr/A-Z/a-z/ will convert all letters to lowercase.
  • The regular expression pattern (?<![.'])\b\w matches any word character that follows a non-word character except a dot or a single quote.
  • The messy-looking '"'"' in the middle of the regex pattern is not a typo, but necessary for inserting a single quote into the pattern. (The first single quote closes the single quote that started the perl command, followed by a single quote enclosed within double quotes, followed by another single quote to continue the perl command.) We could have used double quotes to enclose the perl command, but then we would have to escape all the dollar signs which would make everything less readable.
  • In the replacement string $& is the letter that was matched, and by putting \u in front it will be converted to uppercase.
  • qq{} in perl works like double quotes, and can make things easier to read, like in this case above when we want to include double quotes within the string to be quoted.
  • After the conversions we print a correctly escaped mv command. If the printed commands look right, pipe the whole thing to sh (append | sh) to actually perform the renames (see the example below).
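
For example, with a hypothetical file in the current directory, the printed command might look like this, ready to be piped to sh:

 mv "summer holiday.jpg" "Summer Holiday.jpg"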

Limitations

The above command will not work for files with double quotes in the name, and possibly other corner cases.


Do not save command history of current bash session

 $ HISTFILE=

— by Janos on Oct. 29, 2011, 2:41 p.m.

Explanation

The command history of the current bash session is saved to $HISTFILE when you exit the shell. If this variable is blank, the command history will not be saved.


Use rsync instead of cp to get a progress indicator when copying large files

 $ rsync --progress largefile.gz somewhere/else/

— by Janos on Oct. 19, 2011, 12:48 a.m.

Explanation

Although rsync is famous for synchronizing files across machines, it works just as well locally on the same machine. The cp command has no progress indicator, which can be annoying when copying large files, but rsync does, so there you go.

Limitations

When copying directories, be careful: the meaning of a trailing slash on the source directory is slightly different from cp (see the sketch below).
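
A sketch of that difference (paths made up for illustration): with a trailing slash rsync copies the contents of the source directory, without it rsync copies the directory itself:

 $ rsync -a --progress somedir/ backup/    # results in backup/file1, backup/file2, ...
 $ rsync -a --progress somedir backup/     # results in backup/somedir/file1, ...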