
Simple shell script to backup multiple mysql databases

2009 February 12
by Eddie

Backing up a MySQL database from a Unix command line is relatively simple.  It is equally easy to automate this task with a shell script and a crontab (cron jobs).  In my case I needed to back up ~15 unique databases, and I despise repetitious code, so here's what I came up with.
Update: added a way to use a different user and password for each database.
** Another update! – better error handling.

 1. The problem

 2. The solution

 3. The Shell Scripts

      3.1. Many MySQL Databases on one host

      3.2. Error Reporting

 4. Using Cron to automate the process

      4.1. The End Result

1. The problem

I have multiple databases that need to be backed up into different archive folders. Each database requires a unique host, username and password.

2. The solution

The bash shell, of course! First off, we should all be familiar with mysqldump.  It sits idle on nearly every 'nix box, just waiting to dump a MySQL database to the screen or to a file, and unless you want your mysqldump binary to be sad you should make full use of it.

The most basic use of the program couldn't be simpler, but it is happy to meet more complex needs as well. For this article we'll keep it simple.
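A minimal invocation looks like this (the database name and credentials here are placeholders, not from any real setup):

```shell
# dump a single database to a plain .sql file (names and credentials are hypothetical)
mysqldump --user=myuser --password=mypass mydatabase > mydatabase.sql
```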


You can of course add additional switches; -c and -e are two I use pretty frequently. See your machine's man pages to learn more.

But wait, that only handles one database on one host with one user and password. True. So here's where we use a script to make this puppy work overtime.

3. The Shell Scripts

3.1. Many MySQL Databases on one host

#!/bin/bash
# Set these values!
# space-separated list of domain names (will be used as part of the output path)
domains=( name4subfolders inSQLfolderabove canbedomains orsomethingelse )
# list of corresponding DB names
sqldbs=( fulldbname1 db2 db3 db4 )
# lists of corresponding usernames and passwords
usernames=( user1 user2 user3 user4 )
passwords=( pass1 pass2 pass3 pass4 )
# directory to save generated sql files in (domain name is appended)
savepath=/home/YOURNAME/sql_dumps
# your mysql host
mysqlhost=localhost
# End of config values

# date to append
suffix=$(date +%Y-%m-%d)
echo "Attempting to run MySQL dump on ${#sqldbs[@]} databases"
# run on each domain
for (( i = 0 ; i < ${#domains[@]} ; i++ ))
do
    # set current output path
    cpath=$savepath/${domains[$i]}
    # check if we need to make the path
    if [ -d "$cpath" ]
    then
        # directory exists, we're good to continue
        filler="just some action to prevent a syntax error"
    else
        # we need to make the directory
        echo "Creating $cpath"
        mkdir -p "$cpath"
    fi
    SQLFILE=$cpath/${sqldbs[$i]}_$suffix.sql.gz
    # now do the backup, sending errors to a file so the pipe cannot hide them
    mysqldump -c -h $mysqlhost --user=${usernames[$i]} --password=${passwords[$i]} ${sqldbs[$i]} 2>error | gzip > $SQLFILE
    if [ -s error ]
    then
        printf "WARNING: An error occurred while attempting to backup %s  \n\tError:\n\t" ${sqldbs[$i]}
        cat error
    else
        printf "%s was backed up successfully to %s\n\n" ${sqldbs[$i]} $SQLFILE
    fi
    rm -f error
done

In either case you’ll want to save it to a file, let’s say…
And then make the script executable

chmod 0750

Now you can test the script by calling it by name


Sweet: a single command will back up any databases we include in the script, and we can call it again and again.

A simple output reports the results:

database1 was backed up successfully to /home/YOURNAME/sql_dumps/
taskfreak_database was backed up successfully to /home/YOURNAME/sql_dumps/
mantis_database was backed up successfully to /home/YOURNAME/sql_dumps/

3.2. Error Reporting 

I recently had to update the error handling because it was letting failures slip by!

That was because mysqldump would fail and throw an error to stderr, but gzip would then come along and happily report that it had successfully compressed nothing!

The new model writes any errors to a file named "error" before everything is piped into gzip and the exit code is lost. We then check for that file and, if it is present, show it to the user (or log it) before deleting the file and moving on.
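The same pattern works for any producer whose exit status a pipe would swallow. Here is a minimal sketch using a deliberately failing ls as a stand-in for mysqldump:

```shell
#!/bin/bash
# stand-in for mysqldump: a command guaranteed to fail and write to stderr
ls /no/such/path 2>error | gzip > out.gz
# gzip exits 0, so the pipeline looks fine; the error file tells the real story
if [ -s error ]
then
    result="WARNING: the dump failed: $(cat error)"
else
    result="backed up successfully"
fi
echo "$result"
rm -f error out.gz
```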

So in a bad scenario I would now see:

Attempting to run MySQL dump on 2 databases

domain1_com was backed up successfully to /home/user/sql_dumps/

WARNING: An error occurred while attempting to backup domain2 
        mysqldump: Got error: 1045: Access denied for user 'domain2_com'@'' (using password: YES) when trying to connect


But wait! We wanted to automate this whole thing right? And so we shall.

4. Using Cron to automate the process

If you're using a webhost, they likely provide a GUI to add cron jobs. If that's the case you can just point it to the full path where you saved the above script, select the interval, and you're good to go.

If you're using this on your own server you'll need to get your hands dirty with a crontab. You can open the crontab file in your editor of choice or call it from the command line. In this example we'll rely on vi, my system's default editor.
Create a crontab file if it does not already exist and open it for edit

crontab -e

You may see some existing lines or you may not. Just remember: one job per line. The layout may seem overwhelming at first, but it's quite simple, and breaks down like this:

min     hour     day    month    weekday     job_to_Run

The values fall in the respective ranges below; for day of week, 0 is Sunday.

0-59    0-23     1-31    1-12     0-6        filename

To match every value in a field, use an asterisk (*). Alternately you may use comma-separated lists. Although I believe cron will treat any whitespace as a delimiter, I use tabs to make the organization a little nicer.

So let's suppose I want to run this job nightly; it is, after all, named DAILY sql backup 🙂
I will add the following line to my crontab:

15    0     *    *   *     $HOME/scripts/ > logfile.log

This means every day at 00:15, a.k.a. 15 minutes past midnight, it will run the script and print any output into the specified logfile.
If you leave off the redirect to a logfile, cron will email the results to the user. To discard all output, use the handy standby

>/dev/null 2>&1
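Putting it together, a full crontab entry with output discarded might look like this (the script path is a placeholder):

```shell
# hypothetical: run nightly at 00:15 and throw away all output
15 0 * * * /home/me/scripts/daily_sql_backup.sh >/dev/null 2>&1
```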


Well, I think that covers it. There are tons of good resources to learn more about any particular topic, but I would be happy to field comments.

4.1. The End Result

After your newly created cron jobs have had the chance to run for a few days you'll end up with a nice, neat directory structure like this:

|   |-- edwardawebb_wordpress_01-13-2009.sql.gz
|   |-- edwardawebb_wordpress_01-14-2009.sql.gz
|   |-- edwardawebb_wordpress_01-15-2009.sql.gz
|   |-- edwardawebb_wordpress_01-16-2009.sql.gz
|   |-- edwardawebb_wordpress_01-17-2009.sql.gz
|   |-- edwardawebb_wordpress_01-18-2009.sql.gz
|   `-- edwardawebb_wordpress_01-19-2009.sql.gz
|   |-- mainsite_mantis_01-13-2009.sql.gz
|   |-- mainsite_mantis_01-14-2009.sql.gz
|   |-- mainsite_mantis_01-15-2009.sql.gz
|   |-- mainsite_mantis_01-16-2009.sql.gz
|   |-- mainsite_mantis_01-17-2009.sql.gz
|   |-- mainsite_mantis_01-18-2009.sql.gz
|   `-- mainsite_mantis_01-19-2009.sql.gz
    |-- mainsite_taskfreak_01-11-2009
    |-- mainsite_taskfreak_01-11-2009.sql.gz
    |-- mainsite_taskfreak_01-12-2009.sql.gz
    |-- mainsite_taskfreak_01-13-2009.sql.gz
    |-- mainsite_taskfreak_01-14-2009.sql.gz
    |-- mainsite_taskfreak_01-15-2009.sql.gz
    |-- mainsite_taskfreak_01-16-2009.sql.gz
    |-- mainsite_taskfreak_01-17-2009.sql.gz
    |-- mainsite_taskfreak_01-18-2009.sql.gz
    `-- mainsite_taskfreak_01-19-2009.sql.gz

Note: this example would generate 3 files each night; after one month that's ~90 files, depending on the month. That's why I also wrote a simple Recycler script to purge all old files. As soon as I draft that article I'll add the link here.

12 Responses
  1. January 16, 2009

    Useful stuff, thanks!

  2. Eddie
    January 18, 2009

    Thank you, I am glad it proved useful.

  3. Tony
    April 21, 2009


    thanks for the post, I get:

    /bin/sh: /var/www/vhosts/ cannot execute binary file

    So it seems the cron command is executing (I had to delete the log part first), but it can't run the file. I'm on Plesk, if that means anything.

    Appreciate your efforts!


  4. Eddie
    April 21, 2009

    Did you add the script specification at the start of the file
    #!/bin/bash
    and mark it executable?
    chmod 0755

  5. Devi
    September 21, 2009

    Really a nice article..

  6. Jacques UWAYO
    July 23, 2010

    Hi, I made a shell script to dump a mysql select result to a csv file.
    The script works very well when I run it manually, but from crontab it doesn't run;
    it just creates an empty csv file.
    Below is the file:

    datetime=`date +"%Y%m%d"`
    mysql -uuser -pmypasswor db1 -e "select subtoken MSISDN,state,expiration_date from comp_subscription,comp_subscription_information order by expiration_date" > $FILE

    Please advise

  7. September 20, 2010

    This script has served me well for the last year or more, but when I mentioned that new databases need to be added by hand a learned friend suggested the following:

    for db in $(mysql -e "show databases" -B --skip-column-names); do
    mysqldump --opt --databases $db | gzip -3 --rsyncable > /var/backups/$db.sql.gz
    done

    of course you may need to append your host, user & password credentials, depending on your environment.

  8. Eddie
    September 20, 2010

    Thanks! That is a handy trick to backup multiple databases on the same host.

    Since I use a few hosts I may tweak that script above to use one master password for each host, and then run through all contained databases. Easy change by wrapping your loop inside a larger loop based on hostnames.
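    A rough sketch of that tweak, assuming one set of credentials works per host (the hostnames and backup path here are hypothetical):

```shell
#!/bin/bash
# hypothetical hosts; assumes the invoking user's credentials work on each
hosts=( db1.example.com db2.example.com )
for h in "${hosts[@]}"
do
    # enumerate every database on this host, then dump each one
    for db in $(mysql -h "$h" -e "show databases" -B --skip-column-names)
    do
        mysqldump --opt -h "$h" --databases "$db" | gzip -3 > /var/backups/${h}_${db}.sql.gz
    done
done
```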

  9. Martijn
    August 30, 2011

    I get the following error: command not found

    Any thoughts?

  10. Eddie
    September 2, 2011

    Are you setting up the cron job yourself, or using a hosting provider?

    The error simply means it can’t find the script on the path, $HOME/scripts/

    There are a few possible causes:
    1) That’s not where you saved the script (try pasting that command directly in the command line, with the full path)
    2) The $HOME variable is not set (replace with full /home/myUserName)
    3) Your host uses a different user to run the cron jobs, and $HOME is invalid (replace with full /home/myUserName)

Trackbacks and Pingbacks

  1. Shell Script to recycle old log files | Edward A. Webb (.com)
  2. Backup all sub-directories with a Bash array loop | Edward A. Webb (.com)
