Schedule Database backup in your cloud instance

Suppose you have a cloud instance and you’re worried about losing data. What do you do in that case? Back up your database regularly, right?

But backing up manually every day is tedious.
In this post, I’m going to set up a cron job that backs up the database automatically. Let’s walk through it together.
1. Create a local git repository

We’ll create a folder named backups in /var/www. It’ll be our local git repository for backups.
Create and initialize it with

mkdir /var/www/backups
cd /var/www/backups
git init

2. Create a backup git repository
 
Since our database isn’t very big yet, we’ll use a git repository to back up the database dump for now.
Create a git repository on your preferred git server. Suppose we create a repository named DatabaseBackupTest and its URL is [email protected]/DatabaseBackupTest.git
cd into your local backup folder and add this URL as a remote.

git remote add origin [email protected]/DatabaseBackupTest.git
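Since cron will later run `git commit` and `git push` outside your login shell, it helps to set a git identity in the repository and make an initial commit before the script ever runs. Here is a minimal sketch of that setup in a throwaway directory; the identity values and the `git@example.com:user/DatabaseBackupTest.git` URL are placeholders, so substitute your own details and run the same commands inside /var/www/backups.

```shell
#!/bin/sh
# Hypothetical first-commit sketch in a throwaway directory.
repo="$(mktemp -d)"
cd "$repo"
git init -q
git config user.email "backup@example.com"   # cron has no login shell,
git config user.name  "Backup Bot"           # so set an identity in the repo itself
git remote add origin "git@example.com:user/DatabaseBackupTest.git"
touch .gitkeep
git add .gitkeep
git commit -q -m "Initial commit"
git rev-list --count HEAD    # prints 1: one commit so far
```

With the branch created and an identity configured, the script’s later `git push origin master` has something to push.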

3. Create a bash script to backup your database

We have to create a bash script to run on a daily or weekly basis that backs up the database and pushes it to our online git repository.
We’ll create this file inside our backup repository, but you can create it anywhere you want.
cd /var/www/backups
nano dbbackup.sh
Okay, now paste the script below and press Ctrl + X to exit. nano will ask whether to save before exiting; press Y, then Enter to confirm the filename.
#!/bin/bash
now="$(date +'%d_%m_%Y_%H_%M_%S')"
filename="db_backup_$now.gz"
backupfolder="/var/www/backups"
fullpathbackupfile="$backupfolder/$filename"
logfile="$backupfolder/backup_log_$(date +'%Y_%m').txt"
echo "mysqldump started at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
mysqldump --user=dbuser --password=dbpass --default-character-set=utf8 dbname | gzip > "$fullpathbackupfile"
echo "mysqldump finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
chown root "$fullpathbackupfile"
chown root "$logfile"
echo "file permission changed" >> "$logfile"
find "$backupfolder" -name 'db_backup_*' -mtime +8 -exec rm {} \;
echo "old files deleted" >> "$logfile"
echo "operation finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
echo "*****************" >> "$logfile"
echo "Committing changes" >> "$logfile"
cd /var/www/backups
git add .
git commit -m "New backup on $(date +'%d-%m-%Y %H:%M:%S')"
echo "Pushing latest commit to the remote repository"
git push origin master
exit 0
Here, the backupfolder variable specifies /var/www/backups as the backup folder; that’s where the compressed SQL file will be saved.
On the line that starts with mysqldump, change the database name, username, and password according to your configuration. That’s the line responsible for the actual database backup.
The find line removes backup files older than eight days so the folder doesn’t grow without bound. Near the end of the script, we change into the backups repository, commit the new dump, and push the commit to the remote repository.
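It’s worth knowing the reverse operation too: a backup made with `mysqldump | gzip` is restored by reversing the pipe. The real command, using the same placeholder credentials as the backup script, would be `gunzip < /var/www/backups/db_backup_DATE.gz | mysql --user=dbuser --password=dbpass dbname`. The sketch below only simulates the dump file so the round trip can be seen end to end without a live database.

```shell
#!/bin/sh
# Hypothetical restore sketch: simulate a gzipped dump, then decompress it.
# In practice you'd pipe gunzip's output into mysql instead of the terminal.
backup=/tmp/db_backup_demo.gz
printf 'CREATE TABLE t (id INT);\n' | gzip > "$backup"
gunzip -c "$backup"   # prints the SQL exactly as mysqldump wrote it
```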
4. Add a cron job
 
Alright, everything is set up except one thing: we now have to create a schedule that executes this script automatically. But before cron can execute the script, you have to make it executable. Run these commands.
cd /var/www/backups
chmod +x dbbackup.sh
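Before wiring the script into cron, it’s worth running it once by hand and checking the log file it writes. In practice that means running /var/www/backups/dbbackup.sh and tailing its log; the sketch below simulates the same check with a stub folder in /tmp, using the same logfile naming pattern as the script.

```shell
#!/bin/sh
# Simulated manual test run; mirrors the script's backup_log_YYYY_MM.txt pattern.
folder=/tmp/backups_demo
mkdir -p "$folder"
logfile="$folder/backup_log_$(date +'%Y_%m').txt"
echo "manual test run at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
tail -n 5 "$logfile"   # show the most recent log entries
```

If the real log shows mysqldump starting and finishing and the push succeeding, cron should have no trouble with it.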
Create a cron job using crontab

crontab -e

Paste this code

00 00 * * * /var/www/backups/dbbackup.sh

This means that the dbbackup.sh file will be executed every day at 12 AM (midnight). You can look up cron time formats and schedule your job for any time you want.
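The first five fields of a crontab entry control the schedule (minute, hour, day of month, month, day of week). If daily is too frequent or not frequent enough, a few example variations on the same script path:

```
0 0 * * 0   /var/www/backups/dbbackup.sh   # weekly, Sunday at midnight
0 3 * * *   /var/www/backups/dbbackup.sh   # daily at 3 AM
0 */6 * * * /var/www/backups/dbbackup.sh   # every six hours
```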
To see existing crontab jobs, use the command

crontab -l
