Back Up Your Ghost Data in AWS S3

Shashikant Dwivedi · Published in SKDBLOG · May 11, 2020


Hello everyone! In this article, I am going to discuss how you can back up your Ghost blog data and save it in your AWS S3 bucket.

Steps

So, to perform the backup, we are going to follow these steps:

  1. Creating a backup bucket.
  2. Creating and assigning the IAM role for the Ghost server.
  3. Configuring the Ghost server (EC2 instance).

So let’s begin with the first step.

Creating a backup bucket

To create a backup bucket, open the AWS S3 console and click the Create bucket button.

Now enter the bucket name you want; note that bucket names must be globally unique.

Leave all the other settings at their defaults and click the Create bucket button.

Once you have finished creating your AWS S3 bucket, move on to the next step.
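If you prefer the command line, the bucket can also be created with the AWS CLI; a minimal sketch, where the region is an assumption:

aws s3 mb s3://<backup-bucket-name> --region us-east-1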

Creating and Assigning the IAM Role for the Ghost Server

Open IAM in your AWS console and click on Roles.

Then click on Create Role to create a new role.

On the role screen, select EC2 as the use case and click the Permissions button.

On the policy screen, search for S3, select “AmazonS3FullAccess”, and click the Tags button.

Adding tags is not necessary; you can leave them empty and move to the next screen.

On the next screen, give the role a name and a description, and press the Create Role button to create it.

Once you have created the IAM role, assign it to the EC2 instance on which the Ghost blog is hosted.

Open the EC2 instance dashboard, select the EC2 instance, click Actions, then select Instance Settings and choose Attach/Replace IAM Role.

Now select the IAM role we created and click Apply.

We have now set up AWS as per our requirements, so it is time to configure our EC2 instance to make backups and send them to AWS S3.
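For reference, the whole role setup can also be scripted with the AWS CLI. This is a minimal sketch, assuming hypothetical names ghost-backup-role and ghost-backup-profile (unlike the console, the CLI needs an explicit instance profile):

aws iam create-role --role-name ghost-backup-role \
    --assume-role-policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"Service":"ec2.amazonaws.com"},"Action":"sts:AssumeRole"}]}'
aws iam attach-role-policy --role-name ghost-backup-role \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam create-instance-profile --instance-profile-name ghost-backup-profile
aws iam add-role-to-instance-profile --instance-profile-name ghost-backup-profile \
    --role-name ghost-backup-role
aws ec2 associate-iam-instance-profile --instance-id <instance-id> \
    --iam-instance-profile Name=ghost-backup-profile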

Configuring EC2 Instance

To configure your EC2 instance, first connect to it and create a new file for the backup script. You can name it as per your choice.

Now enter the following code in the file.
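This is a minimal version of the backup script, assembled from the line-by-line explanation below; the exact paths are assumptions, so adjust every piece of <placeholder> text to your own setup.

#!/bin/sh
# move into the Ghost installation directory
cd <path-to-ghost-directory>
# copy the content folder (images and other blog data) to the backup folder
cp -rp content/ /home/<username>/<backup-folder>/
# dump the Ghost database into the backup folder
mysqldump -u root <database-name> > /home/<username>/<backup-folder>/<database-name>.sql
# compress the backup folder from the home directory
cd ~
tar -zcvpf blog_backup.tar.gz /home/<username>/<backup-folder>
# encrypt the archive with a passphrase
sudo gpg -c --batch --passphrase "<password>" blog_backup.tar.gz
# upload the encrypted archive to the S3 bucket
aws s3 cp blog_backup.tar.gz.gpg s3://<backup-bucket-name>/
# remove the intermediate files and empty the backup folder
rm blog_backup.tar.gz
sudo rm blog_backup.tar.gz.gpg
cd /home/<username>/<backup-folder>/
rm -rf *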

Give execute permission to the backup script.

chmod +x <script-name>.sh

And be sure to change all the text enclosed in angle brackets, like <any-text>, as per your file and folder structure.

Explanation (You can skip it)

If you want to know how the script works, continue reading; otherwise, you can skip the explanation part.

In this section, I am going to explain each line in the script.

Code -

#!/bin/sh

Explanation -
This is called a shebang; it tells the shell which program to interpret the script with when it is executed. In our case, it will execute the file using sh, the Bourne shell.

Code -

cd <path-to-ghost-directory>

Explanation -
This command will change the directory to your Ghost installation directory.

Code -

cp -rp content/ /home/<username>/<backup-folder>/

Explanation -
This command will copy all the contents of the content folder in your Ghost directory to the backup folder. The content folder contains all the images and other data of your Ghost blog.

Code -

mysqldump -u root <database-name> > /home/<username>/<backup-folder>/<database-name>.sql

Explanation -
This command will back up the database of our Ghost blog into the backup folder.
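Should you ever need to restore from this dump, you could feed it back into MySQL; a sketch, assuming the target database already exists:

mysql -u root <database-name> < /home/<username>/<backup-folder>/<database-name>.sql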

Code -

cd ~ 
tar -zcvpf blog_backup.tar.gz /home/<username>/<backup-folder>

Explanation -
The first command will take you to the home directory of the logged-in user, and the next command will compress all the data in your backup directory into a single archive.

Code -

sudo gpg -c --batch --passphrase "<password>" blog_backup.tar.gz

Explanation -
This command will encrypt the compressed backup file using the password provided.
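To restore later, the archive can be decrypted with the same passphrase and then extracted; a sketch (on newer GnuPG releases you may also need the --pinentry-mode loopback flag for a non-interactive passphrase):

gpg -d --batch --passphrase "<password>" blog_backup.tar.gz.gpg > blog_backup.tar.gz
tar -zxvf blog_backup.tar.gz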

Code -

aws s3 cp blog_backup.tar.gz.gpg s3://<backup-bucket-name>/

Explanation -
This command will upload the encrypted, compressed backup file to the specified AWS S3 bucket.

Code -

rm blog_backup.tar.gz
sudo rm blog_backup.tar.gz.gpg
cd /home/<username>/<backup-folder>/
rm -rf *

Explanation -
All the above commands are used to clean up the files and folders that are no longer required.

MySQL Credentials

To provide MySQL credentials during script execution, create a configuration file with this command (any editor works; nano is assumed here):

nano ~/.my.cnf

This command will create a file named .my.cnf in the home directory. Paste the following code into the file:

[mysqldump]
user=<mysql-user>
password=<your-mysql-user-password>

This file will help our backup script load the password for the mysqldump command.
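Since this file stores a password in plain text, it is good practice to make it readable only by your user:

chmod 600 ~/.my.cnf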

Connecting EC2 and S3

To connect AWS EC2 and S3, you have to install awscli on your server.
Use this command to install it.

sudo apt-get install awscli

Since we have given this EC2 instance a role, we don't need to configure the AWS CLI with access keys; we can use it directly without any further configuration.
To test it, use this command:

aws s3 ls

This command will list all your buckets in AWS S3.

Testing backup script

You can test the backup script by simply running it.
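For example, assuming the script sits in your home directory and was made executable in the earlier chmod step:

cd ~
./<script-name>.sh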

After running the script, you can open AWS S3 and check the bucket for the backup file.

Scheduling the backup task

To run the script again and again after a set interval (that is, to schedule it) so backups happen regularly, we will use crontab.
Open your crontab with this command:

crontab -e

Add the following line at the end of the opened file.

0 0 * * * /bin/bash /home/<username>/<filename>.sh

This entry tells cron to run the script every day at midnight (00:00), according to the server time.
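The five fields before the command are minute, hour, day of month, month, and day of week. As an illustration, a weekly backup every Sunday at 3 AM would look like this:

0 3 * * 0 /bin/bash /home/<username>/<filename>.sh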

To learn more about scheduling in Linux, you can go through this article.

I hope you have accomplished your task successfully using this article.

And if you encountered any problems, or if you have any questions, you can ask me in the comments section.


Shashikant Dwivedi
SKDBLOG

I am a full-time developer, working for DeviceDoctor.IN. I write articles on topics that I learn daily by doing.