Amazon S3 Backup is a reliable, fast, and simple solution for keeping data backed up online. S3 achieves high availability by replicating data across multiple servers within Amazon’s data centers, and it offers 99.999999999% durability. Typical S3 use cases include media sharing and distribution, server/PC backup, online storage, and application storage.
This use case shows how we back up our bare git repository to S3 twice a day using tools such as S3cmd and S3 Browser. It is kept intentionally simple to demonstrate S3.
What we want to do:
- Ensure Python is set up
- Install & Configure S3cmd tools
- Download & Configure S3 browser
- Create a sample bare git repository
- Create a script to upload the repository to S3
Ensure Python is set up:
- Many blogs and articles explain how to install Python, and the links are referenced below. For this use case, we use Ubuntu and Python 2.7.5.
Install & Configure S3cmd tools:
- Install S3cmd:
$ sudo apt-get install s3cmd
- Configure S3Cmd:
$ s3cmd --configure

Enter new values or accept defaults in brackets with Enter.
Refer to user manual for detailed description of all options.

Access key and Secret key are your identifiers for Amazon S3
Access Key: [Enter Amazon Access Key]
Secret Key: [Enter Amazon Secret Key]

Encryption password is used to protect your files from reading
by unauthorized persons while in transfer to S3
Encryption password: treselle (Choose your own password)
Path to GPG program [/usr/bin/gpg]:

When using secure HTTPS protocol all communication with Amazon S3
servers is protected from 3rd party eavesdropping. This method is
slower than plain HTTP and can't be used if you're behind a proxy
Use HTTPS protocol [No]: Yes

New settings:
  Access Key: [Your Access Key will appear here]
  Secret Key: [Your Secret Key will appear here]
  Encryption password: treselle
  Path to GPG program: /usr/bin/gpg
  Use HTTPS protocol: True
  HTTP Proxy server name:
  HTTP Proxy server port: 0

Test access with supplied credentials? [Y/n] Y
Please wait...
Success. Your access key and secret key worked fine :-)

Now verifying that encryption works...
Success. Encryption and decryption worked fine :-)

Save settings? [y/N] y
Configuration saved to '/home/treselle/.s3cfg'
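The saved configuration is a plain INI file. A minimal sketch of what /home/treselle/.s3cfg ends up containing after the dialog above (values are placeholders, and exact field names can vary slightly between s3cmd versions):

```ini
[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
gpg_passphrase = treselle
gpg_command = /usr/bin/gpg
use_https = True
proxy_host =
proxy_port = 0
```

Keep this file readable only by your user (e.g. `chmod 600 ~/.s3cfg`), since it holds your AWS credentials in plain text.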
- Verify S3Cmd:
$ s3cmd ls
2014-03-22 03:46  s3://backupdemo
Download & Configure S3 browser:
- Download S3browser:
Download and install S3 Browser from http://s3browser.com/download/s3browser-4-1-1.exe
- Configure S3browser:
Specify the Amazon access key and secret key, along with a unique account name, as shown in the screenshot below.
- Verify S3browser:
The screenshot below shows a sample S3 bucket named “backupdemo”, created for this blog, that doesn’t contain any objects yet.
Create a sample bare git repository:
- Create a bare blessed repository
$ mkdir blessed_repo.git
$ cd blessed_repo.git
$ git init --bare
- Create a sample working directory and push the master to blessed repo
$ mkdir sample_project
$ cd sample_project
$ git init
$ vi sample.txt
$ git add .
$ git commit -m "Initial Project commit"
$ git remote add origin email@example.com:/home/treselle/blessed_repo.git
$ git push origin master
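To confirm the push actually landed in the bare repository, it can be cloned back and inspected. The steps above can be sketched as a self-contained round trip; this uses a temporary directory instead of the /home/treselle paths, and the identity settings are placeholders:

```shell
#!/bin/sh
# Round trip: bare repo -> working copy -> commit -> push -> clone back.
# Runs entirely in a temporary directory so it doesn't touch the real layout.
set -e
work=$(mktemp -d)
cd "$work"

git init --bare blessed_repo.git

mkdir sample_project
cd sample_project
git init
git config user.email "you@example.com"   # placeholder identity
git config user.name "Your Name"
echo "hello" > sample.txt
git add .
git commit -m "Initial Project commit"
git branch -M master                      # ensure the branch is named master
git remote add origin "$work/blessed_repo.git"
git push origin master

# Cloning the bare repo back should recover sample.txt
cd "$work"
git clone blessed_repo.git verify_clone
ls verify_clone/sample.txt
```

If the final `ls` prints the path, the bare repository really contains the commit and is safe to archive.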
Create a script to upload the repository to S3:
- Backup Script:
code_snapshot.sh

#!/bin/sh
filename=blessed_repo.tgz
# -f so the script doesn't fail on the first run, when no previous backup exists
rm -f /home/treselle/code_backup/$filename
tar czf /home/treselle/code_backup/$filename blessed_repo.git
echo 'Done Compressing'
# upload from the same path the archive was written to
s3cmd put /home/treselle/code_backup/$filename s3://backupdemo
echo 'Done Upload to S3 Bucket'
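One possible refinement, not in the original script, is a timestamped archive name so each run keeps its own backup instead of overwriting the last one. A minimal, locally runnable sketch of just the compress step (the directory names here are stand-ins, and the s3cmd upload line would stay as above):

```shell
#!/bin/sh
# Compress the repository into a dated tarball (the local half of the backup).
set -e
backup_dir=$(mktemp -d)            # stand-in for /home/treselle/code_backup
mkdir -p blessed_repo.git          # stand-in for the real bare repository
filename="blessed_repo_$(date +%Y%m%d_%H%M).tgz"
tar czf "$backup_dir/$filename" blessed_repo.git
tar tzf "$backup_dir/$filename"    # list the archive to confirm it packed
```

Listing the archive with `tar tzf` before uploading is a cheap sanity check that the backup actually contains the repository.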
- Execute the script to upload the repo to S3:
The blessed repo is uploaded in compressed form to the S3 bucket “backupdemo”.
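The post describes running the backup twice a day; one common way to schedule that is cron. A sketch of a crontab entry, assuming the script lives at /home/treselle/code_snapshot.sh (the log location is also an assumption):

```
# Run the backup script at 00:00 and 12:00 every day
0 0,12 * * * /home/treselle/code_snapshot.sh >> /home/treselle/code_backup/backup.log 2>&1
```

Add it with `crontab -e`; redirecting output to a log file makes failed uploads easy to spot later.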
- Amazon S3 is a popular cloud storage solution that can be used to back up any type of file, and it comes with features such as bucket Access Control Lists, lifecycle configuration, and server-side encryption.
- Tools like S3cmd and S3 Browser come in handy for automating and verifying many S3 activities, as demonstrated above.