
Introduction

woxxy / MySQL-backup-to-Amazon-S3

(This is not really an application, just a manual and some lines of code)

Amazon S3 is a remarkably safe and cheap way to store your important data. Some of the most important data in the world lives in MySQL, and mine certainly feels important, so I needed a script like this.

If you have a 500 MB database (ten times larger than most small sites'), keeping 6 backups (two monthly, two weekly, two daily) on the priciest plan costs $0.42 a month ($0.14/GB/month), with 99.999999999% durability and 99.99% availability. Uploads are free, and downloads only happen if you actually need to retrieve a backup (hopefully never; even then the first GB is free, and it's $0.12/GB after that).

Even better: you get one free year with up to 5 GB of storage and 15 GB of download. And if you don't need all that durability, you can later switch to the cheaper reduced-redundancy plan and pay $0.093/GB/month.
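The monthly figure above is simple arithmetic: six copies of a 500 MB dump is 3 GB, billed at $0.14/GB/month.

```shell
# 0.5 GB per backup, 6 backups kept, $0.14 per GB per month
awk 'BEGIN { printf "$%.2f\n", 0.5 * 6 * 0.14 }'
# prints $0.42
```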

The cons: you need to give them your credit card number. If you're like me, Amazon already has it anyway.

Another really nice thing: HTTPS connections and GPG encryption through s3cmd. Theoretically it's safe enough.

Setup

  1. Register for Amazon AWS (yes, it asks for credit card)

  2. Install s3cmd (following commands are for debian/ubuntu, but you can find how-to for other Linux distributions on s3tools.org/repositories)

     wget -O- -q http://s3tools.org/repo/deb-all/stable/s3tools.key | sudo apt-key add -
     sudo wget -O/etc/apt/sources.list.d/s3tools.list http://s3tools.org/repo/deb-all/stable/s3tools.list
     sudo apt-get update && sudo apt-get install s3cmd
    
  3. Get your access key and secret access key from your AWS account's Security Credentials page

  4. Configure s3cmd to work with your account

     s3cmd --configure
    
  5. Make a bucket (the name must be globally unique; s3cmd will tell you if it's already taken)

     s3cmd mb s3://my-database-backups
    
  6. Put the mysqltos3.sh file somewhere on your server, such as /home/youruser

  7. Make the file executable with chmod 755 /home/youruser/mysqltos3.sh (or set permissions via your FTP client)

  8. Edit the variables near the top of the mysqltos3.sh file to match your bucket and MySQL authentication
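The exact variable names depend on the version of mysqltos3.sh you downloaded, but the block you edit in step 8 looks roughly like this (every value here is a placeholder, not a working credential):

```shell
# placeholder values -- replace with your own MySQL credentials and bucket name
MYSQLROOT=root
MYSQLPASS=yourpassword
S3BUCKET=my-database-backups
FILENAME=databases
DATABASE='--all-databases'
```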

Now we're set. You can use it manually:

# set a new daily backup, and store the previous day as "previous_day"
sh /home/youruser/mysqltos3.sh day

# set a new weekly backup, and store the previous week as "previous_week"
sh /home/youruser/mysqltos3.sh week

# set a new monthly backup, and store the previous month as "previous_month"
sh /home/youruser/mysqltos3.sh month

But we don't want to think about this until something breaks! So run crontab -e and insert the following after editing the paths. Beware that when both the day-of-month and day-of-week fields are restricted, cron runs the job when either one matches, so the daily and weekly lines below can fire on the same day; the auto variant further down avoids this.

# daily MySQL backup to S3 (not on first day of month or sundays)
0 3 2-31 * 1-6 sh /home/youruser/mysqltos3.sh day
# weekly MySQL backup to S3 (on sundays, but not the first day of the month)
0 3 2-31 * 0 sh /home/youruser/mysqltos3.sh week
# monthly MySQL backup to S3
0 3 1 * * sh /home/youruser/mysqltos3.sh month

Or, if you'd rather have the script determine the current date and day of the week itself, insert the following instead after editing the paths:

# automatic daily / weekly / monthly backup to S3.
0 3 * * * sh /home/youruser/mysqltos3.sh auto
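The auto mode presumably picks the period from today's date. A minimal sketch of that dispatch logic (an illustration, not the script's actual code) looks like:

```shell
#!/bin/sh
# pick a backup period from today's date:
# 1st of the month -> month, Sunday -> week, anything else -> day
if [ "$(date +%d)" = "01" ]; then
  PERIOD=month
elif [ "$(date +%u)" = "7" ]; then
  PERIOD=week
else
  PERIOD=day
fi
echo "$PERIOD"
```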

And you're set.

Troubleshooting

None yet.

Contributors

camelmasa, elbart, sashazykov, snypez, wootroot


Issues

S3 bucket being filled with small debug files not in mysql_backup folder

Is there a way of deleting those files after backup is completed?

An example line from one of these files:
d470328cfe947e583c786ac96870cbd backup [31/Mar/2013:01:19:25 +0000] 10.36.175.32 3272ee08a76771012BADB7E10721E6 REST.PUT.OBJECT 2013-03-31-01-19-25-B645228B9AC6A250 "PUT TTP/1.1" 200 - - 385 28 9 "-" "Jakarta Commons-HttpClient/3.0" -
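That line has the shape of an S3 server access log record, which suggests access logging is enabled on the bucket with the bucket itself as the log target. If that's the case, s3cmd can show and disable it (sketch only; verify the logging status on your own bucket first):

```shell
# show the bucket's current access-logging configuration
s3cmd accesslog s3://my-database-backups

# disable access logging so the bucket stops accumulating log records
s3cmd accesslog --no-access-logging s3://my-database-backups
```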

Daily and weekly backups running at the same time

After a little troubleshooting I found out why my daily and weekly backups were running at the same time. Apparently, if you restrict both "day of month" and "day of week", cron executes the command when either condition is met.

I found this in the crontab man page:

Note: The day of a command's execution can be specified by two fields -- day of month, and day of week. If both fields are restricted (ie, are not *), the command will be run when either field matches the current time.
For example, "30 4 1,15 * 5" would cause a command to be run at 4:30 am on the 1st and 15th of each month, plus every Friday.
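If you want a true AND of the two fields, leave day-of-week unrestricted in one of them and test the date inside the command instead (note that % must be escaped as \% in crontab entries):

```shell
# weekly backup: Sundays only, skipping a Sunday that falls on the 1st
0 3 * * 0 [ "$(date +\%d)" != "01" ] && sh /home/youruser/mysqltos3.sh week
```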

error moving backup

On the line that moves the backup from day to previous_day I get the errors below. The result is that the old backup is moved to /previous_day, but it also still shows up in /day. Apparently the delete part of the mv command fails.

Moving the backup from past day to another folder...
WARNING: Retrying failed request: /sql/previous_day/databases.03.26.2013.tar.gz (timed out)
WARNING: Waiting 3 sec...
WARNING: Retrying failed request: /sql/previous_day/databases.03.26.2013.tar.gz (timed out)
WARNING: Waiting 6 sec...
WARNING: Retrying failed request: /sql/previous_day/databases.03.26.2013.tar.gz (timed out)
WARNING: Waiting 9 sec...
WARNING: Retrying failed request: /sql/previous_day/databases.03.26.2013.tar.gz (timed out)
WARNING: Waiting 12 sec...
WARNING: Retrying failed request: /sql/previous_day/databases.03.26.2013.tar.gz (timed out)
WARNING: Waiting 15 sec...

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
An unexpected error has occurred.
Please report the following lines to:
[email protected]
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

Problem: S3RequestError: Request failed for: /sql/previous_day/databases.03.26.2013.tar.gz
S3cmd: 1.0.0

Traceback (most recent call last):
File "/usr/bin/s3cmd", line 2006, in
main()
File "/usr/bin/s3cmd", line 1950, in main
cmd_func(args)
File "/usr/bin/s3cmd", line 618, in cmd_mv
subcmd_cp_mv(args, s3.object_move, "move", "File %(src)s moved to %(dst)s")
File "/usr/bin/s3cmd", line 607, in subcmd_cp_mv
response = process_fce(src_uri, dst_uri, extra_headers)
File "/usr/share/s3cmd/S3/S3.py", line 315, in object_move
response_copy = self.object_copy(src_uri, dst_uri, extra_headers)
File "/usr/share/s3cmd/S3/S3.py", line 311, in object_copy
response = self.send_request(request)
File "/usr/share/s3cmd/S3/S3.py", line 487, in send_request
return self.send_request(request, body, retries - 1)
File "/usr/share/s3cmd/S3/S3.py", line 487, in send_request
return self.send_request(request, body, retries - 1)
File "/usr/share/s3cmd/S3/S3.py", line 487, in send_request
return self.send_request(request, body, retries - 1)
File "/usr/share/s3cmd/S3/S3.py", line 487, in send_request
return self.send_request(request, body, retries - 1)
File "/usr/share/s3cmd/S3/S3.py", line 487, in send_request
return self.send_request(request, body, retries - 1)
File "/usr/share/s3cmd/S3/S3.py", line 489, in send_request
raise S3RequestError("Request failed for: %s" % resource['uri'])
S3RequestError: Request failed for: /sql/previous_day/databases.03.26.2013.tar.gz

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
An unexpected error has occurred.
Please report the above lines to:
[email protected]
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Past backup moved.

5GB put limit

Just curious if anyone else has hit this and how they are getting around it?
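For context: Amazon caps a single PUT request at 5 GB, and larger objects have to go up as multipart uploads. Recent s3cmd releases handle multipart automatically with a tunable chunk size, so upgrading s3cmd is the usual way around this (command is a sketch; the path is a placeholder):

```shell
# multipart upload in 100 MB chunks (requires an s3cmd release with multipart support)
s3cmd put --multipart-chunk-size-mb=100 /path/to/databases.tar.gz s3://my-database-backups/day/
```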

the program cannot upload to S3

I get this error when I try using s3cmd

mimoo@server:~$ s3cmd mb s3://my-database-backups
ERROR: Bucket 'my-database-backups' already exists
ERROR: S3 error: 409 (BucketAlreadyExists): The requested bucket name is not available. The bucket namespace is shared by all users of the system. Please select a different name and try again.

And the program gives me this error:

ERROR: S3 error: 400 (InvalidRequest): The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.
Old backup removed.
Moving the backup from past week to another folder...
ERROR: S3 error: 400 (InvalidRequest): The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.
Past backup moved.
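That 400 usually means the bucket lives in a newer AWS region that only accepts Signature Version 4 requests, which s3cmd 1.0.0 predates. Upgrading to a current s3cmd release (it's published on PyPI) is the usual fix:

```shell
# check what's installed, then upgrade to a SigV4-capable release
s3cmd --version
sudo pip install --upgrade s3cmd
```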

Ability to define mysql host

I saw that it currently uses mysqldump to log in and create the dump. I would also need the option to define the MySQL host.
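A sketch of that change: add a host variable next to the existing credentials and pass it to mysqldump with -h (the variable names here are illustrative; the script's actual names may differ):

```shell
# illustrative variable -- the host of the MySQL server to dump
MYSQLHOST=db.example.com

# pass the host through to mysqldump alongside the user and password
mysqldump --quick -h "$MYSQLHOST" -u "$MYSQLROOT" -p"$MYSQLPASS" --all-databases > databases.sql
```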

Error in upload: Broken Pipe

I configured everything and I think it's all fine, but I get an error when the system tries to upload to S3:

WARNING: Upload failed: /day/mysqldbBackup.12.18.2013.tar.gz ([Errno 32] Broken pipe)
WARNING: Retrying on lower speed (throttle=0.25)

I got this error several times, until finally:

ERROR: Upload of '/root/mysqldbBackup.12.18.2013.tar.gz' failed too many times. Skipping that file.

Do I have to change something in the bucket configuration?

Thanks in advance,

Dimitri
