Automated MySQL Datestamp Backup using CRON via shell
To automatically make a backup of your database using *nix cron: Requires: shell access, the ability to run bash, and the ability to add scripts to one of the cron.* directories (cron.daily, cron.weekly, etc.). Create a backup.sh script containing the following: Code:
#!/bin/bash
# Automated database datestamp backup
mysqldump --opt -Q -u dbusername -pPassword dbname > /path/to/backups/`date --iso-8601`.sql
Fill in dbusername, Password, and dbname with the proper information. Note that there is no space between -p and the password; that's intentional. Then replace "/path/to/backups/" with the actual path you want the dumps written to, and make sure that directory exists. Put the backup.sh file in the appropriate cron folder; I'm running mine weekly, so mine is in /etc/cron.weekly/. Make it executable (chmod +x backup.sh). The output is a database dump named (the date).sql. Putting it in cron.weekly will run it every Sunday night at Midnight, and give you a file that looks like this: Code:
[cron.weekly]# sh backup.sh
[cron.weekly]# ls /home/backups/
2006-01-13.sql
[cron.weekly]#

An example backups path would be: /var/www/vhosts/yourdomain.tld/httpdocs/backups

Ideally this would work well in conjunction with a script on a local box (assuming your site is hosted remotely) that could shell in and download the backups automatically as well; there's a rough sketch of that at the end of this post. I'll try and update this with exact instructions on how to do that if I can. Information on automating the SSH transfer process in general can be found here and is pretty thorough, but I haven't tested it yet.

Combining multiple backups into one single cron script: If your other databases are all accessible from the same shell prompt and user, and the user you're running cron as has permission to run mysqldump, you can do this one of two ways. The first is to put them all in different folders with the same filename, like so. Make sure each target folder exists; it might bark at you if it doesn't. Basically, just run the dump command however many times you need to, with the respective names/passwords on each line. I put in a sleep 5 just to give a small pause between operations; all it does is tell the OS to pause for 5 seconds before running the next command. It's probably not necessary, but MySQL might become unhappy if you run one command directly after another - it just gives your CPU/memory a chance to cycle if necessary. Code:
#!/bin/bash
# Automated database datestamp backup
mysqldump --opt -Q -u dbusername1 -pPassword1 dbname1 > /path/to/backups/1/`date --iso-8601`.sql
sleep 5
mysqldump --opt -Q -u dbusername2 -pPassword2 dbname2 > /path/to/backups/2/`date --iso-8601`.sql
sleep 5
mysqldump --opt -Q -u dbusername3 -pPassword3 dbname3 > /path/to/backups/3/`date --iso-8601`.sql

The second way is to dump everything into the same folder and give each file a descriptive prefix instead: Code:
#!/bin/bash
# Automated database datestamp backup
mysqldump --opt -Q -u dbusername1 -pPassword1 dbname1 > /path/to/backups/vbulletin_`date --iso-8601`.sql
sleep 5
mysqldump --opt -Q -u dbusername2 -pPassword2 dbname2 > /path/to/backups/wiki_`date --iso-8601`.sql
sleep 5
mysqldump --opt -Q -u dbusername3 -pPassword3 dbname3 > /path/to/backups/photopost_`date --iso-8601`.sql
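As a rough sketch of the local-box download mentioned above (an illustration only, not the promised instructions; the user, host, and paths are assumptions, and it presumes key-based SSH so it can run unattended from cron): Code:
#!/bin/bash
# Hypothetical sketch: run nightly from cron on a home/local machine to pull
# down the backup the remote server made earlier that day. Assumes key-based
# SSH authentication and the same backup path as the script above.
scp backupuser@yourdomain.tld:/path/to/backups/`date --iso-8601`.sql /home/local-backups/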
#2
I get this error message when running it:
/bin/sh: /home/xxxxxx/backup.sh: /bin/bash : bad interpreter: No such file or directory
and the path to bash is "/bin/bash". Thanks
#3
You know, what I've never seen is an intelligent backup script. Backups are a time-consuming process for the server and can easily bog things down, especially if you have huge forums... Why has no one written a backup script that takes this into account?
Two optimisations that I can think of straight away: First, you don't NEED to back up everything. There are tables like forums, options, plugins and probably a lot of others that only change very infrequently. Post and thread tables only really need to back up new/changed data. Has anyone got a good script for doing something like the above?
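For what it's worth, a rough sketch of that kind of selective dump might look like the one below. This is an illustration only, not a tested script from this thread: the table names and the dateline column are assumptions based on a typical vBulletin schema, and the credentials and paths are placeholders. Code:
#!/bin/bash
# Hypothetical sketch of a selective/incremental backup.
# Assumptions: vBulletin-style table names and a unix-timestamp `dateline`
# column on the post table.
USER=dbusername
PASS=Password
DB=dbname
OUT=/path/to/backups

# Rarely-changing tables: dump them in full, they are small anyway.
mysqldump --opt -Q -u $USER -p$PASS $DB forum setting plugin > $OUT/static_`date --iso-8601`.sql

# Post table: only rows from the last 7 days, as a crude incremental dump.
SINCE=`date -d '7 days ago' +%s`
mysqldump -Q -u $USER -p$PASS --no-create-info --where="dateline > $SINCE" $DB post > $OUT/post_incr_`date --iso-8601`.sql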
#4
FWIW, my DB is around 500MB (200k posts, attachments in the filesystem) and the whole process takes about 30 seconds. The reason I do it this way is that if my server dies, I don't want to have to repair some tables from a corrupt DB and search for the stable ones - I want to just dump the whole thing into a new DB in one line and be back up and running. I use my datestamp filesystem backup script as well, and then just tar them both up into a big tarball that I use yet another script to grab with wget from my home server nightly (roughly sketched below).
I can see where this would be an issue for really big boards, but again, that's why I posted it as a How-To and didn't make it into a plugin/hack.
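A rough sketch of that tar-and-fetch arrangement might look something like the following; the paths, filenames, and URL are hypothetical, not the poster's actual scripts. Code:
#!/bin/bash
# Hypothetical sketch: bundle the night's SQL dump and filesystem backup
# into one tarball that another machine can fetch. Paths are assumptions.
STAMP=`date --iso-8601`
tar -czf /home/backups/site_$STAMP.tar.gz /home/backups/$STAMP.sql /home/backups/files_$STAMP.tar

# On the home server, a nightly cron job would then pull it down, e.g.:
# wget http://yourdomain.tld/backups/site_$STAMP.tar.gz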
#5
Quote:
#6
I'm new to this; pardon if this question is stupid:
What happens if there's activity in the forum during the backup? Won't that corrupt the backup? Won't it be necessary to temporarily close the forum while the backup is in progress?
#7
It won't corrupt the backup, and you don't need to close your forums. The --opt switch used in the scripts above implies --lock-tables, so mysqldump locks the tables while it dumps them and the file you get is internally consistent.
(Pardon the late reply)
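If you want the locking behaviour to be explicit rather than relying on --opt's defaults, a hedged variant of the dump line is shown below. This is an illustration only; --single-transaction is only relevant if your tables are InnoDB, whereas a stock vBulletin 3 install of this era was typically MyISAM. Code:
# MyISAM tables: lock them while dumping (this is what --opt already implies).
mysqldump --opt -Q --lock-tables -u dbusername -pPassword dbname > backup.sql

# InnoDB tables: take a consistent snapshot without blocking posters.
mysqldump --opt -Q --skip-lock-tables --single-transaction -u dbusername -pPassword dbname > backup.sql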
#8
I found this script to be very helpful when backing up the db. My apologies, I don't mean to piss on your wheaties.
http://www.debianhelp.co.uk/mysqlscript.htm |
#9
No apology needed, just passing along the handy things I do on my own server.