database backup and archiving/removal


Auron
05-22-2008, 04:46 PM
OK, here's the deal.

I've finally managed to set up a PHP-based cron job, as my other cron method didn't work too well. Basically, the script I use backs up *ALL* databases on the server, including one or two non-vB databases. However, due to the size of the forum I run, the dumps get rather large, and I mean 200MB a pop, which builds up over time (thankfully it's being gzipped now).

What I want is to remove files older than 60 days; in short, I only want 60 days' worth of archived databases (as my friend said, if one of us is offline and the other is busy for more than a week or two, as happens, there are still older backups to fall back on). Anyway, I came up with this after searching around:

find /path/to/backups -type f -name "*.gz" -ctime +60 -exec rm -v {} \;

Would this command clean up all the dumps older than 60 days without touching those that are 59 days old or newer? (As far as I've worked it out, this command would only touch files ending in .gz, if my reading of it is correct.)

snakes1100
05-22-2008, 07:37 PM
find /path/to/backups -type f -name "*.gz" -mtime +60 -exec rm -rf {} \;
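(For reference: the switch from -ctime to -mtime matters here. -mtime tests the file's last modification time, which is the safer check for dump files that are written once and never edited, while -ctime reacts to any inode change, such as a chmod. A dry run with -print in place of the -exec action lists what would be removed without deleting anything:)

find /path/to/backups -type f -name "*.gz" -mtime +60 -print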

Auron
05-23-2008, 11:52 AM
Seems I'm back to square one again. I managed to work out the removal code for old backups... however, the scripts that back up my databases now fail.

Yesterday they worked; today I get 500 errors off them.

Basically, after the forum cron script messed up, I used another one that was planned to take over. The cron test worked with the following command:

/usr/local/bin/php -q /home/account/public_html/sqlbackup/backup_dbs.php

The script basically creates a temp dir, exports *ALL* databases on the server, gzips them, and then tars them all together. The find command from earlier was what was needed to get rid of any backups that hadn't been edited or touched for more than 60 days...

I don't know what changed on the server since last night, but it's frustrating, because the moment something works, the next day it doesn't, and all because I had to upgrade to TrueType support for the 3.7.0 upgrade, which has left me with more server problems than I ever expected.

snakes1100
05-23-2008, 02:07 PM
If it's a dedicated server, just use a cron file; there's no need for a PHP-based backup script.

Either way, check the Apache error_log file for the 500 message and find out why it's failing; it could be something as simple as changing the script's permissions from 644 to 755.
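For example, on the script path mentioned above, that permission change would be:

chmod 755 /home/account/public_html/sqlbackup/backup_dbs.php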

Auron
05-23-2008, 02:58 PM
Off to take a look now. I had to do some digging for the file though, because when we asked the host to revert some changes made by WHM (which had messed things up), they messed up the error log in cPanel.

Will update the thread with my findings.

--------------- Added 05-23-2008 ---------------

Looked in the error logs, and I can find no mention of any error with it; hell, I can't even find a 500 error in the error_log from me manually hitting the PHP file, which did return a 500 after executing.

snakes1100
05-23-2008, 06:11 PM
Try checking the domain's own log file; you may have to turn on debugging as well.

Typically, 500 errors are related to a permission issue or a bad line in an .htaccess file.

Auron
05-23-2008, 06:21 PM
I've partially resolved the issue, but I still have problems with the large DBs I have on the server; the one over 200MB doesn't seem to want to export. T_T

snakes1100
05-24-2008, 03:01 AM
Well, a 200MB DB shouldn't have any issue exporting, so I would say it's your script.

As I said, I would just set up a cron job in /etc/cron.daily and quit using the PHP-based script.
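A minimal sketch of what that could look like, assuming the MySQL credentials live in root's ~/.my.cnf so no password sits in the script (the filename db-backup is just an example; anything executable dropped into /etc/cron.daily is run once a day by run-parts, and on many distros the filename must not contain a dot):

#!/bin/sh
# /etc/cron.daily/db-backup (hypothetical name)
# dump every database on the server in one go, gzip it, and date-stamp the file
mysqldump --all-databases | gzip > /path/to/backups/all-dbs-$(date +%Y%m%d).sql.gz
# prune archives older than 60 days, per the find command earlier in the thread
find /path/to/backups -type f -name "*.gz" -mtime +60 -exec rm -v {} \;

Make it executable with chmod 755 /etc/cron.daily/db-backup and it needs no crontab entry at all.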

Auron
05-24-2008, 10:31 AM
I was trying to use crontab:

crontab -e

and edit things in like you would in cPanel's cron area.

The weird thing is the damn file wouldn't save, giving me errors saying the crontab wouldn't install. >_< I'll see what my host comes up with, as a last-ditch effort, as to why the damn thing won't work; if not, I'll look at cron.daily and cron.weekly, as those are all I need.

I take it the format is pretty much the same, e.g. * * * * * /command/to/run,

and the method of editing and saving is the same too.
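If crontab -e does start cooperating, a daily entry for the existing PHP script might look like this (the five fields are minute, hour, day of month, month, and day of week; 4 AM is just an example time):

0 4 * * * /usr/local/bin/php -q /home/account/public_html/sqlbackup/backup_dbs.php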

--------------- Added 05-24-2008 ---------------

Quoting snakes1100: "Well, a 200MB DB shouldn't have any issue exporting, so I would say it's your script. As I said, I would just set up a cron job in /etc/cron.daily and quit using the PHP-based script."

I'm using that script because it does all the databases, and it's the only method I found which doesn't require me to make 9 cron jobs (I have 9 different forums to back up, along with their galleries, so that's 2 DBs each).

snakes1100
05-25-2008, 01:22 PM
A simple cron script will dump all the DBs as well in a single command; there is no need for 9 separate cron jobs.
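(One way to read that: mysqldump --all-databases, as sketched above, already covers every database on the server in one line. If separate files per forum are preferred, a short loop in the same cron script still keeps it to one job; a sketch, again assuming credentials in ~/.my.cnf:)

# one date-stamped, gzipped dump per database, skipping the schema pseudo-database
for db in $(mysql -N -B -e 'SHOW DATABASES' | grep -v '^information_schema$'); do
    mysqldump "$db" | gzip > /path/to/backups/"$db"-$(date +%Y%m%d).sql.gz
done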

Auron
05-25-2008, 01:53 PM
Found a shell script which should do the job, although at the moment we are using a PHP-based one... it was my job to implement it. I guess we have too much data on the server; the wiki just takes the biscuit, but I guess we are getting there. I will note though that we have managed to weed what we need down to 2 cron jobs through crontab, although with that shell script we found, we might edit it into something that cuts down more of the crons we used.

Now we just need to get wput working well, and/or find a better yet reasonably priced host for backups. We currently use servage.net for them, but we have found that the FTP is sadly unreliable and keeps on causing issues. I guess 300-500GB of transfer (not including the gig of wiki images) would be required, along with 10-20GB of disk space. If anyone knows of a reasonable service that's priced well just for backups, it would be appreciated.
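(For the wput side, the basic invocation is just the file followed by the destination URL; a sketch, with hypothetical credentials and hostname:

wput /path/to/backups/all-dbs-20080525.sql.gz ftp://user:password@backuphost.example/backups/

Given how flaky the FTP target has been, checking wput's exit status in the cron script and retrying on failure would be worth it.)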

--------------- Added 05-25-2008 ---------------

Found the perfect solution, although it's not as editable as we had hoped (we wanted 60 days of daily backups), but I guess it's better than nothing; sadly, we don't know enough about bash to edit it to our liking. In any case, the PHP cron is now a bash script which does the same thing, but better.

http://www.debianhelp.co.uk/mysqlscript.htm