View Full Version : Backing up large databases


Krofh
04-10-2006, 07:15 PM
Hey,
What's the best way to dump a huge database? Our server tends to crash at the very end of dumping... our site isn't too huge, but it's still pretty big (360,000+ posts). I haven't been experimenting too much, simply because I don't want to crash the site if I mess something up :p
- Jesse

The Prohacker
04-10-2006, 09:04 PM
We use MySQL replication, and then a custom script copies the raw *.MYD, *.MYI, and *.frm files. You could use a tool like mysqlhotcopy or mysqlsnapshot to do faster backups than mysqldump.
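For MyISAM tables, mysqlhotcopy flushes and read-locks the tables, copies the raw table files, then unlocks them, which is usually much faster than mysqldump. A minimal sketch of an invocation (the user, password, database name, and destination path are all placeholders, not anything from this thread; the actual commands are commented out):

```shell
#!/bin/sh
# Placeholder destination, stamped with today's date:
DEST="/backup/hotcopy-$(date +%Y%m%d)"
# mysqlhotcopy flushes + read-locks the MyISAM tables, copies the
# .frm/.MYD/.MYI files into $DEST, then unlocks them:
# mkdir -p "$DEST"
# mysqlhotcopy --user=backup --password=secret forumdb "$DEST"
```

Note that mysqlhotcopy must run on the database server itself (it copies files, not a SQL stream), so it needs shell access.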

Erwin
04-11-2006, 01:36 AM
I just copy the relevant database directory manually.

bada_bing
04-11-2006, 02:17 AM
I have been using SQLyog to perform my backups. It works nicely and fast. I use the pro version. Does anyone else use this, and has anyone had to recover a backed-up database using this tool? My database is about 300 MB and I can back it up in less than 10 minutes.

Erwin
04-11-2006, 02:24 AM
300 MB is relatively small. My database is 20 GB. The chance of corruption is very high using indirect methods (I know from experience!).

bada_bing
04-11-2006, 02:45 AM
300 MB is relatively small. My database is 20 GB. The chance of corruption is very high using indirect methods (I know from experience!).
Erwin,
Aren't you one of the gods that run this site? hehehe

20 gigs! I feel like a small bug in the ocean now, thanks... just kidding.

Can you explain in more detail how you are doing your copies?

trackpads
04-12-2006, 07:59 AM
Found this - is this what you guys are talking about?

http://www.vbulletin.com/forum/showthread.php?t=134821

dan35
04-12-2006, 09:48 AM
Thanks for the link, trackpads!

trackpads
04-12-2006, 11:58 PM
Np, I was looking for the same thing!! I tried this tonight and it works great!

Freesteyelz
04-13-2006, 01:54 AM
Yeah. Since I don't have Telnet/SSH access on the server my vB is on, I've looked through various third-party apps in hopes of finding a reliable backup. Mysqlhotcopy seems to be the choice.

*I also have that page bookmarked. :)

Then again, since I only have a 123MB database, I could simply back it up via the Admin CP or phpMyAdmin.

The Chief
04-13-2006, 02:24 AM
300 MB is relatively small. My database is 20 GB. The chance of corruption is very high using indirect methods (I know from experience!).
A database of 20 GB, wow... lol

kerplunknet
04-13-2006, 06:31 AM
Erwin: You just "cp" the MySQL database(s) directory?

i.e.

cp -R /var/lib/mysql /other/cool/backup/directory

?

Erwin
04-13-2006, 07:31 AM
Essentially. I do it in a script: rm -Rf the oldest directory, mv dir1 to dir2 and so on to maintain a rotating list of backups, then cp.
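The rotation Erwin describes could be sketched like this (not his actual script; the paths are made up, and it assumes mysqld is stopped or the tables are flushed and read-locked while the cp runs, otherwise the copy can be inconsistent):

```shell
#!/bin/sh
# Keep the three most recent raw copies of the MySQL data directory.
# Usage: rotate_backup DATADIR BACKUPROOT
rotate_backup() {
    src=$1; root=$2
    rm -rf "$root/mysql.2"                          # drop the oldest copy
    [ ! -d "$root/mysql.1" ] || mv "$root/mysql.1" "$root/mysql.2"
    [ ! -d "$root/mysql.0" ] || mv "$root/mysql.0" "$root/mysql.1"
    cp -Rp "$src" "$root/mysql.0"                   # copy the live files last
}
# e.g., with MySQL stopped or tables read-locked:
# rotate_backup /var/lib/mysql /backup
```

The mv steps are cheap (same filesystem), so only the final cp touches the live data directory.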

Krofh
04-13-2006, 11:02 PM
Thanks for the recommendations everyone... We're moving to our new server tomorrow. It's only a 300 MB database, so hopefully it shouldn't be too hard.

parafieldtower
04-25-2006, 05:25 AM
Hello guys, I'm in a rather serious situation here. My server has gone haywire and I cannot SSH or FTP to it. The only thing that seems to be working is vBulletin. Everything else, such as the mail server and cron jobs, is not functioning. What's worse is that all the database backups I make every night are not accessible. What I need to do is back up the current database (somehow) so we can rebuild the server from scratch.

Now, I have tried to download the file using vBulletin's admin CP, but I think because of PHP's timeout, it got to about 45MB and stopped. The database should be around 250MB. Does anyone have any idea how to do this? Can we modify this timeout by uploading a plugin script? I'm using vBulletin 3.5.4.

The Prohacker
04-25-2006, 07:49 PM
If your server is having that many major issues, I would have your server provider mount the current disk as a slave and install a new primary drive with a fresh install of the operating system of your choosing. I would then get the new server configured and start copying information off the slave drive: the database, files, logs, etc.

COBRAws
04-25-2006, 09:02 PM
My database is 20 Gb.
Are attachments and avatars stored in the DB?
Is it vB.org?

xD

I just use SSH and do a mysqldump
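A typical shape for that mysqldump-over-SSH routine (the database name, user, and password below are placeholders, and the actual dump command is commented out):

```shell
#!/bin/sh
DB=forumdb                        # placeholder database name
STAMP=$(date +%Y%m%d)             # date stamp for the filename
DUMPFILE="$DB-$STAMP.sql.gz"
# --opt (the default) includes --lock-tables for a consistent MyISAM dump;
# --quick streams rows instead of buffering whole tables in memory, which
# matters for large databases; gzip keeps the file small:
# mysqldump --opt --quick -u backup -psecret "$DB" | gzip > "$DUMPFILE"
```

Without --quick, mysqldump tries to hold each table's result set in memory, which is one common reason big dumps die near the end.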

Erwin
04-26-2006, 07:53 AM
Not vB.org.

No, attachments/avatars/profile pics etc. are another 20 GB, in the file system. ;)

ramprage
04-27-2006, 12:08 AM
Prohacker, how is the replication set up? I'm not very familiar with this method.

The Prohacker
04-27-2006, 04:03 AM
http://dev.mysql.com/books/hpmysql-excerpts/ch07.html

We have several database servers for our sites, so we have a single slave server called dbbackup. It runs multiple instances of MySQL, all running as slaves replicating data from their respective masters. From there, a Perl/shell script locks all tables on the slave process, copies all the binary files, and then unlocks the tables when done.

We actually have 6 master DB servers, so it's rather interesting getting everything to replicate to a single server from which you can create backups.

This eliminates the table-locking issue on the master server, and the possibility of taking your sites offline for backups.

I highly recommend getting the book High Performance MySQL, which the above chapter is from.

-Mat
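A rough shape for that slave-side lock/copy/unlock step (a sketch under assumptions, not WHT's actual Perl/shell script; credentials and paths are invented). The one subtlety is that FLUSH TABLES WITH READ LOCK is released as soon as the client session disconnects, so the file copy has to run from inside the same session (the mysql client's `system` command) rather than from a second invocation:

```shell
#!/bin/sh
SNAPDIR="/backup/slave-$(date +%Y%m%d)"   # placeholder destination
# All statements run in one mysql session so the read lock stays held
# while the raw table files are copied:
SQL="STOP SLAVE;
FLUSH TABLES WITH READ LOCK;
system cp -Rp /var/lib/mysql $SNAPDIR
UNLOCK TABLES;
START SLAVE;"
# printf '%s\n' "$SQL" | mysql -u backup -psecret
```

Since this runs on the slave, the master (and the live site) never sees the lock; the slave simply catches up on replication afterwards.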

Erwin
04-28-2006, 12:39 AM
All I can say is wow. That's a lot of master DB servers.

The Prohacker
04-28-2006, 05:14 PM
None of them are linked with each other :) WHT, dBforums, HotScripts, DeletedDomains, and our ad server each have their own servers, with the smaller sites scattered across them to fill in the gaps. We are at capacity at this point. I would love to move to a mCluster-type system, but I am skeptical. It's a job keeping replication going at times. I don't envy having to host a single site the size of yours, Erwin. My biggest problem with replication is the DeletedDomains database, which has 2,500 tables and is 150 GB.

-Mat

Erwin
04-29-2006, 06:14 AM
Holy moly, you own WHT and HotScripts? :) Yes, mine is a single DB, which has its own issues.

DevilYellow
05-06-2006, 02:10 AM
At what point does it really matter?

Is mysqldump still OK for DBs in the range of a few GB?

I just hit 1 GB and plan on splitting to two new servers in about 12 months. I am still on MySQL 3.

Xorlev
05-07-2006, 05:35 PM
Oh boy... I recommend you get up to at least MySQL 4.1. mysqldump should still work, but I'm not too sure about that, as my database is currently only a few hundred megabytes, not twenty gigabytes.

DevilYellow
05-07-2006, 10:34 PM
I am going to build a new DB server and run MySQL 5... then somehow redo the current server with a new OS, new PHP, and maybe LiteSpeed httpd.

I do not know how I am going to do that, though.

Erwin
05-07-2006, 11:29 PM
I just did the same: I now run MySQL 5.1 on a new OS, using lighttpd.

kmike
05-08-2006, 06:17 PM
Here's an interesting backup technique: using LVM to take live snapshots of the database (almost) without any downtime:
http://pointyhair.com/tiki-view_blog_post.php?blogId=1&postId=5
http://mike.kruckenberg.com/archives/2006/05/mysql_backups_u.html
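The gist of those links, roughly (volume group, logical volume, and mount point names are placeholders; the data directory must live on an LVM logical volume, and, as with any FLUSH TABLES WITH READ LOCK approach, the lock only holds while that session stays open, so lvcreate must run before you disconnect). The privileged commands are commented out:

```shell
#!/bin/sh
STAMP=$(date +%Y%m%d)
# 1. Briefly quiesce MySQL and snapshot the volume (seconds of locking):
#      mysql>  FLUSH TABLES WITH READ LOCK;   -- keep this session open
#      # lvcreate --snapshot --size 1G --name mysql-snap /dev/vg0/mysql
#      mysql>  UNLOCK TABLES;
# 2. Mount the snapshot and copy it out at leisure:
#      # mount /dev/vg0/mysql-snap /mnt/snap
#      # tar czf "/backup/mysql-$STAMP.tar.gz" -C /mnt/snap .
# 3. Clean up (snapshots cost copy-on-write space while they exist):
#      # umount /mnt/snap && lvremove -f /dev/vg0/mysql-snap
```

The site only pauses for the lvcreate, not for the whole copy, which is why this scales to multi-gigabyte databases.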

jcerious
06-10-2006, 07:43 PM
Thanks for the recommendations everyone... We're moving to our new server tomorrow. It's only a 300 MB database, so hopefully it shouldn't be too hard.


Only 300 MB - mine's only 11! I feel like a bug in your ocean!

Wynand
06-19-2006, 05:55 PM
I'm backing up my database automatically. A crontab entry (set up over SSH) makes it automatic and removes the old ones: a backup every hour (when the forum gets bigger, possibly every minute..) and deletion of the backup from two days ago :)

Just a little Linux knowledge :cool: :banana:
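A sketch of that setup (the script name, paths, and credentials are invented, not Wynand's actual files). The crontab line runs the script hourly, and find prunes any dump older than two days:

```shell
#!/bin/sh
# /usr/local/bin/forum-backup.sh  (hypothetical path)
# Install hourly via `crontab -e`:
#   0 * * * * /usr/local/bin/forum-backup.sh
BACKUPDIR=${BACKUPDIR:-/backup}
STAMP=$(date +%Y%m%d-%H%M)
# Dump and compress (placeholder credentials):
# mysqldump --opt --quick -u backup -psecret forumdb | gzip \
#     > "$BACKUPDIR/forumdb-$STAMP.sql.gz"
# Prune dumps older than two days:
if [ -d "$BACKUPDIR" ]; then
    find "$BACKUPDIR" -name 'forumdb-*.sql.gz' -mtime +2 -exec rm -f {} \;
fi
```

The timestamp in the filename keeps hourly dumps from overwriting each other, and -mtime +2 is what implements the "delete the two-day-old backup" part.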

Erwin
06-19-2006, 10:20 PM
Won't work for large databases. :)

Wynand
06-20-2006, 08:01 AM
Won't work for large databases. :)

:surprised:

In what way won't it work?

Marco van Herwaarden
06-21-2006, 11:44 AM
I doubt that, if your database is getting really big, you can afford to create a backup every minute.

More likely, the interval between backups will be increased as the database grows.