I've never looked into backing them up incrementally. As I mentioned, ours is just 3.5 GB right now and so I haven't looked into it yet. The script I use is a modification of something I found online somewhere. I would love to find some way to do it better than how I do it. But, here's my basic script:
PHP Code:
<?php
/*======================================================================*\
|| #################################################################### ||
|| # Cron job to back up all avatars, attachments, and torrents        ||
|| # Last Modified: February 26, 2006                                  ||
|| #################################################################### ||
\*======================================================================*/
$backupdate = date("Y-m-d");

$backupdir1 = "/my/path/to/torrents/";
// Where are the files located?
// ie. "/home/sitename/public_html/tracker/torrents/"

$files = "*";
// Which files to back up? Use * to back up all the files
// inside the folder entered above.

$backupto = "/my/path/to/backups/";
// Where to store the tarball?
// Make sure to put this outside your main web folder,
// ie. "/home/sitename/backups/"

$fileprefix1 = "torrents";
// This is the prefix that will be added before the date,
// ie. torrents2006-02-25.tar

$tararg = "-cf";
// The tar arguments:
// -cf  to archive
// -cjf to archive and bzip2
// -tf  to list the contents later on
// -xf  to extract them all later on

// Call the function
backupsus();

function backupsus() {
    global $backupdate, $backupdir1, $backupto,
           $fileprefix1, $tararg, $files;
    $backupattach = "cd $backupdir1;
        tar $tararg {$fileprefix1}{$backupdate}.tar $files;
        mv {$fileprefix1}{$backupdate}.tar $backupto";
    passthru($backupattach);
}
?>
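On the incremental question: I haven't tried this in production, but GNU tar has a `--listed-incremental` option that keeps a snapshot file and only archives files changed since the last run (the first run is a full backup). A rough sketch of how the command could be built, in the same style as the script above, might look like this. The function name, snapshot filename, and paths here are just made-up examples, and `escapeshellarg()` is used so odd characters in paths don't break the shell command:

```php
<?php
// Hedged sketch: build a GNU tar incremental-backup command.
// Assumes GNU tar is available; the .snar snapshot file is where
// tar records what it has already archived between runs.
function buildIncrementalCommand(string $srcDir, string $destDir, string $prefix): string
{
    $date     = date("Y-m-d");
    $snapshot = $destDir . $prefix . ".snar";          // tar's incremental state file
    $archive  = $destDir . $prefix . $date . ".tar";   // e.g. torrents2006-02-25.tar
    // --listed-incremental makes tar archive only files changed since
    // the snapshot file was last updated; -C changes into the source
    // directory so the archive holds relative paths.
    return sprintf(
        "tar --listed-incremental=%s -cf %s -C %s .",
        escapeshellarg($snapshot),
        escapeshellarg($archive),
        escapeshellarg($srcDir)
    );
}
?>
```

You'd then run the returned string through `passthru()` the same way as above. To restore, you extract the full backup first and then each incremental archive in order.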