My backup strategy - this runs as a daily cron job.
My web root and the databases are on /usr, not /var, and /usr is on a 40 GB IDE drive. The rest of the system is on a 10k RPM SCSI drive. No reason for the web content to be faster than my internet connection.
I get one copy of the backup on each spindle and then manually copy one of the backups to a different machine once a week or so. I guess I could automate that if I wanted to (there's a sketch of that after the script).
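For reference, the cron entry looks something like this. The time and the script path here are just placeholders, not copied from my actual crontab:

# Run the backup every night at 3:30am
30 3 * * * /root/bin/backup.sh

And the script itself: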
#!/bin/sh
# Stop the web server so nothing writes to the databases mid-dump
service httpd stop
# Dump each database. "username" and "password" are placeholders;
# note there is no space between -p and the password.
mysqldump -u username -ppassword -c vb_bassforum > /usr/www/archive/vb_bassforum.sql.dump
mysqldump -u username -ppassword -c pointbeing > /usr/www/archive/pointbeing.sql.dump
mysqldump -u username -ppassword -c hcbf > /usr/www/archive/hcbf.sql.dump
# Bring the web server back up
service httpd start
# Copy the mail spools into the archive as well
cp /var/spool/mail/* /usr/www/archive/mail
# Change dump permissions so I can restore the databases remotely with phpMyAdmin if I need to.
chown wizard /usr/www/archive/*.dump
chgrp apache /usr/www/archive/*.dump
chmod 660 /usr/www/archive/*.dump
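# If phpMyAdmin isn't handy, a dump can also be loaded back from the shell,
# something like this (same placeholder username/password as above):
#   mysql -u username -ppassword vb_bassforum < /usr/www/archive/vb_bassforum.sql.dump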
# Tar the whole web tree (dumps included) onto the SCSI drive...
tar cf /archive/backup.tar /usr/www/*
# ...and keep a second copy on the IDE drive, one per spindle
cp /archive/* /usr/archive
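If I ever automate the weekly copy to the other machine, a second weekly cron job running something like this would probably do it. Just a sketch: "backuphost" and the destination path are made up, and it assumes rsync over ssh with key-based login already set up between the boxes:

#!/bin/sh
# Push the tarball to another machine (hypothetical host and path)
rsync -a /archive/backup.tar backuphost:/home/wizard/backups/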