Which, again, brings me to the point: is FTPing to the site and dumping all the folders onto my HD the correct way to do a backup? Is there not an app that can do incremental or differential backups? Last time it took me like 6 hours to get it all copied...
It took you 6 hrs to download all the files from your FTP? If so, either you have a slow connection or a very slow host. I use this
I use that as well but it only backs up the database.
OP - Who is your host? Usually they have some type of backup system that lets you back up everything, though FTP will work as well. I use Filezilla on occasion for this and just drag the public_html folder to local storage on your PC to back it up.
Why not use a Cron Job to compress the public_html folder and store it on the server as a timestamped backup? You could probably even keep only X days' worth and auto-delete anything older. It seems simpler to download a single compressed file. And if you need to restore, upload that one compressed file and uncompress it on the server.
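Something like the sketch below could do it. All the paths are assumptions (adjust them to your host's layout), and for illustration it builds a tiny demo public_html in a temp dir so it runs anywhere; on a real host you'd point SITE at $HOME/public_html instead.

```shell
#!/bin/sh
# Sketch of a nightly "compress + timestamp + prune" backup.
# Demo paths only -- on a real host, set BASE="$HOME" and drop the demo setup.
BASE=$(mktemp -d)
SITE="$BASE/public_html"      # folder to back up (assumption)
BACKUP_DIR="$BASE/backups"    # where archives are kept (assumption)
KEEP_DAYS=7                   # retention window in days

# Demo setup so the script is self-contained:
mkdir -p "$SITE" "$BACKUP_DIR"
echo '<html>demo</html>' > "$SITE/index.html"

# 1. Compress the site folder into a timestamped archive.
STAMP=$(date +%Y-%m-%d_%H%M%S)
tar -czf "$BACKUP_DIR/site_$STAMP.tar.gz" -C "$BASE" public_html

# 2. Auto-delete archives older than KEEP_DAYS days.
find "$BACKUP_DIR" -name 'site_*.tar.gz' -mtime +"$KEEP_DAYS" -delete

ls "$BACKUP_DIR"
```

A crontab line such as `0 3 * * * sh $HOME/backup.sh` (hypothetical path) would then run it nightly at 3 AM; then you only ever download one .tar.gz instead of thousands of small files over FTP.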
Then I recommend either copying everything except the 'stuff' directories over to your PC, or temporarily renaming the 'stuff' directories and then moving them over to your PC.
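If the server-side archive route is an option, skipping the 'stuff' directories is easier than renaming them: GNU tar can leave them out with --exclude. A minimal sketch, using a demo tree in a temp dir (the directory names are assumptions from the post above):

```shell
#!/bin/sh
# Sketch: archive public_html while excluding the large 'stuff' directory.
BASE=$(mktemp -d)    # demo location; on a real host use $HOME

# Demo tree so the script is self-contained:
mkdir -p "$BASE/public_html/stuff" "$BASE/public_html/images"
echo page > "$BASE/public_html/index.html"
echo big  > "$BASE/public_html/stuff/video.mp4"

# --exclude matches archive member names, so 'stuff' never goes in.
tar -czf "$BASE/partial_backup.tar.gz" \
    --exclude='public_html/stuff' \
    -C "$BASE" public_html

tar -tzf "$BASE/partial_backup.tar.gz"
```

Listing the archive should show index.html and images/ but nothing under stuff/, so the download stays small and the big directories never leave the server.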