Connect to your box via SSH and run the following commands to create a backup of your site:

cd ~
mkdir Backup
nohup zip -r Backup/YYYY-MM-DD-HHMM.zip www/ > backup_log.txt &
(Replace YYYY with the 4-digit year, MM with the 2-digit month, DD with the 2-digit day, HH with the hour in 24-hour format, and the final MM with the 2-digit minute. For example, a backup made March 5, 2024 at 2:30 pm would be named 2024-03-05-1430.zip.)
cd ~ navigates to your home folder
mkdir Backup creates the backup directory in which the backups will be stored
nohup is short for "no hangup" and allows processes started by users at the terminal to continue running even after the user logs out
zip is a program which combines many files into one and compresses them to make the end result even more portable
-r tells zip to burrow into all subdirectories in order to grab all of the files
Backup/YYYY-MM-DD-HHMM.zip is the path to the backup file
www/ is the directory to backup (it may be html, htdocs, httpdocs, etc. on your box)
> backup_log.txt redirects all output from zip to the backup_log.txt file so you can review the file later
& tells Linux to run the zip program in the background so that you can log out or perform other tasks without killing the process
Now all you need to do is download that zipped file. Use your favorite SFTP client to log in to your box and snag it. I recommend FileZilla Client for all platforms. If you’re looking for an FTP server, FileZilla Server is perfect.
2 responses to “How to backup your website”
Why not take your basic flow from above, make it a bit more robust, throw it in a script, and put it in cron so it runs periodically? Remember, the best backups are the ones you don’t have to remember to create.
I agree with Josh, a cron job is a much better idea. Also, you should back up all of your databases periodically.