User:Southparkfan/BackupsDraft

This page describes a draft for a backup plan.

Technical plan

 * A cronjob on mw1 (run 4 times per month) invokes a bash script.
 * That bash script will call dumpBackup.php for all public wikis (with the possibility to opt out big wikis) and save the output as gzipped XML dumps. For each wiki in turn, the gzipped XML file is created, transferred to the backup server with SCP or rsync, and then deleted from mw1; the loop continues with the next wiki to be backed up. The two most recent dumps will be kept on the backup server, and older ones will be deleted automatically.
 * A cronjob on db1 (run 2 times per month) invokes a bash script.
 * That bash script will dump all databases into multiple gzipped files (a dedicated file for each database). The same pattern as above applies: dump the database gzipped, transfer it to the backup server, delete the local file, and continue with the next database.
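The mw1 dump loop described above could be sketched roughly as follows. The wiki list, paths, and the `backup1` host name are illustrative assumptions, not confirmed configuration; the script defaults to a dry-run mode that only prints the commands it would execute.

```shell
#!/bin/bash
# Sketch of the mw1 XML dump cronjob. All concrete names below (wiki list,
# paths, the "backup1" host) are assumptions for illustration only.
DRY_RUN=${DRY_RUN:-1}                 # set DRY_RUN=0 to actually execute
run() { if [ "$DRY_RUN" = 1 ]; then echo "+ $*"; else "$@"; fi; }

WIKIS="metawiki testwiki"             # all public wikis, minus opted-out big wikis
DEST="backup1:/srv/backups/xml"
STAMP=$(date +%Y%m%d)

for wiki in $WIKIS; do
    dump="/tmp/${wiki}-${STAMP}.xml.gz"
    # dumpBackup.php writes XML to stdout; gzip it straight to disk
    run sh -c "php maintenance/dumpBackup.php --wiki=$wiki --full | gzip > $dump"
    run scp "$dump" "$DEST/"
    run rm -f "$dump"                 # free space on mw1 before the next wiki
    # keep only the two most recent dumps per wiki on the backup server
    run ssh backup1 "ls -t /srv/backups/xml/${wiki}-*.xml.gz | tail -n +3 | xargs -r rm -f"
done
```

Pruning with `ls -t | tail -n +3` keeps the two newest files and removes the rest, matching the retention rule above.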
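The db1 database loop might look like this sketch (again with placeholder names and the same dry-run default):

```shell
#!/bin/bash
# Sketch of the db1 SQL dump cronjob. Database names and the "backup1"
# destination are illustrative assumptions.
DRY_RUN=${DRY_RUN:-1}                 # set DRY_RUN=0 to actually execute
run() { if [ "$DRY_RUN" = 1 ]; then echo "+ $*"; else "$@"; fi; }

DATABASES="metawiki testwiki"         # in practice: mysql -N -e 'SHOW DATABASES'
DEST="backup1:/srv/backups/sql"
STAMP=$(date +%Y%m%d)

for db in $DATABASES; do
    dump="/tmp/${db}-${STAMP}.sql.gz"
    # one dedicated gzipped file per database, as described above
    run sh -c "mysqldump --single-transaction $db | gzip > $dump"
    run scp "$dump" "$DEST/"
    run rm -f "$dump"                 # delete the local copy, then continue
done
```

`--single-transaction` lets mysqldump take a consistent snapshot of InnoDB tables without locking them for the whole dump.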

Resource estimations

 * 1x 128MB CVZ (name: dumps1 or backup1).

Extra technical information

 * This server should only be accessible by operations staff.
 * No other APT packages or software should be needed for this task.

Likely problems

 * mw1 and db1 might suffer from significantly increased load while dumps are running.
 * Solution: there is no quick fix for this. Either dump less frequently, look into reducing the performance impact, or migrate db1/mw1 to SVZ nodes.
 * The backup server will run out of space at some point.
 * Solution: this is not expected to happen soon. If it ever does, we should consider storing fewer dumps, using better compression (e.g. 7z/bzip2), or upgrading the backup server.