From Miraheze Meta, Miraheze's central coordination wiki

This page describes a draft for a backup plan.

Technical plan

  • A cronjob on mw1 (run four times per month)
    • A bash script will call dumpBackup.php for every public wiki (with the possibility for big wikis to opt out) and save the output as gzipped XML dumps. For each wiki, the XML file is first created (gzipped), transferred to the backup server with SCP or rsync, and then deleted from mw1; the script loops over all wikis that should be backed up. The two most recent dumps per wiki will be kept, and older ones will be deleted automatically.
  • A cronjob on db1 (run twice per month)
    • A bash script will dump every database to its own gzipped file. The same procedure as above applies: dump the database (gzipped), transfer it to the backup server, delete the local file, and continue with the next database.
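The two cronjob scripts above could be sketched roughly as follows. This is a minimal illustration, not the actual implementation: the paths (/srv/mediawiki, /etc/wikilist.txt), the backup host name (backup1), and the function names are all assumptions.

```shell
#!/usr/bin/env bash
# Sketch of the planned backup scripts. Hypothetical paths and host
# names throughout; only the dump -> compress -> transfer -> delete
# loop and the two-dump retention policy come from the plan itself.

KEEP=2  # the plan keeps the two most recent dumps per wiki

# Delete all but the $2 newest "*.xml.gz" files in directory $1.
prune_old_dumps() {
    local dir="$1" keep="$2"
    # ls -1t sorts newest first; tail skips the ones we keep.
    ls -1t "$dir"/*.xml.gz 2>/dev/null | tail -n +$((keep + 1)) | xargs -r rm --
}

# mw1 cronjob: dump each public wiki, transfer, then delete locally.
backup_all_wikis() {
    while read -r wiki; do
        local out="/tmp/${wiki}-$(date +%Y%m%d).xml.gz"
        php /srv/mediawiki/maintenance/dumpBackup.php \
            --wiki="$wiki" --full | gzip > "$out"     # gzipped XML dump
        rsync -a "$out" "backup1:/srv/dumps/${wiki}/" # transfer to backup server
        rm -- "$out"                                  # free space on mw1
    done < /etc/wikilist.txt  # hypothetical list of wikis to back up
}

# db1 cronjob: one dedicated gzipped file per database.
dump_all_databases() {
    while read -r db; do
        local out="/tmp/${db}-$(date +%Y%m%d).sql.gz"
        mysqldump "$db" | gzip > "$out"
        rsync -a "$out" "backup1:/srv/dumps/sql/"
        rm -- "$out"
    done < <(mysql -N -e 'SHOW DATABASES')
}
```

The retention helper runs on the backup server after each transfer, so old dumps are rotated out automatically as new ones arrive.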

Resource estimations

  • 1x 128MB CVZ (name: dumps1 or backup1).

Extra technical information

  • This server should only be accessible by operations staff.
  • Apart from a running web server (so people can download XML dumps of public wikis), no additional APT packages or software should be needed on the backup server for this task.
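The web-server side could be as simple as a static file listing. A minimal sketch, assuming nginx; the hostname and the /srv/dumps path are hypothetical, not decided in this plan:

```nginx
# Hypothetical nginx site config for the backup server (dumps1/backup1).
# Serves the public XML dumps read-only, with directory listings enabled.
server {
    listen 80;
    server_name dumps.miraheze.org;  # hypothetical hostname

    root /srv/dumps;   # assumed dump storage path
    autoindex on;      # let visitors browse per-wiki directories

    location / {
        # Only public wiki dumps would be stored in this tree,
        # so no authentication layer is needed for downloads.
        try_files $uri $uri/ =404;
    }
}
```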

Likely problems

  • mw1 and db1 may struggle with the significantly increased load.
    • Solution: there is no quick fix. Either dump less frequently, accept the reduced performance, or migrate db1/mw1 to SVZ nodes.
  • The backup server will run out of space at some point.
    • Solution: this is not expected to happen soon. If it ever does, we should consider storing fewer dumps, using better compression (e.g. 7z/bzip2), or upgrading the backup server.

Questions from the audience

  • Will dumps be publicly available? *(wikis are public on the internet anyway)*
    • PROS: May allow wiki owners to host their backups independently
    • CONS: May increase transfer costs
Yes, dumps of public wikis can be downloaded. The backup server will have 500GB bandwidth, which should be sufficient for now. Southparkfan (talk) 16:10, 28 September 2015 (UTC)
  • Do we intend to submit to
Yes, but people should download and upload the dumps manually. Southparkfan (talk) 16:10, 28 September 2015 (UTC)