Backups

Miraheze has an internal and external backup procedure. Following the schedule below, certain critical parts of our infrastructure are automatically backed up to an external server (external meaning under our control but provided by a different server host and located in a different country than all our current servers). These internal backups include full database dumps, which contain user account information and CheckUser information. They are only accessible by our Site Reliability Engineering team and can be used in the event of a catastrophic site failure to quickly bring the entire site back up to a recent state.

Backups of a wiki

In addition to these private backups, any wiki administrator can create an XML or image backup of their wiki by going to Special:DataDump on their wiki and selecting the XML or image dump type. These backups can then be stored securely wherever you like. The XML backups do not include user account information or CheckUser information, but they do contain wiki page text and logs that you can import into any MediaWiki site. The image dump contains all files uploaded to the wiki, but without their descriptions or licensing information.
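If you later want to restore such an XML backup to another MediaWiki installation where you have shell access, one common approach (a general MediaWiki sketch, not a Miraheze-specific procedure; the file path is only a placeholder) is the importDump.php maintenance script, followed by rebuilding the recent changes table. Wikis without shell access can use Special:Import instead:
php maintenance/importDump.php path/to/yourwiki.xml
php maintenance/rebuildrecentchanges.php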

General backup schedules

Miraheze runs two backup schedules in production:

  • The following are backed up in their entirety every Sunday:
    • Our Private Git repository (stored on puppet1)
      • This includes configured passwords, private keys, and certificates for our domains
      • This includes the original source of private keys and certificates, as well as our account information for Let's Encrypt (the CA we use for free certificates)
  • The following are backed up completely on the first Sunday of every month. Changed files and new files are backed up on the third Sunday of the month:
    • Databases, including user information, for all wikis
    • Our static content (wiki images, user XML dumps)
    • Phabricator static (content used by our tracking software)

Local backups

To make backups to your local PC, use WikiTeam's dumpgenerator.py Python script. The script is run from the command line, requires Python 2.7, and produces an XML dump with page histories and a folder of files (but not user accounts or extensions). WikiTeam's tutorial offers further details. Note that large wikis may fail to export, leaving an incomplete XML dump. The presence of a siteinfo.json file generally indicates a successful XML dump.

Example usage:
python dumpgenerator.py --api=https://yourwiki.miraheze.org/w/api.php --xml --images
For private wikis use:
python dumpgenerator.py --api=https://yourwiki.miraheze.org/w/api.php --xml --images --user=yourlogin --pass=yourpassw
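If a dump is interrupted partway through (which can happen on large wikis), it can usually be resumed by pointing the script at the dump directory created on the first run using the --resume and --path options; the directory name below is only a placeholder for whatever dumpgenerator.py created:
python dumpgenerator.py --api=https://yourwiki.miraheze.org/w/api.php --xml --images --resume --path=yourwiki-dumpdirectory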

See also

For more technical details on our automatic backup server, see Tech:Bacula.