Backups

Miraheze maintains internal backups of its infrastructure. Following the schedule below, critical parts of our infrastructure are automatically backed up to an external server ('external' meaning under our control but provided by a different server host and located in a different country from all our current servers). These internal backups include full database dumps, which contain user account information and CheckUser information. They are accessible only to our Site Reliability Engineering team and can be used, in the event of a catastrophic site failure, to quickly restore the entire site to a recent state.

Backups of a wiki
In addition to these private backups, any wiki administrator can create an XML, image, or managewiki_backup dump of their wiki by going to Special:DataDump on the wiki and selecting the desired type. These backups can then be stored securely wherever you like. XML dumps do not include user account or CheckUser information, but they do contain the wiki's page text and logs, which you can import into any MediaWiki site. Image dumps contain all files uploaded to the wiki, but without their descriptions or licensing information.
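To get a feel for what an XML dump contains, the short Python sketch below lists the page titles in an export file. It relies only on the standard library; the element names follow the MediaWiki export schema, whose XML namespace is versioned, so the sketch reads the namespace off the root element rather than hard-coding it.

```python
import xml.etree.ElementTree as ET

def list_page_titles(xml_text):
    """Return the page titles found in a MediaWiki XML export."""
    root = ET.fromstring(xml_text)
    # Export files use a versioned XML namespace (e.g. export-0.10);
    # derive it from the root tag instead of hard-coding a version.
    ns = root.tag[1:].split("}")[0] if root.tag.startswith("{") else ""

    def q(tag):
        return "{%s}%s" % (ns, tag) if ns else tag

    return [page.findtext(q("title")) for page in root.iter(q("page"))]
```

Running this over a dump with two `<page>` entries returns their two titles, regardless of which export schema version the wiki produced.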

General backup schedules
Miraheze runs the following backups for disaster recovery purposes:


 * Weekly: Private Git Repository for configuration management secrets and SSL keys
 * Weekly: mhglobal (CreateWiki, ManageWiki, global tables) and reports (TSPortal) databases
 * Fortnightly: All other databases in SQL format for all wikis and other services
 * Fortnightly: Phabricator images and database
 * Monthly: piwik (Matomo) database
 * 3-monthly: Full XML dumps of all wikis, including private wikis


 * Not currently run: Static images for all wikis

Backups from the command line
For users able to run programs from the command line, there are two non-interactive ways to make local backups.

MediaWiki API
MediaWiki wikis can be backed up through the API and the Special:Export page. Elsie Hupp and others' Python 3 rework of the WikiTeam scripts does this; the script and instructions are at elsiehupp/wikiteam3 on GitHub.

User account information will not be preserved. The XML dump can include either the full page history or only the most recent revisions. The image dump contains all uploaded files with their associated descriptions. The siteinfo.json file records wiki features such as the installed extensions and skins.
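For a one-off export of specific pages, the API can also be queried directly: `action=query` with the `export` and `exportnowrap` flags returns the current revisions of the named pages as bare export XML. A minimal sketch of building such a request (the wiki URL is a placeholder; note that, unlike Special:Export, this returns only the current revision of each page):

```python
from urllib.parse import urlencode

# api.php endpoint; replace with your own wiki's URL (placeholder value).
API = "https://example.miraheze.org/w/api.php"

def export_url(titles):
    """Build an API request returning the given pages as export XML."""
    params = {
        "action": "query",
        "titles": "|".join(titles),  # page titles, pipe-separated
        "export": 1,
        "exportnowrap": 1,  # return the bare <mediawiki> XML document
        "format": "xml",
    }
    return API + "?" + urlencode(params)
```

Fetching the resulting URL with any HTTP client yields XML suitable for importing via Special:Import.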

Miraheze DataDump API
Miraheze wikis offer a DataDump API module. However, as yet there are no scripts that make use of it.
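A script driving that module would follow the standard MediaWiki write-action pattern: fetch a CSRF token, then POST the dump request. The sketch below is hypothetical; the module name `generatedumps` and its `type` parameter are assumptions based on the DataDump extension, so check your wiki's api.php help listing before relying on them. The HTTP client is passed in as callables, so the flow itself can be exercised without a network.

```python
def generate_dump(api_get, api_post, dump_type):
    """Hypothetical sketch of a DataDump API call.

    api_get/api_post are callables taking a dict of API parameters and
    returning the decoded JSON response, so any HTTP client can be used.
    The 'generatedumps' module name and 'type' parameter are assumptions.
    """
    # Dump generation is a write action, so a CSRF token is required.
    token = api_get({
        "action": "query", "meta": "tokens", "format": "json",
    })["query"]["tokens"]["csrftoken"]
    return api_post({
        "action": "generatedumps",
        "type": dump_type,  # e.g. "xml" (assumption)
        "token": token,
        "format": "json",
    })
```

The token-then-POST shape shown here is the same one every authenticated MediaWiki API write uses, so only the final POST parameters would need correcting if the module differs.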