Talk:Backups

For users to make their own backups
Can you add a section/page about how a user can make their own backups?
 * A user can't currently make their own backup, but can ask a sysadmin at any time, and it should be done in less than 24 hours. Reception123 (talk) ( contribs  ) 17:29, 27 August 2016 (UTC)
 * Can those backups happen on a scheduled basis? RobertSter (talk) 14:15, 2 September 2016 (UTC)
 * As John, Southparkfan and NDKilla explained, they are done on a scheduled basis, but the costs would be considerably higher if they were done every day, so that isn't possible for the moment. Reception123 (talk) ( contribs  ) 16:07, 2 September 2016 (UTC)
 * Do we have estimates for those costs? RobertSter (talk) 16:33, 2 September 2016 (UTC)
 * The estimates depend on the exact backups, but it could be $400-1800 more per year (an estimate by John). Reception123 (talk) ( contribs  ) 18:48, 2 September 2016 (UTC)
 * Thanks. Rsterbal (talk) 19:12, 25 November 2019 (UTC)

Doing a backup on Linux
Here is a quick script I wrote to make a backup in Linux:

 #!/bin/bash
 DBPASS=$(< ~/pwd)   # password read from the file "pwd" in the home directory (renamed from PWD, which bash reserves for the working directory)
 DATE=$(date +%Y-%m-%d-%H-%M-%S)
 # database name assumed to be bitnami_mediawiki (the Bitnami default), matching the dump filename
 mysqldump -u root -p"$DBPASS" bitnami_mediawiki > ~/backups/bitnami_mediawiki$DATE.sql
 cp /opt/bitnami/apps/mediawiki/htdocs/LocalSettings.php ~/backups/LocalSettings.php
 cp ~/mw-backup.sh ~/backups/mw-back.sh
 ls -l ~/backups/

The file pwd in the home directory contains the MySQL root password.

Rsterbal (talk) 06:07, 9 February 2017 (UTC)
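
To restore from one of these backups, the process is roughly the reverse. A sketch, assuming the same Bitnami paths and the bitnami_mediawiki database name used in the script above:

```shell
# pick the most recently written SQL dump in ~/backups
BACKUP=$(ls -t ~/backups/bitnami_mediawiki*.sql | head -1)

# load it back into the database; prompts for the MySQL root password
mysql -u root -p bitnami_mediawiki < "$BACKUP"

# put the saved LocalSettings.php back in place
cp ~/backups/LocalSettings.php /opt/bitnami/apps/mediawiki/htdocs/LocalSettings.php
```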

Making your own backups
Requires Python v2 (v3 doesn't yet work) and only works with public wikis.

Run the WikiTeam Python script dumpgenerator.py from the command line to get an XML dump with edit histories, plus a dump of all images and their descriptions.

An XML dump is not a full backup of the wiki database: it does not contain user accounts, extensions, or file types other than images.

Full instructions are at the WikiTeam tutorial.

--Rob Kam (talk) 23:43, 11 March 2017 (UTC)
 * From what I know this script was blocked for a reason (which I do not recall). For now backups will have to be made by system administrators. Once you request on Phabricator they should be done in due time. Please ask User:Southparkfan about the script if needed. Reception123 (talk) ( contribs  ) 18:48, 12 March 2017 (UTC)
 * Well, a quick look tells me that if such a block is in place for this script, it's not very effective, because the UserAgent is wrong. John (talk) 20:34, 12 March 2017 (UTC)
 * Whatever the problem was it's fixed now. --Rob Kam (talk) 16:09, 15 July 2020 (UTC)

Updating text of Ganglia to a link
Can the page include a link to Tech:Ganglia for the Ganglia text? RobertSter (talk) 17:32, 31 July 2018 (UTC)
 * ✅ Reception123 (talk) ('C' ) 17:35, 31 July 2018 (UTC)

Is there any way to automatically delete and generate wiki backups once a month?
I'm trying to speed up the process of backing up 3 wikis. It would be nice to have a monthly backup run automatically. Is there any way to do that? --Rsterbal (talk) 14:29, 2 June 2020 (UTC)
 * Well you can use Special:DataDump and it will run whenever you want it to. Reception123 (talk) ( C ) 06:43, 3 June 2020 (UTC)
 * I use datadump, but it would be easier to just pull down an already made backup than to go through the steps of generating it every month --Rsterbal (talk)
 * See Backups, but you'd have to find a way to run this automatically. --Rob Kam (talk) 16:06, 15 July 2020 (UTC)
 * I'm really just asking to have the same process I used to use before they instituted Special:DataDump. Not sure what the benefit in shutting that off was or if the site would consider turning it back on. Rsterbal (talk) 19:21, 31 August 2020 (UTC)
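
On the automation question: Miraheze won't run this for you, but a cron job on your own machine can regenerate a dump monthly. A sketch, assuming a backup script like the one in the "Doing a backup on Linux" section above is saved as ~/mw-backup.sh (the path and schedule are placeholders to adjust):

```shell
# crontab entry (edit with "crontab -e"):
# at 03:00 on the 1st of every month, run the backup script and log its output
0 3 1 * * $HOME/mw-backup.sh >> $HOME/mw-backup.log 2>&1
```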

detailed example for dumpgenerator
Example usage: python dumpgenerator.py --api=https://yourwiki.miraheze.org/w/api.php --xml --images

I ran the script and it stopped after 1000 pages. Are there additional flags that need to be set?

Rsterbal (talk) 16:46, 24 April 2021 (UTC)
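
A note on the 1000-page stall: WikiTeam's dumpgenerator can resume an interrupted dump, and can be slowed down to avoid being cut off mid-run. A sketch (the --path directory name here is a placeholder; it is whatever dump directory the first run created, and the exact flags may vary by version, so check the script's --help):

```shell
# resume the earlier run; --path points at the dump directory it created
python dumpgenerator.py --api=https://yourwiki.miraheze.org/w/api.php \
  --xml --images --resume --path=yourwikimirahezeorg-wikidump

# or start over with a delay (in seconds) between requests
python dumpgenerator.py --api=https://yourwiki.miraheze.org/w/api.php \
  --xml --images --delay=5
```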
 * Rsterbal I'm not sure what you're trying to do with that Python script, but if you just want to export an XML dump of your wiki pages or download its images, you can do that at Special:DataDump. If you don't have the local user rights to generate a dump and it's a public wiki, Stewards have, in the past, facilitated these requests for users. Dmehus (talk) 16:48, 24 April 2021 (UTC)