Backups

    Miraheze maintains both internal and external backup procedures. Following the schedule below, several critical parts of the infrastructure are automatically backed up to external servers (servers under Miraheze's control but located in a different country from the current servers), while internal backups cover everything, such as all wiki databases including user account information and CheckUser data. These backups can only be accessed by the Site Reliability Engineering team and may be used in the event of a catastrophic site failure to quickly bring the entire site back up to a recent state.

    As such, Miraheze takes a total of three types of backups. On top of this, users may generate their own backups quickly and conveniently on demand using our DataDump tool.

    Backup types

    Miraheze takes three types of backups to ensure as much resiliency as possible.

    • Internal backups are backups kept on hand which the Site Reliability Engineering team can use to quickly bring the entire site up in the event of a catastrophic failure. These backups include full database dumps, which include user account information and CheckUser information. See the schedule below for more information.
    • External backups are automatic backups kept on servers controlled by us but on a different host and in a different country. This is done to ensure that a failure on one host or in the power grid of one country, etc., doesn't cause extended downtime or data loss to our users. These types of backups include critical parts of our infrastructure such as the databases of all wikis, private Git repository data, Phabricator configurations, and much more. See the schedule below for more information.
    • Public backups are XML backups of all public wikis which we upload to archive.org every month. We do this to make sure we have a reliable backup of all wikis on an external site, and to give users peace of mind by providing a backup that is readily available for use by us or by them.

    General backup schedules

    Up to date as of 12 January, 2023

    Miraheze automatically runs the following backups for disaster recovery purposes:

    Internal/External
    • Weekly: Private Git repository for configuration management secrets and SSL keys
    • Weekly: mhglobal (CreateWiki, ManageWiki, global tables) and reports (TSPortal) databases
    • Fortnightly: All other databases in SQL format for all wikis and other services
    • Fortnightly: Phabricator images and database
    • Monthly: piwik (Matomo) database
    • Monthly: XML dump of all private wikis
    • 3-monthly: Full XML dumps of all wikis, including private wikis
    • On demand: XML backups of all wikis scheduled for deletion
    • Not currently run: Static images for all wikis
    Public
    • Monthly: All public wikis; XML dumps uploaded to archive.org

    Manual backups

    On top of our internal, external, and public backups, users may generate their own backups in several ways.

    Backups of wikis

    In addition to these private backups, administrators on their respective wikis can create an XML or image backup of their wiki by going to the Special:DataDump page. These backups can be stored safely wherever you like. XML backups do not contain user information or CheckUser data; they contain the wiki's page text and logs, which can be imported into any MediaWiki site. Image dumps contain all file types uploaded to the wiki, but do not include licensing or descriptions.

    To use DataDump, go to Special:DataDump on your wiki and select what backup you want. Once you submit your request, your backups will be generated. Depending on the size of the wiki, it may take from a few seconds up to a few hours to generate a database dump.

    DataDump API

    DataDump offers an API module which lets users use DataDump from the command line. As of yet, no scripts make use of it.
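
    The sketch below is one way a user might drive this module from Python on the command line. It follows the standard MediaWiki Action API flow (log in with a bot password, fetch a CSRF token, then post the request); the action name "generatedumps" and its parameters are assumptions here, so check your wiki's api.php?action=help output for the exact module name and fields that DataDump exposes.

        import requests

        # Minimal sketch of requesting a dump through the MediaWiki Action API.
        # Assumptions: the wiki URL, the bot-password credentials, and the
        # DataDump module name/parameters ("generatedumps", "type") -- verify
        # them against api.php?action=help on your wiki before relying on this.
        API = "https://example.miraheze.org/w/api.php"  # hypothetical wiki
        session = requests.Session()

        # 1. Fetch a login token and log in with a bot password (Special:BotPasswords).
        login_token = session.get(API, params={
            "action": "query", "meta": "tokens", "type": "login", "format": "json",
        }).json()["query"]["tokens"]["logintoken"]
        session.post(API, data={
            "action": "login", "lgname": "ExampleUser@backupbot",
            "lgpassword": "botpassword", "lgtoken": login_token, "format": "json",
        })

        # 2. Fetch a CSRF token for the write request.
        csrf_token = session.get(API, params={
            "action": "query", "meta": "tokens", "format": "json",
        }).json()["query"]["tokens"]["csrftoken"]

        # 3. Ask DataDump to generate a dump (module and parameter names are assumptions).
        result = session.post(API, data={
            "action": "generatedumps", "type": "xml",
            "token": csrf_token, "format": "json",
        }).json()
        print(result)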

    Wikiteam dumpgenerator

    While we strongly recommend using DataDump as it is the most convenient option, you may also generate a database dump using less interactive command-line scripts. We do not recommend or endorse any in particular; however, one well-known script is the Mediawiki Client Tools' Mediawiki Scraper, a Python 3 script based on the original WikiTeam Python 2.7 script.

    User account information will not be preserved. The XML dump can include the full page history or only the most recent revisions. The image dump will contain all file types along with their associated descriptions. The siteinfo.json file contains information about wiki features such as the installed extensions and skins.
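
    As a rough illustration, the snippet below invokes such a dumpgenerator script from Python. The script filename, the wiki URL, and the flag names follow the original WikiTeam tool and are assumptions here, so check the --help output of whichever version you install.

        import subprocess

        # Minimal sketch of running a WikiTeam-style dumpgenerator script.
        # The script filename and flag names are assumptions based on the
        # original WikiTeam tool; consult --help for the version you use.
        subprocess.run([
            "python3", "dumpgenerator.py",
            "--api=https://example.miraheze.org/w/api.php",  # hypothetical wiki
            "--xml",     # XML dump of all pages
            "--images",  # also download uploaded files and their description pages
        ], check=True)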

    See also