Backups

Dumps of WikiTide public wikis can be found on Archive.org.

Manual backups
Users may upload their wiki dumps to Archive.org for a further level of backup. Please include "wikitide" in the list of subject tags.
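
One way to script such an upload is with the ia command-line tool from the Internet Archive's internetarchive Python package. This is a minimal sketch: the item identifier, file name, and extra subject values are placeholders, not real items.

    pip install internetarchive
    ia configure   # log in with your Archive.org account

    # Upload a dump file as a new item, tagged with "wikitide"
    # (the identifier and file name below are placeholders)
    ia upload examplewiki-dump-20240101 examplewiki-history.xml.7z \
        --metadata="mediatype:web" \
        --metadata="subject:wikitide; wiki; backup"

The subject metadata is what lets the dump be found alongside other WikiTide backups on Archive.org.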

MediaWiki Scraper
You can generate a database dump and a file dump of any public MediaWiki wiki using the MediaWiki Client Tools' MediaWiki Scraper Python 3 dumpgenerator script (full instructions are at that link).

Example usage
The result will include an XML dump with full page history, a dump of all images and files along with their associated descriptions, and a siteinfo.json file containing information about wiki features, such as the installed extensions and skins.
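
As a minimal sketch of an invocation (the wiki URL is a placeholder, and the flag names should be checked against the repository's current README, since they can change between versions):

    # Dump a public wiki: full-history XML plus all images/files
    # with their description pages (URL below is a placeholder)
    dumpgenerator https://wiki.example.com --xml --images

If a run is interrupted, the script can typically be re-run against the same dump directory to resume; see the repository's documentation for the exact resume options.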

Private wikis
To dump a private wiki, you will have to supply login credentials for an account that has at least read permission on the wiki.
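
A hedged sketch of such a run, assuming the script accepts --user and --pass options; run dumpgenerator --help to confirm the exact option names for your version (some versions take a browser-exported cookies file instead):

    # Dump a private wiki using an account with read permission.
    # --user/--pass are assumed option names; confirm them with
    # `dumpgenerator --help`, as they vary between script versions.
    dumpgenerator https://private.example.com --xml --images \
        --user ExampleUser --pass 'example-password'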

Issues
If you encounter any problems running the script, please raise a new issue at the MediaWiki Scraper GitHub repository.

Restoring from backup
Templates, modules, CSS, JavaScript, and gadgets are imported with the XML; boilerplates and abuse filters are not. You should still check that everything functions as expected.
 * Import the XML via Special:RequestImportDump.
 * Import images and their descriptions by creating a task at Phorge.
 * Import ManageWiki settings (extensions, preferences, and so on) from the JSON file, also via the Phorge task.
 * Former wiki users will have to re-register.
 * Configure any bots if required.
 * Import any custom abuse filters.