ᴩʜᴏɴᴇᴅʀᴏɪᴅ edited this page Nov 21, 2025 · 4 revisions

These Python scripts are designed to be run easily on a local computer. Each script performs a single task, and they can be re-run more than once to resume or speed up the work.

Fetches minimal page info for all pages of the wiki and saves it to all_pages.json.
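A minimal sketch of how such a script could page through a MediaWiki-style API; the endpoint URL is a placeholder, and the request function is injected so the continuation loop can be exercised without the network:

```python
import json
import urllib.parse
import urllib.request

# Hypothetical endpoint; substitute the wiki's real api.php URL.
API_URL = "https://example.org/w/api.php"

def fetch_all_pages(fetch_json):
    """Collect minimal info for every page, following API continuation.

    `fetch_json` is a callable taking query params and returning the
    decoded JSON, so the loop itself can be tested offline.
    """
    pages = []
    params = {
        "action": "query", "list": "allpages",
        "aplimit": "max", "format": "json",
    }
    while True:
        data = fetch_json(params)
        pages.extend(data["query"]["allpages"])
        if "continue" not in data:
            return pages
        params.update(data["continue"])

def http_fetch(params):
    """Perform the real HTTP request against API_URL."""
    url = API_URL + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Usage (requires network access):
# with open("all_pages.json", "w", encoding="utf-8") as f:
#     json.dump(fetch_all_pages(http_fetch), f, ensure_ascii=False, indent=2)
```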

Similarly, this script creates an all_file_usages.json file listing every file used in the wiki. Collecting everything into one file is a convenience that works around the API's limit of 500 results per request.
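The same continuation pattern applies here; a sketch of aggregating file usages per page, with the parameter names assumed from the standard MediaWiki query API:

```python
def collect_file_usages(fetch_json):
    """Gather every file used on the wiki, keyed by page title.

    The API returns at most 500 items per request, so the `continue`
    token is followed until the listing is exhausted. `fetch_json` is
    injected for offline testing.
    """
    params = {
        "action": "query", "generator": "allpages", "gaplimit": "max",
        "prop": "images", "imlimit": "max", "format": "json",
    }
    usages = {}
    while True:
        data = fetch_json(params)
        for page in data.get("query", {}).get("pages", {}).values():
            files = usages.setdefault(page["title"], [])
            files.extend(im["title"] for im in page.get("images", []))
        if "continue" not in data:
            return usages
        params.update(data["continue"])
```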

Downloads the pages found in all_pages.json as wikitext and saves each one under the appropriate path in the wiki folder.
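A sketch of the two pieces such a script needs: mapping a page title to a save path (the `.wikitext` extension and colon handling are assumptions, not confirmed by the scripts) and fetching the raw source, here via MediaWiki's standard `action=raw`:

```python
import urllib.parse
import urllib.request
from pathlib import Path

def title_to_path(title, root="wiki"):
    """Map a wiki page title to a save path, keeping '/' subpages as
    subfolders; colons are replaced so namespaced titles stay valid
    filenames on all platforms.
    """
    parts = [p.replace(":", "_") for p in title.split("/")]
    parts[-1] += ".wikitext"
    return Path(root, *parts)

def download_wikitext(title, index_url="https://example.org/w/index.php"):
    """Fetch a page's raw wikitext; index_url is a placeholder."""
    url = index_url + "?" + urllib.parse.urlencode(
        {"title": title, "action": "raw"})
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

# Usage (requires network access):
# path = title_to_path("Guide/Install")
# path.parent.mkdir(parents=True, exist_ok=True)
# path.write_text(download_wikitext("Guide/Install"), encoding="utf-8")
```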

Downloads, in random order, all files hosted on the wiki that are listed in all_file_usages.json, and saves them in the File subfolder.
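The order-shuffling and path mapping could look like the following; the rationale for random order (spreading load, letting interrupted runs pick up elsewhere) and the `wiki/File` layout are assumptions:

```python
import random
from pathlib import Path

def download_order(file_titles, seed=None):
    """Return the file titles in a shuffled order; an optional seed
    makes the order reproducible for testing.
    """
    order = list(file_titles)
    random.Random(seed).shuffle(order)
    return order

def file_save_path(file_title, root=Path("wiki", "File")):
    """Map e.g. "File:Logo.png" to wiki/File/Logo.png by stripping
    the namespace prefix (layout assumed, not confirmed).
    """
    name = file_title.split(":", 1)[-1]
    return root / name
```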

This script is still in development. Its purpose is to download the pages in all_pages.json as HTML, creating an offline copy of the wiki. To reduce the file size, the script first applies the language selection. It also records which separate files each page includes and the date of the last successful run, for later use. Changes are still needed to make the HTML readable locally, and there are remaining issues within subfolders.
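Since the script is unfinished, only the bookkeeping it describes can be sketched; the record below, with its field names, is purely a hypothetical shape for storing the rendered HTML alongside the included files and the last-run timestamp:

```python
import datetime

def export_record(title, html, files):
    """A hypothetical per-page record for the HTML exporter: the
    rendered markup, the separate files the page includes, and a
    timestamp of the last successful run (all field names assumed).
    """
    return {
        "title": title,
        "html": html,
        "files": sorted(files),
        "last_run": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
```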
