---
title: MakeStaticSite
repo: paultraf/makestaticsite
homepage: https://makestaticsite.sh/
language:
  - Bash
license:
  - AGPL-3.0-only
templates: []
description: Generate and deploy static websites with Bash.
---

A set of Bash scripts to generate static snapshots of (dynamic) websites, then
refine and deploy them using Wget and other open-source tools.

Features:

- A straightforward command-line interface.
- A setup script that guides users through the creation of a configuration file
via a simple interactive dialogue; no manual editing is required.
- Support for multiple sites, each with custom settings defined in its own
configuration file (multiple config files can also be used for any given site).
- Suitable for batch processing, allowing operations to be scaled up so that any
or all of the sites are updated in a single run (see the batch sketch after this
list).
- Support for CMS logins, which paves the way for converting your existing site
whilst maintaining the WordPress installation in situ.
- Runtime options, such as verbosity, to configure the level of output and
whether to archive each build.
- The option of providing a downloadable copy of the entire site as a zip file
which can be used offline, e.g., from a memory stick, making your site portable.
- Local and remote (server) deployment options, including rsync over SSH and
Netlify (see the deployment sketch after this list).
- For WordPress installations, WP-CLI is used to prepare the site for mirroring,
plus a drop-in search replacement (the WP Offline Search plugin) that works
offline.
- Snippets – an experimental facility to tweak any page and provide offline
variants using chunks of HTML.
- Support for custom attributes via post-processing – by building a list of URLs
left unaccounted for by the initial crawl and then re-running Wget on them (see
the second-pass Wget sketch after this list).
- W3C standards compliance. Whilst this largely depends on the source, pages may
be cleaned up by HTML Tidy. The system also generates a sitemap XML and a
robots.txt file to match the output files (see the sitemap sketch after this
list).
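As a rough illustration of the multi-site and batch features, a wrapper like the
following could loop over per-site configuration files. The script name
`makestaticsite.sh` matches the repository, but the `-i` flag and the
`config/*.cfg` layout are assumptions made purely for this sketch.

```bash
#!/usr/bin/env bash
# Batch sketch: rebuild every configured site in one run.
# Assumptions (not taken from the project docs): configs live in config/*.cfg
# and the main script accepts a config name via -i.
set -euo pipefail

for cfg in config/*.cfg; do
  site="$(basename "$cfg" .cfg)"
  echo "Building ${site}..."
  ./makestaticsite.sh -i "$site"   # flag name is an assumption
done
```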
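The post-processing item works by gathering URLs the first crawl missed and
fetching them in a second Wget pass. A minimal sketch of that idea, assuming the
missing URLs have been collected into a plain-text list and the snapshot lives
in `mirror/` (both names are illustrative):

```bash
# Second pass: fetch URLs left unaccounted for by the initial crawl.
# missing-urls.txt and mirror/ are illustrative names, not the project's own.
wget --input-file=missing-urls.txt \
     --directory-prefix=mirror \
     --no-host-directories \
     --page-requisites \
     --adjust-extension \
     --convert-links
```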
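The clean-up and sitemap behaviour described in the last bullet could be
approximated as follows; `BASE_URL` and `mirror/` are placeholders, and this is
a sketch of the general technique rather than the project's actual
implementation.

```bash
#!/usr/bin/env bash
# Sketch only: tidy each page, then emit a sitemap.xml and robots.txt that
# match the generated files. BASE_URL and mirror/ are placeholders.
set -euo pipefail
BASE_URL="https://example.com"

# Clean up the HTML in place (HTML Tidy exits non-zero on warnings,
# hence the trailing '|| true').
find mirror -name '*.html' -print0 | while IFS= read -r -d '' page; do
  tidy -q -m "$page" || true
done

# Build a sitemap entry for every generated HTML file.
{
  echo '<?xml version="1.0" encoding="UTF-8"?>'
  echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
  find mirror -name '*.html' -print0 | while IFS= read -r -d '' page; do
    echo "  <url><loc>${BASE_URL}${page#mirror}</loc></url>"
  done
  echo '</urlset>'
} > mirror/sitemap.xml

# Point crawlers at the sitemap.
printf 'User-agent: *\nAllow: /\nSitemap: %s/sitemap.xml\n' "$BASE_URL" > mirror/robots.txt
```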
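For the rsync-over-SSH deployment route, a generic invocation looks like the
line below; the host, user, and paths are placeholders, not the project's exact
command.

```bash
# Push the snapshot to a web server over SSH; --delete keeps the remote copy
# in sync with the local mirror. Host, user, and paths are placeholders.
rsync -avz --delete -e ssh mirror/example.com/ user@example.com:/var/www/example.com/
```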