Project deployers - rework #20

@Lordfirespeed

Description

Proposal

Presently, deployment steps are stored in this repository (durhack-deployer) and therefore are tied to the durhack-deployer revision.
Project deployment/teardown steps should be configured within their own repository.

That is, the commands necessary to get an instance of durhack running (and tear it down later) should be stored in the durhack repository.

This way, we can

  1. run teardown commands from the existing revision
  2. check out the newly pushed commit
  3. run deployment steps from the new revision

Tying deployment steps to particular revisions is very desirable.
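
For illustration, a minimal sketch of that cycle (`run_teardown` and `run_deployment` are hypothetical helpers that would invoke the steps stored in the project repository):

```python
import subprocess
from pathlib import Path


def run_teardown(repo: Path) -> None:
    """Hypothetical: invoke the teardown steps stored in `repo` at its current revision."""
    ...


def run_deployment(repo: Path) -> None:
    """Hypothetical: invoke the deployment steps stored in `repo` at its current revision."""
    ...


def redeploy(repo: Path, new_ref: str) -> None:
    run_teardown(repo)  # 1. teardown using the currently checked-out revision's steps
    subprocess.run(["git", "-C", str(repo), "fetch", "origin"], check=True)
    subprocess.run(["git", "-C", str(repo), "checkout", new_ref], check=True)  # 2. check out the new commit
    run_deployment(repo)  # 3. deploy using the new revision's steps
```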

Change motivations (problems with existing setup)

  • When a project's deployment steps change, this repository must also be updated to match (importantly, that necessity is not obvious)
  • Once a project's deployment/teardown steps have changed
    • teardown of an outdated revision is likely to fail, as the new teardown steps don't match the previously executed deployment steps
    • old revisions can no longer be deployed, as the new deployment steps only work for sufficiently new revisions

Implementation notes

How should deployment/teardown steps be specified?

  • Option 1: deployables provide configuration via a domain-specific language (DSL); a sketch follows this list
    • prior art: GitHub Actions .yaml, Dockerfile
    • .yaml does appeal (more than e.g. .json) for its multiline string syntax
      • there is no stdlib yaml parser, but PyYAML is well-maintained
    • Pro: its structured nature discourages copy-pasting, promoting the definition of versioned actions
    • Con: validation might be tricky
    • Con: concurrency is difficult
    • Con: users are unlikely to be familiar with the format
    • Con: we would have to write something to extract dependencies from the config
  • Option 2: define a Python 'interface' and have deployables provide implementations
    • Con: encourages copy-pasting instead of defining a new versioned action
    • Pro: concurrency is easy
    • Pro: dependency management is done for us - since build actions are PyPI packages, we can just use uv
    • Pro: prototyping a new action by defining it locally is simple
  • Option 3: deployables provide shell scripts
    • Pro: very transparent
    • Pro: users likely to have some familiarity with the format
    • Con: error handling is difficult
    • Con: concurrency is difficult
    • Con: logging is difficult
    • Con: behaviour validation (testing/static analysis) is difficult
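
For comparison, a rough sketch of what Option 1 might look like: a deployable provides a deploy.yaml, which we parse with PyYAML. The schema, keys, and commands below are invented purely to illustrate the shape of a DSL-based config.

```python
import yaml  # PyYAML; the stdlib has no YAML parser

# A hypothetical deploy.yaml a deployable might provide
EXAMPLE = """
deploy:
  - uses: create-service@v1
  - run: |
      pnpm install
      pnpm build
teardown:
  - run: pnpm stop
"""

config = yaml.safe_load(EXAMPLE)
for step in config["deploy"]:
    # each step is a mapping such as {"uses": "create-service@v1"} or {"run": "..."}
    print(step)
```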

Conclusion - Option 2 sounds best

If we are to have a notion of 'actions' which projects can import/specify as part of their setup/teardown, they should be independently versioned.
It would be a good idea to use the same pattern as GitHub Actions - each action distributed in its own package.
This would need either a monorepo or one repo per action.
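
A minimal sketch of what such an interface might look like (all names here are illustrative, nothing is settled); methods are async since easy concurrency was a point in Option 2's favour:

```python
import asyncio
from abc import ABC, abstractmethod


class Action(ABC):
    """One independently versioned setup/teardown action, distributed as its own PyPI package."""

    @abstractmethod
    async def setup(self) -> None: ...

    @abstractmethod
    async def teardown(self) -> None: ...


class StartSystemdService(Action):
    """Illustrative action a deployable might import and configure."""

    def __init__(self, unit: str) -> None:
        self.unit = unit

    async def _systemctl(self, verb: str) -> None:
        process = await asyncio.create_subprocess_exec("systemctl", verb, self.unit)
        if await process.wait() != 0:
            raise RuntimeError(f"systemctl {verb} {self.unit} exited non-zero")

    async def setup(self) -> None:
        await self._systemctl("start")

    async def teardown(self) -> None:
        await self._systemctl("stop")
```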

Actions would need to be loaded on-the-fly and cached on the filesystem (not in memory) - is it possible to ensure they don't end up in the import cache? Edit: No, it isn't.
Setup/teardown would therefore need to run in subprocesses, each with its own Python virtual environment (which has to be set up with the appropriate dependencies).
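
A rough sketch of that isolation, assuming uv is on PATH (the directory layout and function name are invented):

```python
import subprocess
from pathlib import Path


def run_isolated(project_dir: Path, entrypoint: str, requirements: list[str]) -> None:
    """Run a project's setup/teardown entrypoint in its own virtual environment,
    so its dependencies (and imports) never touch the deployer's interpreter."""
    venv_dir = project_dir / ".deployer-venv"
    venv_python = venv_dir / "bin" / "python"
    subprocess.run(["uv", "venv", str(venv_dir)], check=True)
    subprocess.run(
        ["uv", "pip", "install", "--python", str(venv_python), *requirements],
        check=True,
    )
    subprocess.run([str(venv_python), entrypoint], cwd=project_dir, check=True)
```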
