Information networks are more resilient and harder to suppress when they're distributed; we should not rely on a single cloud to store the data in the archive.
One of the intentions behind the Ace Archive API is to allow individuals to back up the archive in its entirety. For this task, write a tool that lets individuals do exactly that.
This could take the form of a script that people run locally. This script would call the Ace Archive API and download the data to the individual's local filesystem.
The format/layout of the files on the user's local filesystem should look like this:
```
artifacts
├─ <artifact_slug>
│  ├─ metadata.json
│  └─ files
│     ├─ <file_name>
│     └─ <…>
└─ <…>
```
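The layout above could be produced by a helper like the following minimal sketch. It only writes one artifact's `metadata.json` into place; fetching the artifact list and downloading the associated files would go through the Ace Archive API, whose endpoints are not assumed here. The function name and arguments are illustrative, not part of any existing tool.

```python
import json
from pathlib import Path

def write_artifact(root: Path, slug: str, metadata: dict) -> Path:
    """Write one artifact's metadata.json into the backup layout.

    `root` is the top of the backup tree; `slug` and `metadata` would
    come from the Ace Archive API (calls not shown). Downloading the
    artifact's files into the `files/` directory is left out of this
    sketch.
    """
    artifact_dir = root / "artifacts" / slug
    files_dir = artifact_dir / "files"
    files_dir.mkdir(parents=True, exist_ok=True)

    metadata_path = artifact_dir / "metadata.json"
    metadata_path.write_text(json.dumps(metadata, indent=2))
    return metadata_path
```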
The `metadata.json` file should mirror the schema used here rather than the schema returned by the API. The `source_url` field should point to the Ace Archive raw file download link.
There also needs to be some observability into who is backing up the archive and how to contact them in a disaster recovery scenario. The script should optionally accept an email address that is sent to the project maintainers each time it runs, and a log of timestamped backups and their contact emails should be maintained somewhere. Frawley is open to ideas for how to accomplish this.
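Since the reporting mechanism is still open, here is one possible shape for the phone-home payload. The field names and the idea of posting it to a maintainer-run endpoint are assumptions, not a decided design; only the payload construction is sketched.

```python
import time

def build_ping(email: str, checksum: str) -> dict:
    """Build the payload sent to the maintainers after a backup.

    A hypothetical maintainer-run endpoint (or even an emailed report)
    could append these records to the log of timestamped backups. All
    field names here are assumptions.
    """
    return {
        "email": email,          # optional contact address
        "checksum": checksum,    # checksum of the backed-up data
        "timestamp": int(time.time()),  # when the backup ran (Unix time)
    }
```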
On each backup, the tool should compute a checksum of the data it has backed up. Note that only the `metadata.json` files need to be checksummed, since they themselves contain checksums of their associated files. We would need to decide on a method for computing a single checksum of many files. When the tool phones home with the contact email address, it should also include this checksum.
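One possible method for reducing many files to a single checksum (an assumption, not a decided scheme): hash each `metadata.json`, sort the per-file digests by relative path so the result does not depend on filesystem ordering, and hash that listing.

```python
import hashlib
from pathlib import Path

def combined_checksum(root: Path) -> str:
    """Compute one checksum over all metadata.json files in a backup.

    Hashes each artifact's metadata.json, builds a sorted
    "relative-path  digest" listing, and returns the SHA-256 of that
    listing. Sorting makes the result deterministic across machines.
    """
    lines = []
    for path in sorted(root.glob("artifacts/*/metadata.json")):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        lines.append(f"{path.relative_to(root)}  {digest}")
    return hashlib.sha256("\n".join(lines).encode()).hexdigest()
```

Any change to any `metadata.json` (or adding or removing an artifact) changes the combined checksum, so the maintainers can compare the value phoned home against one computed from the canonical archive.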