Description
Goal
Create a Docker Compose setup to easily deploy the integration services needed to run the explorer, mobile wallet (including wallet connect testing), and dApps against a testnet.
Context
Currently the testnet operator manually assembles the Docker Compose from the mainnet archiver config. We should provide a ready-to-use, documented compose file so anyone running a testnet full node can spin up the required services.
Services
The testnet needs the following integration services: Query, Archiver v2, Live, and Stats (see open question below).
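A rough service layout for the compose file could look like the following; every image name, port, and volume here is a placeholder sketch, not the real testnet configuration:

```yaml
# Sketch only: image names, ports, and volumes are placeholders.
services:
  query:
    image: qubic/query:latest        # placeholder image
    ports: ["8080:8080"]
  archiver-v2:
    image: qubic/archiver:v2         # placeholder image
    volumes: ["archiver-store:/store"]
  live:
    image: qubic/live:latest         # placeholder image
  stats:
    image: qubic/stats:latest        # placeholder image; see open question below
volumes:
  archiver-store:
```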
Note: The explorer currently has the official testnet URL hardcoded. The mobile wallet can already point to any testnet URL, but it accepts only a single base URL for all services. We can make both more flexible in the near future.
Stats service (optional?)
Stats requires extra services and some manual setup on every epoch transition. Both the explorer and the mobile wallet currently use it, but Live/Query/Archiver v2 already cover the core testnet needs.
Question for devs: can we get away without stats for the testnet, or should we include it? If included, should it be optional?
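If stats ends up included but optional, Compose profiles are one way to gate it; the service and image names below are placeholders:

```yaml
# Sketch: stats behind an optional Compose profile (names are placeholders).
services:
  stats:
    image: qubic/stats:latest   # placeholder image
    profiles: ["stats"]         # excluded from a plain `docker compose up`
# Start with stats:    docker compose --profile stats up -d
# Start without stats: docker compose up -d
```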
Storage
The archiver and events services accumulate data over time. The README should include recommendations for keeping disk usage under control (e.g. periodic cleanup of store volumes) so long-running testnet instances don't run out of resources.
Question for devs: should we include any automatic purging mechanism, or just document manual cleanup steps?
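If we go the manual-cleanup route, the README could ship a small retention helper along these lines. The function name, the 30-day default, and the example path are our assumptions; the real store layout may need a different rule, and services should probably be stopped before deleting under their volumes:

```shell
#!/bin/sh
# prune_store DIR [DAYS]: delete files older than DAYS (default 30) under DIR.
# Assumption: intended for the archiver/events store volumes; stop the
# services first if they do not tolerate files disappearing underneath them.
prune_store() {
  dir="$1"
  days="${2:-30}"
  find "$dir" -type f -mtime +"$days" -print -delete
}
```

Example (path is illustrative): `prune_store /var/lib/docker/volumes/archiver-store/_data 30`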
Deliverables
- docker-compose.yaml for testnet (with stats as the default or an optional profile, pending decision)
- .env.example with configurable values (peer IPs, ports, credentials, ...)
- README.md documenting:
  - Server/software prerequisites to run the setup successfully
  - Quick start steps
  - Configuration reference
  - Stats setup instructions
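A starting point for .env.example might look like this; every variable name and value below is a suggested placeholder, not a confirmed configuration key:

```
# Peers and networking (placeholder names and values)
PEER_IPS=203.0.113.10,203.0.113.11
QUERY_PORT=8080
ARCHIVER_PORT=8001

# Credentials (placeholder names; never commit real values)
DB_USER=qubic
DB_PASSWORD=changeme
```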
Additional references
For the test network, kavatak currently runs one full node and this Docker Compose file: docker-compose.yaml
Additional notes from kavatak
——————
This is the updated file.
The stats API requires some additional setup beyond the Docker Compose file itself.
In the archive you will find a Linux executable called qubic-stats-processor and a bash script called setupSpectrumData.sh.
The stats service requires some data from the spectrum file (circulating supply, number of addresses, rich list).
This data has to be uploaded to the database via the provided executable; the bash script simplifies the process.
The setup process looks like this:
1. Place the compose file in your desired location.
2. Create a directory to hold the executable and the bash script; something like spectrumData.
3. Run docker compose up -d.
4. Place the epoch zip archive (the one containing spectrum.xxx) in the spectrum directory.
5. Edit the bash script with the number of the epoch. The zip provided to us is usually named epXXX.zip, so the script expects this pattern; you may remove the ep prefix if your zip files are named just xxx.zip.
6. Run ./setupSpectrumData.sh. This extracts the files from the zip and runs the stats processor in spectrum parser mode, which reads the spectrum file, extracts the required data, and uploads it to the database.

Make sure that unzip is installed on the server.
Upon epoch transition, repeat steps 4-6 to upload the new data.
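The epXXX.zip vs. xxx.zip naming wrinkle in step 5 could also be handled in the script itself. A hedged sketch follows; the helper name is ours, and the real setupSpectrumData.sh may work differently:

```shell
#!/bin/sh
# find_epoch_zip EPOCH [DIR]: print the path of the epoch archive, accepting
# both the epXXX.zip and bare XXX.zip naming patterns described above.
# Hypothetical helper, not part of the shipped script.
find_epoch_zip() {
  epoch="$1"
  dir="${2:-.}"
  if [ -f "$dir/ep${epoch}.zip" ]; then
    printf '%s\n' "$dir/ep${epoch}.zip"
  elif [ -f "$dir/${epoch}.zip" ]; then
    printf '%s\n' "$dir/${epoch}.zip"
  else
    echo "no archive for epoch ${epoch} in ${dir}" >&2
    return 1
  fi
}
```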
—————